Monkey see, monkey touch, monkey do: Influence of visual and tactile input on the fronto-parietal grasping network
by Daniela Buchwald
Date of oral examination: 2020-03-13
Supervisor: Prof. Dr. Hansjörg Scherberger
Reviewer: Prof. Dr. Hansjörg Scherberger
Reviewer: Dr. Igor Kagan
English
One of the most common movements we perform every day is grasping. Usually we do this with the aid of our eyes, but other senses, such as touch, can also assist in the preparation and execution of grasping movements. Moreover, constant feedback is essential when interacting with objects in order to react and adapt to changing situations, such as object slippage. While it is generally assumed that different brain areas are responsible for different types of input or tasks (although there is no perfect separation; most areas process several types of information), each movement is a team effort of many different areas. How these areas interact to generate movements has not yet been extensively studied, especially when object information is delivered by different senses. In this thesis I investigated how tactile input is processed by different brain areas and how this information is used to plan and generate grasping movements. Furthermore, I studied whether the planning and generation of grasping movements in the brain differ when based on visual compared to tactile information. For this purpose, multi-electrode arrays were implanted into the primary motor cortex (M1), the primary somatosensory cortex (S1), the anterior intraparietal area (AIP), and the hand area of the ventral premotor cortex (area F5) of a rhesus macaque (Macaca mulatta). The animal was trained to grasp objects that it either saw or touched beforehand, allowing me to compare how information from both conditions is processed in the brain. By comparing firing rates, differences in brain activity between the two conditions were found. The number of significantly tuned neurons shows clear differences in how the brain plans grasping movements on the basis of vision versus touch, while the actual execution does not differ greatly.
When decoding whether trials were visually or tactually guided, accuracy was best in the early memory period, but differentiation between the two conditions was still possible shortly before grasping. In a second experiment, passive stimulation was applied to the middle finger of the animal, giving some insight into how tactile information is processed in the four brain areas. Together, these results demonstrate the influence of tactile input on the fronto-parietal grasping circuit. AIP, an area known to process visual object information, and F5, known for grasp preparation, both show no reaction to tactile input in the absence of grasp intentions. More importantly, all four areas show significant differences in memory-period activity between visual and tactile grasps, sufficient to decode the two conditions from neuronal activity. This indicates that how object information was acquired may need to be taken into account when recording from the fronto-parietal grasping network in order to control prosthetic robot arms.
Keywords: grasping; somatosensory cortex; motor cortex; parietal cortex; premotor cortex; macaque monkey; multi-electrode recording; object recognition