Human Left Ventral Premotor Cortex Mediates Matching of Hand Posture to Object Use

  • Guy Vingerhoets (guy.vingerhoets@ugent.be)

    Affiliations: Department of Experimental Psychology, Ghent University, Ghent, Belgium; Ghent Institute for Functional and Metabolic Imaging, Ghent University, Ghent, Belgium

  • Jo Nys

    Affiliations: Department of Experimental Psychology, Ghent University, Ghent, Belgium; Ghent Institute for Functional and Metabolic Imaging, Ghent University, Ghent, Belgium

  • Pieterjan Honoré

    Affiliation: Ghent Institute for Functional and Metabolic Imaging, Ghent University, Ghent, Belgium

  • Elisabeth Vandekerckhove

    Affiliation: Ghent Institute for Functional and Metabolic Imaging, Ghent University, Ghent, Belgium

  • Pieter Vandemaele

    Affiliations: Ghent Institute for Functional and Metabolic Imaging, Ghent University, Ghent, Belgium; Department of Radiology, Ghent University, Ghent, Belgium

Abstract

Visuomotor transformations for grasping have been associated with a fronto-parietal network in the monkey brain. The human homologue of the parietal monkey region (AIP) has been identified as the anterior part of the intraparietal sulcus (aIPS), whereas the putative human equivalent of the monkey frontal region (F5) is located in the ventral part of the premotor cortex (vPMC). Results from animal studies suggest that monkey F5 is involved in the selection of appropriate hand postures relative to the constraints of the task. In humans, the functional roles of aIPS and vPMC appear to be more complex and the relative contribution of each region to grasp selection remains uncertain. The present study aimed to identify modulation in brain areas sensitive to the difficulty level of tool object – hand posture matching. Seventeen healthy right-handed participants underwent fMRI while observing pictures of familiar tool objects followed by pictures of hand postures. The task was to decide whether the hand posture matched the functional use of the previously shown object. Conditions were manipulated for level of difficulty. Compared to a picture matching control task, the tool object – hand posture matching conditions conjointly showed increased modulation in several left hemispheric regions of the superior and inferior parietal lobules (including aIPS), the middle occipital gyrus, and the inferior temporal gyrus. Comparison of hard versus easy conditions selectively modulated the left inferior frontal gyrus, with peak activity located in its opercular part (Brodmann area (BA) 44). We suggest that in the human brain, vPMC/BA44 is involved in the matching of hand posture configurations in accordance with visual and functional demands.

Introduction

In humans, goal-based object-related movements play a significant role in our everyday lives. These complex movements are composed of several components, such as the reach, the grasp, and the manipulation part of the action, that in concert contribute to the desired goal-directed movement. Evidence is accumulating that the neural network underlying transitive movements is very complex, and that different movement components may be subserved by different neural regions [1]–[4]. In this study we focus on the grasp part of the action, in particular on the selection of the proper hand posture to functionally interact with a tool object, and we aim to determine the neural correlates involved in this matching process.

Successful grasping involves the transformation of intrinsic object properties into motor actions [5]. Visual inspection of the object’s characteristics (size, shape, weight, texture) as well as the object’s position (distance, angle) will activate the proper motor schemas and shape the hand posture for an adequate reach and grasp movement. In monkeys, visuomotor transformations for grasping have been associated with two key cortical areas: area F5, the rostral part of the monkey ventral premotor cortex, and area AIP, the rostral part of the intraparietal sulcus [6]. Inactivation studies of both areas resulted in impaired shaping of the hand relative to the object’s size and shape [7], [8]. Based on the characteristics of neurons in F5 and AIP, Fagg and Arbib proposed a model in which AIP uses visual input to highlight object features that are relevant for grasping, whereas area F5 selects the most appropriate grasp as a function of the relevant constraints (visual information, task information, instructions). This decision is then relayed back to AIP, which focuses on the selected grasp and continually reinforces its inputs, while F5 governs the motor execution and monitors the planned preshape and grasp [9].

In the human brain, the putative homologue of monkey AIP has been identified as the anterior segment of the intraparietal sulcus, commonly termed aIPS. Binkofski et al. documented selective deficits in the coordination of finger movements during object grasping in patients with lesions involving the aIPS [10]. These observations have been corroborated by neuroimaging studies in which healthy participants performed simple prehensile actions [2], [10]–[13]. However, the human aIPS has also been associated with action planning, recognition of goal-directed hand-object movements, and motor semantics [14]–[19].

The putative human homologue of the monkey F5 area has been identified as the pars opercularis, the posterior part of the inferior frontal gyrus, also described as the ventral premotor cortex (vPMC). More specifically, the pars opercularis appears to be implicated in the imitation of goal-oriented actions [20] and in the observation of realized prehensile actions [21], [22] and action sequences [23], and it is able to code action content in an abstract, modality-independent fashion [24].

The findings seem to suggest that the role of the human fronto-parietal grasping system may be more complicated than that of the monkey, as both regions in the human brain seem especially sensitive to the conceptual high-level components of the transitive action. Any comparison between transitive gestures in human and non-human primates should take into account the much higher complexity of hand-related functions in humans. Only recently, a systematic comparison between human and non-human primates on the cognitive capacities deemed crucial to tool use concluded that human tool use reflects a profound discontinuity between us and our closest relatives [25].

In order to differentiate the contributions of aIPS and vPMC, several studies have focused on grasp selection which, according to the Fagg & Arbib model, should be subserved by the vPMC region. Grèzes et al. aimed to identify regions that responded to different grasp observation and execution conditions in a paradigm that required the selection of a power grip or a precision grip [26]. They found that the left vPMC and inferior frontal gyrus (IFG) Brodmann area (BA) 44 were selectively modulated during gesture imitation and gesture execution in response to objects. Buxbaum et al. compared the neural response during the selection of prehensile and non-prehensile hand postures for functional object use, versus prehensile postures used for grasping [1]. Difficulty of the experimental conditions was equated in terms of accuracy and response time. Significantly greater activations were indeed reported in the left IFG, but also in the posterior superior temporal gyrus (pSTG) and inferior parietal lobule (IPL), in the non-prehensile use condition as compared to the (prehensile) grasp condition. No differences were reported between the prehensile use condition and the (prehensile) grasp condition, and comparison of the non-prehensile use condition and the prehensile use condition revealed a difference in the left IPL only. Buxbaum and colleagues interpreted their data as confirming the left IPL as a repository of hand postures for functional use [1]. A recent study by Makuuchi et al. compared the neural correlates of mimed object grasping in which the volunteers used the same or a different grip type on the second presentation of an identical object. In the ‘different’ condition, taken to reflect increased selection demands, involvement of the vPMC, aIPS, and posterior inferior temporal gyrus (pITG) was found. Subsequent effective connectivity analysis suggested to the authors that the vPMC integrates the neural information of different regions (including aIPS, pITG, and dorsolateral prefrontal cortex (DLPFC)) to select the hand posture [27].

Taken together, the latter studies suggest that if grip selection is part of the grasping task, involvement of the ventral premotor region is more likely, although additional posterior parietal activation, in particular around the aIPS, is frequently observed. As a result, different interpretations for the role of putative human AIP and F5 in grasp selection have been proposed.

The aim of the present study was to focus on tool object – hand posture matching and to determine which brain areas respond to increased matching demands. Volunteers were shown pictures of tool objects followed by pictures of hand postures that did or did not match the functional use of the object (see Figure 1 for an overview of the paradigm; more details are provided in the Methods section). Their match/mismatch decision was registered with a button press. All presented stimuli were static and the task did not require actual or pantomimed grasping within the scanner environment, thus eliminating effects of motion and motor execution. The difficulty level of the experimental conditions was manipulated by selecting tool object – hand posture decisions between or within grip types in order to make the decision easy or hard. We hypothesized that conditions placing higher demands on the differentiation of hand posture and finger composition would show enhanced modulation in the neural region responsible for hand posture selection, and that this region would most likely correspond to the ventral premotor cortex.

Figure 1. Structure of the paradigm and examples of the four conditions.

In the experimental conditions, participants had to decide as quickly as possible whether the hand posture matched the functional use of a previously shown tool object (Match, Mismatch Easy, Mismatch Hard). In the Control condition, the volunteers had to decide whether both pictures were identical. ISI = inter-stimulus interval.

https://doi.org/10.1371/journal.pone.0070480.g001

Results

Behavioral Data

A repeated measures analysis of variance on the accuracy data revealed a significant effect of condition, F(3,14) = 161.97, p<.001. As illustrated in Figure 2, pairwise post-hoc comparisons indicated that performance accuracy of the Control and Mismatch Easy conditions differed significantly from the Match and Mismatch Hard conditions. No significant accuracy differences were found between Control and Mismatch Easy, and between Match and Mismatch Hard.

Figure 2. Behavioral performance.

The left-hand graph depicts percent accuracy scores in the four conditions; the right-hand graph illustrates the conditions’ reaction times. Results of the post hoc paired-sample t-tests are indicated above the bars. * indicates a p-value <.001. Error bars represent 95% confidence intervals of the mean.

https://doi.org/10.1371/journal.pone.0070480.g002

A similar analysis of the response times (defined as time since the first image of the sequence) also revealed a significant effect of condition, F(3,14) = 30.53, p<.001. Post-hoc paired sample t-tests revealed significant differences between all conditions, except between Match and Mismatch Hard (Figure 2).

Importantly, these analyses confirmed a significant difference between the Mismatch Easy and Mismatch Hard conditions, with the latter showing an increased response time and reduced accuracy score. In addition, both Within grasp type choices (Match and Mismatch Hard) showed very similar accuracy and response speed data.
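These statistics can be reproduced in any standard package; the following is a minimal sketch in Python, assuming the trial data have already been aggregated into a long-format table with columns subject, condition, accuracy, and rt. The column names, file name, and the use of statsmodels/scipy are illustrative assumptions, not the authors' analysis code.

from itertools import combinations

import pandas as pd
from scipy import stats
from statsmodels.stats.anova import AnovaRM

# Long-format behavioral table: one row per subject x condition with mean
# accuracy and reaction time (columns: subject, condition, accuracy, rt).
behav = pd.read_csv("behavioral_data.csv")  # hypothetical file name

# Repeated-measures ANOVA with condition (Match, Mismatch Easy, Mismatch Hard,
# Control) as the within-subject factor, run separately for both measures.
for dv in ("accuracy", "rt"):
    res = AnovaRM(behav, depvar=dv, subject="subject", within=["condition"]).fit()
    print(dv, res.anova_table, sep="\n")

# Post-hoc pairwise paired-sample t-tests, here on the reaction times.
wide = behav.pivot(index="subject", columns="condition", values="rt")
for a, b in combinations(wide.columns, 2):
    t, p = stats.ttest_rel(wide[a], wide[b])
    print(f"{a} vs {b}: t = {t:.2f}, p = {p:.4f}")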

Neuroimaging Data

The results of the conjunction of the experimental tasks compared to the picture matching control task are listed in Table 1 and depicted in Figure 3A. This conjunction analysis revealed a uniquely left lateralized occipito-temporo-parietal activation pattern. Posterior parietal activation was observed in the aIPS region, as well as in the supramarginal gyrus and the superior parietal lobule. Note that no frontal activation survived this contrast.

Figure 3. Group statistical maps for the contrasts of interest.

A. Activation maps of the conjunction analysis comparing each experimental condition versus control at alpha(FDR)<0.05. B. Activation maps of the Mismatch Hard>Mismatch Easy contrast at alpha(FDR)<0.05. C. Activation maps of the Within>Between Grasp type choice conditions at alpha(FDR)<0.05. FG: fusiform gyrus; ITG: inferior temporal gyrus; MOG: middle occipital gyrus; POTZ: parieto-occipital transition zone; SMG: supramarginal gyrus; aIPS: anterior intraparietal sulcus; SPL: superior parietal lobule; vPMC: ventral premotor cortex.

https://doi.org/10.1371/journal.pone.0070480.g003

Direct comparison of the hard versus the easy mismatch hand posture – object matching task revealed selective left hemispheric modulation over the inferior frontal gyrus. Peak activity in this cluster was found over the opercular part (Brodmann area 44; See Table 1 and Figure 3B). Contrasting both more difficult Within Grasp type choice conditions (Match and Mismatch Hard) with the Between Grasp type choice condition (Mismatch Easy) resulted in a similar activation pattern with peak activity in the frontal operculum, BA44 (See Table 1 and Figure 3C).

Discussion

Compared to the picture matching control task, the experimental tool object – hand posture matching conditions jointly modulated several regions in the posterior part of the left hemisphere. This strong leftward activation during a task related to praxis in right-handers is in agreement with neuropsychological and neuroimaging research [28]–[31].

As expected, increased modulation of the aIPS in the lateral bank (IPL) was obtained, and this region has repeatedly been implicated in prehensile movements and grasping intentions [2], [10]–[19], [32]–[34]. Inferior and lateral to this region, enhanced modulation in the supramarginal gyrus (SMG) was observed. This region on the convex portion of the inferior parietal lobule has been reported in paradigms comparing the observation and (imagined) manipulation of familiar as opposed to unfamiliar tools [18], [35]. The SMG is associated with ideomotor apraxia and is believed to store representations of the limb and hand subserving skilled object-related actions [36]–[39]. As our paradigm presented familiar objects and explicitly referred to hand postures necessary for their use, activation of this area is not unexpected. Two additional parietal regions, but now associated with a more dorsal position within BA 7, showed increased modulation in the experimental tasks. The first is positioned in the more anterior portion of the superior parietal lobule. Paradigms that activate this region are mainly concerned with shifts in spatial attention to moving targets [40]–[43], and with contrasts pertaining to the effect of perspective in action observation research [44], [45]. The second region is located more posteriorly and inferiorly within BA 7 and is described as a parieto-occipital transition zone (POTZ) in Mai et al. [46]. The junction between occipital and parietal cortex has been associated with severe misreaching in patients with so-called optic ataxia, a deficit in motor control characterized by poor and awkward reach trajectories and grasping of objects in the peripheral visual fields [47]. These neuropsychological findings tally with neuroimaging studies showing that the activity in POTZ reflects coding of reach direction and the transport component of reaches [32], [48], [49]. Activation of both dorsal regions in this contrast seems to suggest that our participants may have imagined reaching for and grasping the presented objects in order to comply with the experimental tasks.

Extra-parietal modulation was unveiled in the left occipital (middle occipital gyrus, MOG) and temporal lobes (inferior temporal gyrus, ITG, and fusiform gyrus, FG). Neural activation caused by object stimuli is likely to be reflected in visual areas that are concerned with object recognition, such as the fusiform cortex and the lateral occipital complex [14], [35], [50]–[52]. This would explain the activation in occipital and inferior temporal (including fusiform) regions in our volunteers. On the other hand, it can be argued that a similar kind and number of objects were used in the control condition. Why then would there be higher modulation of these ventral regions in the experimental conditions? A possible explanation could be that in the experimental tasks the focus is not only on object identification, but also on the motor affordances of the depicted object, as the participant has to compare the object’s structure against a hand posture. Research has shown that motor affordances are most readily determined by the object’s physical appearance, rather than by its conceptual information [53]. In addition, it has been suggested that the processing carried out in the fusiform gyrus may be more responsive to the object’s structure than to its meaning [54]. If we combine these two lines of evidence, it becomes plausible that modulation during the experimental tasks would be elevated, at least in the fusiform gyrus, because it is the object’s structure that conveys the most relevant information to solve the task. Note that in this conjunction analysis no frontal activation, in particular of the vPMC, was encountered. This was mainly due to the fact that in the ‘Mismatch Easy>Control’ part of the conjunction no significant vPMC activation was obtained.

The behavioral data revealed a successful manipulation of the mismatch conditions’ difficulty level. Selecting a mismatch between posture types resulted in higher accuracy scores and faster response times than deciding on a mismatch within a hand posture type. In the Within Grasp type choice conditions, the Match condition appeared to be as difficult as the Mismatch Hard condition, as subtle differences within hand posture types had to be considered here too. Comparison of easy versus more difficult conditions was taken to reflect selective modulation in those brain areas that would have to deal with this increased task demand. In the Mismatch Hard>Mismatch Easy contrast, substantial response to task difficulty was elicited in the left ventral premotor cortex, in particular in the pars opercularis (BA 44) of the inferior frontal gyrus. The same region was active in the more general Within>Between Grasp type choice contrast. These findings are in agreement with other studies that targeted the hand posture selection process and found vPMC activation among other activated regions [26], [27], [37]. The merit of the present study is that it highlights the selective response of this region to differing demands in the discrimination of hand posture choice. The selective involvement of vPMC in hand posture discrimination relative to object properties remains in agreement with the functional role of primate F5 as proposed by Fagg & Arbib [9], despite the increased complexity of transitive actions in humans. Rizzolatti et al. reported that of all the neurons active during grasping in the macaque’s F5 region, 85% were selective for specific types of prehension, the most frequent being a precision grip [55]. In humans, precision grips also revealed stronger modulation in the vPMC/BA 44 area (among other regions) compared to power grips, in particular when small rather than excessive grip forces were applied [56], [57]. In addition, it has been shown that the usual muscle-specific vPMC-PM interactions that appear during grasp preparation were significantly reduced following aIPS perturbation (TMS), and that this disruption was behaviorally associated with a reduced grasp-specific pattern of digit muscle activity [58]. These findings and the results of the current study suggest that stronger demands on task-related muscular configurations, whether reflecting finger movement, hand posture, or fingertip force control, appear to engage a primate’s ventral premotor cortex. Future research should determine this region’s selectivity for prehensile (as compared to non-prehensile) object-related gestures, provide more direct proof of a close relation between parametric variation in motor-muscular complexity (computational demand) and vPMC BOLD response, and ascertain whether these demands reveal multiple vPMC representations for separate transitive qualities such as posture, movement, or force.

Methods

Stimuli

Twenty-eight familiar tool objects were selected; the full list can be found in Appendix S1. Functional use of these tools requires a power grip (n = 12), precision grip (n = 10), poke posture (n = 3), or palm posture (n = 3). Healthy participants can reliably associate these four hand postures with the use of objects, and in daily life prehensile postures are more common than non-prehensile postures [59]. Each object was photographed in a comfortable right-hand grasp position using a Canon EOS 300D digital reflex camera. The right-handed experimenter (GV) then grasped the object in a functional manner and carefully removed the object from his grip while maintaining the hand posture for that particular object’s use. A still picture of that hand posture was then taken. All object and hand posture stimuli were depicted on a neutral grey background. Based on these 28 static pictures of objects and their corresponding 28 functional hand postures, four conditions were created (Figure 1). For the first two conditions, an object was paired with a hand posture that was compatible with its functional use (for example a dart with a precision grip), but that could be either the proper precision grip for that particular object’s use (Match decision) or an incorrect precision grip (Mismatch decision). As these conditions require discriminations between compatible grasps, they are referred to as ‘Within Grasp type choice’. In both cases, the decision has to be based on a careful consideration of the correspondence between the object’s size, shape, and inclination and the precise hand posture (finger or clench aperture, hand inclination, etc.). These conditions are likely to be difficult, and this mismatch decision is referred to as the Mismatch Hard condition. In a third condition, each object was paired with a hand posture that belonged to a different hand posture category, for example a key (precision grip) combined with a power grip posture; this condition is referred to as ‘Between Grasp type choice’. Here the mismatch between object and hand posture was relatively easy to determine, and the condition was therefore described as Mismatch Easy. Finally, in a Control condition, an object or a hand posture image was paired with either the same or a different object or hand posture picture, respectively. In this condition, we coupled 14 object-object pairs and 14 hand-hand pairs, so that the visual input was identical across conditions. Thus four sets of 28 stimulus pairs were created that made up the four conditions of the experiment.
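To make the pairing logic concrete, the following is a minimal Python sketch of how Match, Mismatch Hard, and Mismatch Easy pairs could be derived from the photographed objects and their grip categories. The grip labels, object names, and file names are illustrative assumptions; the actual 28-item list is given in Appendix S1, and the authors' stimulus lists were constructed by hand rather than by this code.

import random

# Hypothetical mapping from object name to the grip type its use requires;
# the real list has 12 power, 10 precision, 3 poke, and 3 palm objects.
GRIPS = {"dart": "precision", "key": "precision",
         "hammer": "power", "saw": "power",
         "doorbell": "poke", "keyboard": "poke",
         "iron": "palm", "sponge": "palm"}

def make_pairs(condition):
    """Pair each object photo with a hand-posture photo for one condition."""
    pairs = []
    for obj, grip in GRIPS.items():
        if condition == "match":
            hand = obj                              # the posture photographed for this object
        elif condition == "mismatch_hard":          # wrong exemplar of the *same* grip type
            hand = random.choice([o for o, g in GRIPS.items() if g == grip and o != obj])
        elif condition == "mismatch_easy":          # posture from a *different* grip type
            hand = random.choice([o for o, g in GRIPS.items() if g != grip])
        pairs.append((f"object_{obj}.jpg", f"hand_{hand}.jpg"))
    return pairs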

Participants

Seventeen healthy volunteers participated in the study (age range: 20–40 years, mean age: 23.3 years; 11 women and 6 men). All were right-handed as determined by the Edinburgh Handedness Inventory (M = 93.6%, SD = 8.8%) [60], and none had a history of neurological or psychiatric disease. Scanning protocols were approved by the Ethics Committee of the University Hospital Ghent and all subjects gave written informed consent after the experimental procedure had been explained to them.

Procedure

Prior to scanning, the volunteers completed a pre-scan MRI-safety questionnaire and the Edinburgh Handedness Inventory. They were instructed that each experimental trial would start with a blue fixation cross. After the fixation cross, they would see a picture of a tool object followed by a picture of a hand posture, and they would have to decide as quickly as possible whether the hand posture shown corresponded to the functional use of the previously presented object. If the trial started with a red fixation cross, the pictures could depict two consecutive objects or hand postures; in that case they would have to decide whether both images were identical or different. If they decided that the hand posture matched the functional use of the paired object, or that both stimuli were the same, they had to press the right button of an MR-compatible button box with their left index finger. If they felt that the hand posture did not match this object’s use, or that both pictures depicted different objects or hand postures, they had to press the left button with their left middle finger. We made it clear that accuracy was more important than speed, but that, once decided, a timely response had to be made.

The volunteers were positioned head first and supine in the magnet with their left and right arms placed alongside the body on the scanner table. The button box was placed on the scanner table under the left hand and was controlled with the middle and index fingers. Participants were reminded that MR imaging is very sensitive to movement and were required to restrict head movements and to lie as still as possible in order to prevent motion artifacts. Their heads were gently fixed in place with foam cushions and stimuli were presented through goggles with an MRI-compatible presentation system (VisuaStim-Digital, Resonance Technology Inc., California, USA).

Stimulus presentation and response recording were controlled by a commercially available experiment generator (Presentation, Neurobehavioral Systems Inc., Albany, CA, USA). Each trial started with a 2000 ms fixation cross (blue cross: tool object – hand posture match; red cross: picture match). Next, the first picture of the stimulus pair appeared on the screen for 2000 ms, followed by a variable interval (mean interval time = 200 ms). After the interval, the second picture appeared for 2000 ms. Each stimulus pair was shown twice: 4 conditions × (2 × 28 stimulus pairs) = 224 trials. The paradigm was arranged as a permuted block design with four conditions: Match, Mismatch Easy, Mismatch Hard, and Control. A permuted block design was chosen to avoid psychological confounds associated with traditional block designs, such as habituation and anticipation. In comparison with event-related designs, permuted block designs also obtain advantageous trade-offs between efficiency, detection power, and conditional entropy or randomness. Permutation was achieved by exchanging the positions of two randomly chosen events in a classic block design, which with each iteration (n = 100) became increasingly random [61]. In total, the experiment took 23 minutes (224 trials of 6200 ms each), and stimuli were randomly distributed over their conditions’ (permuted) blocks. In the post-scan session, participants completed a post-scan MRI safety questionnaire and were debriefed.
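As an illustration of this permutation scheme, the sketch below (Python; the seed and variable names are arbitrary, and the authors' actual implementation is not published) starts from a classic block ordering of the 224 trials and swaps two randomly chosen trials on each of the 100 iterations, following the logic described by Liu [61].

import random

CONDITIONS = ["Match", "MismatchEasy", "MismatchHard", "Control"]
TRIALS_PER_CONDITION = 56   # 2 x 28 stimulus pairs per condition

# Classic block design: all trials of one condition, then the next, etc.
sequence = [c for c in CONDITIONS for _ in range(TRIALS_PER_CONDITION)]

random.seed(0)                              # for reproducibility of the illustration
for _ in range(100):                        # n = 100 permutation iterations
    i, j = random.sample(range(len(sequence)), 2)
    sequence[i], sequence[j] = sequence[j], sequence[i]

print(sequence[:20])                        # first 20 of the 224 trials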

Data Acquisition

Scanning was performed at 3.0 T on a Siemens Trio MRI scanner (Siemens Medical Systems, Erlangen, Germany) equipped with echo planar imaging (EPI) capabilities and using an 8-channel phased-array head coil for radio frequency transmission and signal reception. After automatic shimming of the magnetic field for each participant, a 3-D high-resolution T1 anatomical image of the whole brain was acquired in the sagittal plane for coregistration with the functional images (3D MPRAGE, 176 slices, slice thickness = 0.9 mm, in-plane resolution = 0.9×0.9 mm, TR = 2530 ms, TE = 2.58 ms). Next, 560 functional EPI images in the axial plane were acquired for the matching paradigm with the following parameters: TR = 2.5 s, TE = 33 ms, flip angle = 90°, 33 slices, slice thickness = 2.5 mm, slice gap = 1.25 mm, FOV = 192 mm, matrix = 64×64, resulting in a resolution of 3×3×2.5 mm.
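As a quick consistency check (an illustration, not part of the reported analysis), the functional run length implied by these parameters covers the paradigm duration described above:

n_volumes, tr_s = 560, 2.5        # EPI volumes and repetition time (s)
n_trials, trial_s = 224, 6.2      # trials and trial duration (6200 ms)
print(n_volumes * tr_s / 60)      # 23.3 min of EPI acquisition
print(n_trials * trial_s / 60)    # ~23.1 min of task, i.e. the "23 minutes" in the text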

Image Analysis

Data analysis was performed using Brain Voyager QX for preprocessing and statistical inference [62]. Functional data were subjected to a standard sequence of preprocessing steps comprising slice scan time correction by means of sinc interpolation, 3-D motion correction by spatial alignment to the first volume (also by means of sinc interpolation), and temporal filtering using linear trend removal and high-pass filtering for low-frequency drifts of 3 or fewer cycles. Spatial smoothing with a Gaussian filter (FWHM = 8 mm) was applied for the volume-based analysis. The anatomical data for each subject were resampled to a 1×1×1 mm resolution. Transformation into Talairach standard space was performed in two steps. In the first step, the cerebrum was translated and rotated into the AC-PC plane (AC = anterior commissure, PC = posterior commissure). In the second step, the borders of the cerebrum were identified; together with the AC and PC points, these landmarks were used to fit the brain into standard space. We used sinc interpolation as the transformation method, as it applies no implicit smoothing. The functional data for each subject were coregistered with the subject’s 3-D anatomical dataset and transformed into Talairach space. After coregistration, a volume time course of the functional data was created and resampled to a cubic voxel of 3×3×3 mm.
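For illustration, the FWHM value translates into a Gaussian filter width as follows. This NumPy/SciPy sketch is only a stand-in for the corresponding Brain Voyager QX step; the array shape is a placeholder and the 3 mm voxel size follows the resampling described above.

import numpy as np
from scipy.ndimage import gaussian_filter

FWHM_MM = 8.0                                   # smoothing kernel from the text
VOXEL_MM = 3.0                                  # resampled functional voxel size
# Convert FWHM to the standard deviation of the Gaussian, in voxel units.
sigma_vox = FWHM_MM / (2.0 * np.sqrt(2.0 * np.log(2.0))) / VOXEL_MM   # ~1.13 voxels

volume = np.random.rand(64, 64, 46)             # placeholder single volume in 3 mm space
smoothed = gaussian_filter(volume, sigma=sigma_vox)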

For each subject’s paradigm, a protocol file was derived representing, for each trial of the different conditions, the period from the onset of the stimulus until the participant’s response. Factorial design matrices were automatically defined from the created protocols. The BOLD response in each condition was modeled by convolving these neural functions with a canonical hemodynamic response function (gamma) to form covariates in a General Linear Model (GLM). After the GLM had been fitted and the effects of temporal serial correlation allowed for (using AR(1) modeling, see [63]), group (random effects procedure) t-maps were generated to evaluate the effects of hand posture – object matching. First, we determined the general effect of hand posture selection by performing a conjunction analysis of all posture-object match conditions compared to the control (picture match) condition: (Match>Control) ∩ (Mismatch Easy>Control) ∩ (Mismatch Hard>Control). This conjunction analysis was executed as a whole-brain analysis. Second, we directly contrasted the easy and hard mismatch conditions to determine the neural correlates of more demanding hand posture selection: Mismatch Hard>Mismatch Easy. The comparison between Mismatch Hard and Mismatch Easy is the most straightforward comparison, as both conditions require exactly the same response, namely a negative decision (mismatch) followed by a left middle finger press. We also performed the more general contrast of Within>Between Grasp type choice, namely Match + Mismatch Hard > 2 × Mismatch Easy. Indeed, the behavioral data revealed a similar difficulty level for the Match and the Mismatch Hard conditions (see Results), which is not unexpected given that both conditions reflect a within grasp type decision. For all analyses, we used a threshold of p<.05 corrected for multiple comparisons using False Discovery Rate (FDR) correction [64]. Areas of significant activation were identified using the brain atlases of Mai et al. [46] and Talairach and Tournoux [65].
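A minimal sketch of the regressor construction described above is given below (Python/NumPy). The HRF parameters, the example onsets and durations, and the plain least-squares fit are illustrative assumptions; they are not Brain Voyager's exact implementation, which additionally models AR(1) serial correlation.

import numpy as np
from scipy.stats import gamma

TR = 2.5
n_scans = 560
frame_times = np.arange(n_scans) * TR

def gamma_hrf(t, peak=6.0):
    """Simple canonical gamma HRF sampled at times t (seconds); peak is assumed."""
    h = gamma.pdf(t, a=peak)
    return h / h.sum()

def boxcar(onsets, durations):
    """Boxcar regressor (1 from onset to response, 0 elsewhere) on the scan grid."""
    reg = np.zeros(n_scans)
    for on, dur in zip(onsets, durations):
        reg[(frame_times >= on) & (frame_times < on + dur)] = 1.0
    return reg

hrf = gamma_hrf(np.arange(0, 32, TR))
onsets = np.array([2.0, 26.8])       # hypothetical trial onsets (s) for one condition
durations = np.array([4.2, 3.9])     # hypothetical onset-to-response durations (s)
regressor = np.convolve(boxcar(onsets, durations), hrf)[:n_scans]

# Design matrix: one such regressor per condition plus an intercept, then beta
# estimates via ordinary least squares for each voxel time course.
X = np.column_stack([regressor, np.ones(n_scans)])
y = np.random.rand(n_scans)          # placeholder voxel time course
betas, *_ = np.linalg.lstsq(X, y, rcond=None)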

Supporting Information

Appendix S1.

List of the familiar tool objects used in the paradigm.

https://doi.org/10.1371/journal.pone.0070480.s001

(DOC)

Author Contributions

Conceived and designed the experiments: GV JN PV. Performed the experiments: GV JN PH EV. Analyzed the data: GV JN. Contributed reagents/materials/analysis tools: PH EV PV. Wrote the paper: GV.

References

  1. Buxbaum LJ, Kyle KM, Tang K, Detre JA (2006) Neural substrates of knowledge of hand postures for object grasping and functional object use: Evidence from fMRI. Brain Research 1117: 175–185.
  2. Culham JC, Danckert SL, DeSouza JFX, Gati JS, Menon RS, et al. (2003) Visually guided grasping produces fMRI activation in dorsal but not ventral stream brain areas. Experimental Brain Research 153: 180–189.
  3. Culham JC, Kanwisher NG (2001) Neuroimaging of cognitive functions in human parietal cortex. Current Opinion in Neurobiology 11: 157–163.
  4. Daprati E, Sirigu A (2006) How we interact with objects: learning from brain lesions. Trends in Cognitive Sciences 10: 265–270.
  5. Jeannerod M, Arbib MA, Rizzolatti G, Sakata H (1995) Grasping objects: the cortical mechanisms of visuomotor transformation. Trends in Neurosciences 18: 314–320.
  6. Castiello U (2005) The neuroscience of grasping. Nature Reviews Neuroscience 6: 726–736.
  7. Fogassi L, Gallese V, Buccino G, Craighero L, Fadiga L, et al. (2001) Cortical mechanism for the visual guidance of hand grasping movements in the monkey: A reversible inactivation study. Brain 124: 571–586.
  8. Gallese V, Murata A, Kaseda M, Niki N, Sakata H (1994) Deficit of hand preshaping after muscimol injection in monkey parietal cortex. Neuroreport 5: 1525–1529.
  9. Fagg AH, Arbib MA (1998) Modeling parietal-premotor interactions in primate control of grasping. Neural Networks 11: 1277–1303.
  10. Binkofski F, Dohle C, Posse S, Stephan KM, Hefter H, et al. (1998) Human anterior intraparietal area subserves prehension: A combined lesion and functional MRI activation study. Neurology 50: 1253–1259.
  11. Frey SH, Vinton D, Norlund R, Grafton ST (2005) Cortical topography of human anterior intraparietal cortex active during visually guided grasping. Cognitive Brain Research 23: 397–405.
  12. Begliomini C, Nelini C, Caria A, Grodd W, Castiello U (2008) Cortical activations in humans grasp-related areas depend on hand used and handedness. PLoS ONE 3: e3388.
  13. Grafton ST, Fagg AH, Woods RP, Arbib MA (1996) Functional anatomy of pointing and grasping in humans. Cerebral Cortex 6: 226–237.
  14. Shmuelof L, Zohary E (2005) Dissociation between ventral and dorsal fMRI activation during object and action recognition. Neuron 47: 457–470.
  15. Hamilton AFD, Grafton ST (2006) Goal representation in human anterior intraparietal sulcus. Journal of Neuroscience 26: 1133–1137.
  16. Ortigue S, Thompson JC, Parasuraman R, Grafton ST (2009) Spatio-temporal dynamics of human intention understanding in temporo-parietal cortex: A combined EEG/fMRI repetition suppression paradigm. PLoS ONE 4: e6962.
  17. Vingerhoets G, Honoré P, Vandekerckhove E, Nys J, Vandemaele P, et al. (2010) Multifocal intraparietal activation during discrimination of action intention in observed tool grasping. Neuroscience 169: 1158–1167.
  18. Vingerhoets G, Acke F, Vandemaele P, Achten E (2009) Tool responsive regions in the posterior parietal cortex: Effect of differences in motor goal and target object during imagined transitive movements. Neuroimage 47: 1832–1843.
  19. Tunik E, Rice NJ, Hamilton A, Grafton ST (2007) Beyond grasping: Representation of action in human anterior intraparietal sulcus. Neuroimage 36: T77–T86.
  20. Koski L, Wohlschlager A, Bekkering H, Woods RP, Dubeau MC, et al. (2002) Modulation of motor and premotor activity during imitation of target-directed actions. Cerebral Cortex 12: 847–855.
  21. Molnar-Szakacs I, Iacoboni M, Koski L, Mazziotta JC (2005) Functional segregation within pars opercularis of the inferior frontal gyrus: Evidence from fMRI studies of imitation and action observation. Cerebral Cortex 15: 986–994.
  22. Johnson-Frey SH, Maloof FR, Newman-Norlund R, Farrer C, Inati S, et al. (2003) Actions or hand-object interactions? Human inferior frontal cortex and action observation. Neuron 39: 1053–1058.
  23. Molnar-Szakacs I, Kaplan J, Greenfield PM, Iacoboni M (2006) Observing complex action sequences: The role of the fronto-parietal mirror neuron system. Neuroimage 33: 923–935.
  24. Baumgaertner A, Buccino G, Lange R, McNamara A, Binkofski F (2007) Polymodal conceptual processing of human biological actions in the left inferior frontal lobe. European Journal of Neuroscience 25: 881–889.
  25. Vaesen K (2012) The cognitive bases of human tool use. Behavioral and Brain Sciences 35: 203–263.
  26. Grèzes J, Armony JL, Rowe J, Passingham RE (2003) Activations related to “mirror” and “canonical” neurones in the human brain: an fMRI study. Neuroimage 18: 928–937.
  27. Makuuchi M, Someya Y, Ogawa S, Takayama Y (2012) Hand shape selection in pantomimed grasping: Interaction between the dorsal and the ventral visual streams and convergence on the ventral premotor area. Human Brain Mapping 33: 1821–1833.
  28. Goldenberg G, Spatt J (2009) The neural basis of tool use. Brain 132: 1645–1655.
  29. Vingerhoets G, Acke F, Alderweireldt A-S, Nys J, Vandemaele P, et al. (2011) Cerebral lateralization of praxis in right- and left-handedness: Same pattern, different strength. Human Brain Mapping 33: 763–777.
  30. Haaland KY, Harrington DL, Knight RT (2000) Neural representations of skilled movement. Brain 123: 2306–2313.
  31. Buxbaum LJ, Kyle KM, Menon R (2005) On beyond mirror neurons: Internal representations subserving imitation and recognition of skilled object-related actions in humans. Cognitive Brain Research 25: 226–239.
  32. Cavina-Pratesi C, Monaco S, Fattori P, Galletti C, McAdam TD, et al. (2010) Functional magnetic resonance imaging reveals the neural substrates of arm transport and grip formation in reach-to-grasp actions in humans. Journal of Neuroscience 30: 10306–10323.
  33. Tunik E, Frey SH, Grafton ST (2005) Virtual lesions of the anterior intraparietal area disrupt goal-dependent on-line adjustments of grasp. Nature Neuroscience 8: 505–511.
  34. Gallivan JP, McLean DA, Smith FW, Culham JC (2011) Decoding effector-dependent and effector-independent movement intentions from human parieto-frontal brain activity. Journal of Neuroscience 31: 17149–17168.
  35. Vingerhoets G (2008) Knowing about tools: Neural correlates of tool familiarity and experience. Neuroimage 40: 1380–1391.
  36. Buxbaum LJ, Johnson-Frey SH, Bartlett-Williams M (2005) Deficient internal models for planning hand-object interactions in apraxia. Neuropsychologia 43: 917–929.
  37. Buxbaum LJ, Sirigu A, Schwartz MF, Klatzky R (2003) Cognitive representations of hand posture in ideomotor apraxia. Neuropsychologia 41: 1091–1113.
  38. Pelgrims B, Andres M, Seron X, Duhamel JR, Sirigu A, et al. (2005) Role of the left supramarginalis gyrus in coding hand postures for object use. Journal of Cognitive Neuroscience 17: 84.
  39. Tunik E, Lo OY, Adamovich SV (2008) Transcranial magnetic stimulation to the frontal operculum and supramarginal gyrus disrupts planning of outcome-based hand-object interactions. Journal of Neuroscience 28: 14422–14427.
  40. Gitelman DR, Nobre AC, Parrish TB, LaBar KS, Kim YH, et al. (1999) A large-scale distributed network for covert spatial attention: Further anatomical delineation based on stringent behavioural and cognitive controls. Brain 122: 1093–1106.
  41. Vandenberghe R, Gitelman DR, Parrish TB, Mesulam MM (2001) Functional specificity of superior parietal mediation of spatial shifting. Neuroimage 14: 661–673.
  42. Molenberghs P, Mesulam MM, Peeters R, Vandenberghe RRC (2007) Remapping attentional priorities: Differential contribution of superior parietal lobule and intraparietal sulcus. Cerebral Cortex 17: 2703–2712.
  43. Rushworth MFS, Paus T, Sipila PK (2001) Attention systems and the organization of the human parietal cortex. Journal of Neuroscience 21: 5262–5271.
  44. Hesse MD, Sparing R, Fink GR (2009) End or means: The “what” and “how” of observed intentional actions. Journal of Cognitive Neuroscience 21: 776–790.
  45. Vingerhoets G, Stevens L, Meesdom M, Honoré P, Vandemaele P, et al. (2012) Influence of perspective on the neural correlates of motor resonance during natural action observation. Neuropsychological Rehabilitation 22: 752–767.
  46. Mai JK, Paxinos G, Voss T (2008) Atlas of the Human Brain. Amsterdam: Elsevier.
  47. Karnath HO, Perenin MT (2005) Cortical control of visually guided reaching: Evidence from patients with optic ataxia. Cerebral Cortex 15: 1561–1569.
  48. Prado J, Clavagnier S, Otzenberger H, Scheiber C, Kennedy H, et al. (2005) Two cortical systems for reaching in central and peripheral vision. Neuron 48: 849–858.
  49. Vesia M, Prime SL, Yan XG, Sergio LE, Crawford JD (2010) Specificity of human parietal saccade and reach regions during transcranial magnetic stimulation. Journal of Neuroscience 30: 13053–13065.
  50. Chao LL, Haxby JV, Martin A (1999) Attribute-based neural substrates in temporal cortex for perceiving and knowing about objects. Nature Neuroscience 2: 913–919.
  51. Creem-Regehr SH, Lee JN (2005) Neural representations of graspable objects: are tools special? Cognitive Brain Research 22: 457–469.
  52. Grill-Spector K, Malach R (2004) The human visual cortex. Annual Review of Neuroscience 27: 649–677.
  53. Vingerhoets G, Vandamme K, Vercammen A (2009) Conceptual and physical object qualities contribute differently to motor affordances. Brain and Cognition 69: 481–489.
  54. Whatmough C, Chertkow H, Murtha S, Hanratty K (2002) Dissociable brain regions process object meaning and object structure during picture naming. Neuropsychologia 40: 174–186.
  55. Rizzolatti G, Camarda R, Fogassi L, Gentilucci M, Luppino G, et al. (1988) Functional organization of inferior area 6 in the macaque monkey. II. Area F5 and the control of distal movements. Experimental Brain Research 71: 491–507.
  56. Ehrsson HH, Fagergren A, Jonsson T, Westling G, Johansson RS, et al. (2000) Cortical activity in precision- versus power-grip tasks: An fMRI study. Journal of Neurophysiology 83: 528–536.
  57. Ehrsson HH, Fagergren A, Forssberg H (2001) Differential fronto-parietal activation depending on force used in a precision grip task: An fMRI study. Journal of Neurophysiology 85: 2613–2623.
  58. Davare M, Rothwell JC, Lemon RN (2010) Causal connectivity between the human anterior intraparietal area and premotor cortex during grasp. Current Biology 20: 176–181.
  59. Klatzky RL, McCloskey B, Doherty S, Pellegrino J, Smith T (1987) Knowledge about hand shaping and knowledge about objects. Journal of Motor Behavior 19: 187–213.
  60. Oldfield RC (1971) The assessment and analysis of handedness. Neuropsychologia 9: 97–113.
  61. Liu TT (2004) Efficiency, power, and entropy in event-related fMRI with multiple trial types. Part II: Design of experiments. Neuroimage 21: 401–413.
  62. Goebel R, Esposito F, Formisano E (2006) Analysis of Functional Image Analysis Contest (FIAC) data with BrainVoyager QX: From single-subject to cortically aligned group general linear model analysis and self-organizing group independent component analysis. Human Brain Mapping 27: 392–401.
  63. Bullmore E, Brammer M, Williams SCR, Rabe-Hesketh S, Janot N, et al. (1996) Statistical methods of estimation and inference for functional MR image analysis. Magnetic Resonance in Medicine 35: 261–277.
  64. Genovese CR, Lazar NA, Nichols T (2002) Thresholding of statistical maps in functional neuroimaging using the false discovery rate. Neuroimage 15: 870–878.
  65. Talairach J, Tournoux P (1988) Co-planar Stereotaxic Atlas of the Human Brain. Stuttgart: G. Thieme.