Research Article

Hands as Sex Cues: Sensitivity Measures, Male Bias Measures, and Implications for Sex Perception Mechanisms

  • Justin Gaetano (justin.gaetano@scu.edu.au), Rick van der Zwan, Duncan Blair, Anna Brooks

    Affiliation (all authors): Cognitive Neuroscience Research Cluster, Southern Cross University, Coffs Harbour, Australia
  • Published: March 06, 2014
  • DOI: 10.1371/journal.pone.0091032

Abstract

Sex perceptions, or more particularly, sex discriminations and sex categorisations, are high-value social behaviours. They mediate almost all inter-personal interactions. The two experiments reported here had the aim of exploring some of the basic characteristics of the processes giving rise to sex perceptions. Experiment 1 confirmed that human hands can be used as a cue to an individual’s sex even when colour and texture cues are removed and presentations are brief. Experiment 1 also showed that when hands are sexually ambiguous observers tend to classify them as male more often than female. Experiment 2 showed that “male bias” arises not from sensitivity differences but from differences in response biases. Observers are conservative in their judgements of targets as female but liberal in their judgements of targets as male. These data, combined with earlier reports, suggest the existence of a sex-perception space that is cue-invariant.

Introduction

The ability to quickly and accurately discriminate whether another individual is female or male is considered one of only a few automatic and thus fundamental aspects of person perception [1]. The development of perceptual processes capable of discriminating, at a distance, another’s sex conveys considerable advantage by priming a range of different behaviours [2]–[6]. Of course, for an observer to be able to discriminate another’s sex, there must exist cues that are sexually dimorphic and predictably so. The obvious candidates are the primary sex cues. In many animals, however, body posture or body morphology often conceals the genitalia from view. In upright humans the primary sex cues are not concealed by morphology but have nonetheless typically been obscured, probably since humans first took to wearing clothes some 170 000 years ago [7]. There are, however, other sexually dimorphic visual [8]–[12], auditory [13], and olfactory [14] cues that humans can exploit, either alone [15]–[19] or in combination [2], [20]–[23] to perform the task of sex discrimination. We do so, most of the time, with remarkable precision.

Of the modalities mentioned, the one for which the most extensive sex perception literature has developed is vision. Studies within that literature use either whole-body or partial-body representations to systematically explore the processes mediating sex perceptions. In the case of the whole-body literature, studies have generally focussed on sexually dimorphic structural [24], [25] and/or kinematic cues to sex [18], [26]–[29]. By comparison, studies of partial-body sex perceptions employ, almost exclusively, images of the face as stimuli [16], [19], [30], [31], although hands also have been used [32]. No matter which stimulus type is used, there are some interesting convergent findings. For example, whole-body representations [18], [33], faces [34]–[36], and hands [32] all can be used to elicit sex aftereffects. For whole-body and for face stimuli there also are reliable reports of a so-called male bias, a tendency to report normally dimorphic stimuli as looking male (rather than female) when the dimorphic cues are ambiguous [17], [18], [29], [37]–[42].

That pattern of convergence suggests the mechanisms processing sex cues might operate in such a way as to give rise to a multi-dimensional sex-perception space, analogous to the space already proposed for face perceptions [17], [43]–[45]. In such a space cues to “femaleness” and cues to “maleness” would converge, independent of their source. Rather than sex perceptions being mediated only by cue-dependent, or perhaps sense-dependent, mechanisms, such a space would be adaptive [46], taking into account all available and relevant information.

In addition to the apparent ubiquity of the male bias, the observations that object-to-face [47], body-to-face [48], and foot-fall-to-gait [23] sex aftereffects manifest certainly support that idea. There is some evidence, however, that the processes handling sex cues, similar or not, are independent of each other. For example, Kovács et al. [32] found sex aftereffects only when adaptor and test stimuli were the same body part – faces or hands. No effects were observed when adaptor and test respectively depicted faces and hands (or hands and faces). That is evidence against convergent processing, and suggests “femaleness” and “maleness” are stimulus-dependent.

With that in mind, the experiments reported here were designed to begin to explore the proposal that there exists a multi-dimensional sex-perception space built around all available sexually dimorphic cues. Such a space would be supported, in the cortex, by higher-order mechanisms onto which sensory processes converge. Those mechanisms, like any in the cortex, will give rise to predictable behaviours characteristic of sex perceptions. Thus, and in particular, it was hypothesised that if such a space exists, the male bias reported already for full-body and face perceptions will manifest also for hands. In comparison to the vast literature describing face perceptions, the literature around hands is sparse. The two experiments reported here unpack more fully observers’ sensitivity to hand-based sex cues and, in that context, the perceptual processes giving rise to that sensitivity.

Methods

Ethics Statement

All observers gave written, informed consent prior to participating in the study. Ethical approval for this study was granted by the Human Research Ethics Committee of Southern Cross University (ECN-10-115). This study complies with the ethical standards specified by the Declaration of Helsinki.

Observers and Apparatus

Originally, 12 experienced psychophysical observers (6 female, 6 male) participated in this study, but data corresponding to 1 male were discarded due to apparent response confusion. All trials were conducted in a light- and sound-attenuated psychophysics laboratory using a computer monitor linearised for luminance. Responses were recorded via key-press on a standard computer keyboard. Stimuli were presented using E-Prime software (Psychology Software Tools, 2011).

Stimuli

Digital photographs were taken of both the dorsal and palmar surfaces of 15 female and 15 male hands. Those photographs formed the basis of the stimulus sets used in these experiments. Hand models were excluded from participation as experimental observers. Hand posture was standardised (fingers together with thumb held close to the first finger). All adornments (bracelets, rings, and so on) were removed before photographs were taken. For standardisation purposes, each hand was placed on a grid of 1×1 cm squares, such that the middle finger was aligned both with the centre axis of the grid and the length of the forearm.

Across images, orientation was normalised and sex-stereotypic cues (e.g. long fingernails, tattoos, and scars) were removed digitally via Photoshop CS4 (Adobe Systems, 2008). To control for global size-based heuristics (i.e. observers might base their categorisations of sex solely on the appearance of each hand as ‘small’ or ‘large’), the absolute size of the stimuli was controlled. Each exemplar was scaled to a standard size whilst simultaneously maintaining its natural height-to-width proportions [16], [17].

Absolute size scaling was achieved using two techniques: In one stimulus set, the overall size (as indexed by total pixel count) was reduced to that of the smallest hand (44,693 px, ±10%); in the other, it was enlarged to that of the largest hand (89,394 px, ±10%). In the reduced set, the difference in pixel count between each standardised exemplar and the median was calculated separately for female and male hand stimuli. A mixed Analysis of Variance (ANOVA) confirmed that the variance across standardised pixel counts was the same irrespective of stimulus sex (F1,28 = 1.36, p = .253), stimulus surface (palmar, dorsal; F1,28 = 0.34, p = .565), or the interaction between those two factors (F1,28 = 0.58, p = .453).

The purpose of employing both the enlargement and reduction method was to reduce the likelihood that artefacts associated with one or the other method could be used to explain any observed results. For instance, there is the risk that manipulating the size of a digital image might result in the visible loss of that image’s clarity or integrity, meaning the largest (or smallest) exemplars are potentially degraded the most when reduced (or enlarged) to a standard size.
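The scaling step above amounts to multiplying both image dimensions by a single uniform factor, the square root of the ratio of the target pixel count to the current one, so that total pixel count changes while height-to-width proportions are preserved. A minimal sketch of that arithmetic (the function name and example dimensions are hypothetical, not taken from the stimulus set):

```python
import math

def scale_to_pixel_count(width, height, target_px):
    """Return new (width, height), preserving aspect ratio, such that
    the total pixel count approximates target_px."""
    current_px = width * height
    s = math.sqrt(target_px / current_px)  # uniform scale factor
    return round(width * s), round(height * s)

# e.g. reducing a hypothetical 300 x 400 px image toward the
# smallest-hand count of 44,693 px
new_w, new_h = scale_to_pixel_count(300, 400, 44693)
# new_w * new_h lands within the ±10% tolerance noted above
```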

Thus standardised for absolute size, stimuli were further manipulated in line with techniques from the face literature [17] to form two distinct sub-sets: one in which all hue and texture information was preserved (‘colour’ condition) and another in which those cues were removed (‘silhouette’ condition). In total then, the omnibus stimulus set comprised 240 images (30 hands [15 female, 15 male]×2 surfaces [dorsal, palmar]×2 conditions [colour, silhouette]×2 absolute size manipulations [reduced, enlarged]). Reduced stimulus exemplars are represented in Figure 1.


Figure 1. Stimulus exemplars used in these experiments.

Hand stimuli were standardised for absolute size via a process of reduction (depicted) and enlargement. Within each stimulus condition 15 female and 15 male exemplars were represented. Each reduced and enlarged image had an absolute pixel count of 44,693 px and 89,394 px respectively (±10%).

doi:10.1371/journal.pone.0091032.g001

Procedure

Both experiments used a two-alternative forced-choice design (2AFC) but differed with respect to response options and subsequent analyses. Each trial comprised in chronological order: a blank screen for 1000 ms, a stimulus presentation lasting 125 ms or 1000 ms, and a response screen (centred cross, +, on black background) that extinguished when either the observer made a response or 1000 ms had passed. The order in which blocks were presented was counterbalanced, as was the response key associated with each of the alternatives.

In Experiment 1, colour and silhouette images were presented in randomised order in blocks defined by presentation duration (125 ms or 1000 ms) and absolute size manipulation (reduced or enlarged). The observer’s task after each presentation was to indicate via key press whether the image represented a ‘female’ or ‘male’ hand. A subset of the stimuli used in Experiment 1 was employed for Experiment 2. To reduce the load on observers, and because no differences between the two different sized stimulus sets were observed (see Experiment 1: Results), the size-reduced stimulus set was arbitrarily selected for use. Similarly, because they were the most ambiguous stimuli used in Experiment 1, silhouette images were used in Experiment 2. Each image was presented twice across two target blocks (female/not female, male/not male) and two presentation durations (125 ms, 1000 ms) for a total of 240 trials.

Analyses

Performances were averaged across viewing surface (palmar, dorsal) and then the potential mediating variables of absolute size and observer sex were analysed via omnibus ANOVA. Performances by each observer were then calculated as an average across all trials in each condition of interest (hue/texture [colour, silhouette], stimulus sex [female, male], and presentation duration [125 ms, 1000 ms]). Predictions were formally tested via planned orthogonal contrasts [49] designed to test each hypothesis in each experiment.

In Experiment 1, the dependent variable was the mean proportion of sex classification errors (i.e. pressing the ‘male’ key in response to a female hand or the ‘female’ key in response to a male hand). Data from Experiment 2 were analysed for two factors: sensitivity and bias, indexed by d-prime (d′) and criterion (c) scores, respectively [50]. Sensitivity here represents the ability of an observer to distinguish between target present (female or male) trials and target absent trials. Sensitivity scores of zero indicate no sensitivity, or chance performance: Observers cannot distinguish between target present and target absent trials. Increasingly positive d′ scores indicate increased sensitivity, or increased ability to discriminate between target present and target absent trials. Bias represents the tendency for an observer to respond “yes” to a target (negative c scores) or “no” (positive c scores) independent of their sensitivity.
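The two indices follow the standard signal detection definitions [50]: d′ = z(H) − z(F) and c = −[z(H) + z(F)]/2, where H and F are the hit and false-alarm rates and z is the inverse of the standard normal cumulative distribution. A minimal sketch of the computation (the log-linear correction for extreme rates is a common convention, assumed here rather than specified in the text):

```python
from statistics import NormalDist

def dprime_and_criterion(hits, misses, false_alarms, correct_rejections):
    """Compute sensitivity (d') and criterion (c) from raw trial counts.
    A log-linear correction keeps z-scores finite when hit or
    false-alarm rates would otherwise be 0 or 1."""
    z = NormalDist().inv_cdf
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    d_prime = z(hit_rate) - z(fa_rate)
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))
    return d_prime, criterion

# hypothetical observer searching for "male" targets who answers
# "yes" liberally: sensitivity is positive, criterion is negative
dp, c = dprime_and_criterion(hits=50, misses=10,
                             false_alarms=30, correct_rejections=30)
```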

Experiments

Experiment 1

Results.

Data already exist suggesting hands are a useful cue to an individual’s sex [32], [51]. Experiment 1 was designed to establish some of the parameters mediating sex discriminations from hand cues and to test for the existence of a male bias. More specifically, this experiment addressed three questions: When discriminating sex using hand cues (i) Is sex discrimination accuracy mediated by presentation duration? (ii) Is sex discrimination accuracy mediated by the availability of colour and texture cues? (iii) Are the effects of presentation duration and colour and texture information equivalent for both female and male hands? It was hypothesised sex discrimination performances would be best when colour and texture cues were present at longer presentation durations. It was also predicted that as stimulus ambiguity increased (when colour and texture cues were absent at short presentation durations) discrimination accuracy would, consistent with the existence of a male bias, decline more for female hands than for male hands.

Proportions of sex classification errors were calculated for each condition. An initial two-way mixed ANOVA was used to compare errors across the two absolute size conditions (small, large) and between female and male observers. There were no significant differences in error rates between the two sets of normalised hands (F1,9 = 0.28, p = .608), or between female and male observers (F1,9 = 0.64, p = .443), and no significant interaction between those variables (F1,9 = 0.83, p = .387). With that in mind, data were collapsed across those two variables and mean error rates calculated for each “hue/texture” condition (colour, silhouette), for each stimulus sex (female and male), at both presentation durations. Those means are shown in Figure 2.


Figure 2. Sex classification error rates.

Group proportions of sex classification errors in response to female (a) and male (b) hand images presented for 125 ms and for 1000 ms. Performances are shown for hands with colour/texture cues (open circles) and without (silhouettes: filled circles). Vertical bars represent ±1 SEM. Chance performance is represented by the dashed line. Performances above that line represent a systematic tendency to misreport stimulus sex.

doi:10.1371/journal.pone.0091032.g002

As shown in Figure 2, when observers were presented with “coloured” hands (i.e. hue and texture preserved) error rates for female (panel a) and male (panel b) hands were similar within both presentation durations tested. Error rates for short presentation durations (female: 0.38±0.04; male: 0.36±0.05) were higher than the error rates observed at longer presentation durations (female: 0.32±0.04; male: 0.32±0.04). Nonetheless, performances on all those conditions were better than chance. These data confirm the findings reported by Kovács et al. [32]: Human observers can discriminate sex from hand cues alone and, with hue and texture cues available, can do so following just brief (125 ms) presentations.

When hue and texture cues were removed, performances on female and male hands diverged. Again, error rates at short presentation durations were higher than for longer presentation durations for female hands (125 ms: 0.62±0.05; 1000 ms: 0.61±0.04) and for male hands (125 ms: 0.27±0.05; 1000 ms: 0.26±0.06). Importantly, using the 2AFC paradigm employed here, error rates increased for judgements of female hands to levels greater than chance. Simultaneously error rates decreased for judgements of male hands when hue and texture cues were removed. Together these results suggest observers found sex discrimination using silhouette hands more difficult and, faced with ambiguity, shifted their response bias (see below) to increase their rate of “male” responding.

A set of four planned orthogonal contrasts tested the differences described above. There were significantly more errors made on short presentations than on long (F1,10 = 15.18, p = .002). Similarly, there were significantly more errors made on silhouette than on colour hands (F1,10 = 29.21, p<.001). Together, those two results suggest that hue and texture information and longer viewing times are important for accurate sex discrimination when using hand cues. Most importantly, while there were no differences between error rates on female and male hands when hue and texture cues were present (F1,10 = 0.05, p = .829), there was a significant difference between error rates on female and male silhouette hands (F1,10 = 13.03, p = .004). In other words, increasing ambiguity significantly changes the patterns of sex discrimination errors when using hand stimuli: Under conditions of high ambiguity, when hand sex cues are least salient, observers tend to report seeing male hands more often than female hands. Hue and texture seem to be critical cues for sex discrimination in the absence of absolute size information. A summary of observer performance in Experiment 1 is available as Dataset S1.

Discussion.

Experiment 1 was designed to investigate whether, when using hands as stimuli, (i) sex discrimination accuracy is mediated by presentation duration, (ii) sex discrimination accuracy is mediated by colour and texture cues, and (iii) those effects, if they exist, are equivalent for female and male hands. The data show that sex is discriminated more accurately at longer presentation durations and when colour and texture cues are available – conditions, in other words, under which sex cues are more clearly discernible. Interestingly, when cues were more ambiguous, performance on the sex discrimination task varied between female and male hands. Sex misclassification rates corresponding to female or male hands were respectively higher and lower than the 50% level expected if observers were guessing (Figure 2).

The systematically low error rates observed in the colour condition here are remarkable first of all because overt cultural sex cues (rings, nail polish, and so on) were unavailable to observers. It has been reported that infants as young as nine months of age can discriminate between adult female and male faces if sex-stereotyped cultural cues (e.g. clothing) are also visible [52]. Conversely, seven-year-old children could not visually distinguish sex from adult faces when cultural cues were minimised [37]. Clearly adult observers do not need such cues to discriminate sex at levels above chance. The low error rates in this experiment’s colour conditions are also exceptional because absolute hand size – a naturally dimorphic cue [51], [53] – was kept homogenous across stimuli. Thus, these data suggest that size is not a necessary cue and that hands can be used as an indicator of another’s sex, at least when colour and texture are available.

By comparison, performances in the absence of colour and texture cues diverged as a function of stimulus sex. Specifically, the rate at which observers miscategorised silhouette hands depended on the hand’s sex. Female hand silhouettes were more often misjudged to be male. Male hand silhouettes were less often judged to be female (Figure 2). This result is consistent with observations made previously using full-body and face-based stimulus sets [17], [18], [37], [38], [40] and may therefore represent a ubiquitous phenomenon. It is also, to the knowledge of these authors, the first demonstration of a male bias effect from hands that does not use either an adaptation or priming paradigm ([32] Figures 2(b) & 3 respectively).

What is not clear, from earlier reports or from this experiment, is the mechanism for that tendency. Experiment 2 was designed to explore the male bias observed in these data. Using a signal detection approach, we measured observer bias and sensitivity when discriminating sex from hand cues.

Experiment 2

Results.

Experiment 1 suggests that observers tend to report hands as looking “male” when cues that normally are sexually dimorphic are ambiguous. The aim of Experiment 2 was to explore the perceptual mechanisms mediating that bias in hands. In this experiment observers completed two signal detection tasks. In one, observers discriminated silhouette hands as female or not. In the other, observers discriminated silhouette hands as male or not. That technique makes it possible to discriminate whether the male bias manifests as the result of a difference in sensitivity to cues that signal female and male hands or from an observer bias.

It seems unlikely sensitivity differences will mediate the effect. For changes in sensitivity to be the cause, the variability of signal strength within whole-bodies, faces, and/or hands would need to differ as a function of stimulus sex. A more likely explanation for the male bias observed in Experiment 1 is a difference in observer bias. If observer bias is the mechanism driving the pattern of results reported here (and by inference in earlier studies), different response criteria should be observed for each target sex. More specifically, the pattern of responses observed in Experiment 1 can reflect a conservative or strict criterion when assigning a target as female, a liberal or loose criterion when assigning a target as male, or a combination of both.

Mean sensitivity performances for both target types at both presentation durations are shown in the main panel of Figure 3. Observers were able reliably to distinguish female targets from “noise”, and male targets from “noise” at both presentation durations. Most interestingly, when viewing silhouette hands observers were more sensitive at the shorter presentation duration (d’, female target = 0.58±0.17; male target = 0.66±0.15) than at the longer duration (d’, female target = 0.44±0.16; male target = 0.42±0.19). As shown in the inset panel of Figure 3, this trend persists when performance for female and male targets are averaged (d’, 125 ms = 0.62, ±0.14; 1000 ms = 0.43, ±0.14). Nonetheless, a post-hoc one-sample t test revealed that mean sensitivity at 1000 ms was greater than chance (t10 = 3.03, p = .013).
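The post-hoc test reported above is a one-sample t test of the group’s mean d′ against the chance value of zero. A minimal sketch of that test (the d′ scores below are hypothetical, not the study’s data):

```python
from math import sqrt
from statistics import mean, stdev

def one_sample_t(scores, mu=0.0):
    """One-sample t statistic testing whether the mean of `scores`
    differs from `mu` (here, chance-level sensitivity, d' = 0)."""
    n = len(scores)
    t = (mean(scores) - mu) / (stdev(scores) / sqrt(n))
    return t, n - 1  # t statistic and degrees of freedom

# hypothetical per-observer d' scores at the 1000 ms duration
t_stat, df = one_sample_t([0.4, 0.5, 0.3, 0.6, 0.5])
```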


Figure 3. Sex classification sensitivity for ambiguous silhouette hands.

Standardised group sensitivity (d′) scores, representing the ability to distinguish target from lure trials, as a function both of target sex (female and male) and whether hands were presented for 125 ms (open circles) or 1000 ms (filled circles). Data corresponding to silhouette hand stimuli conditions are depicted, although observers were also presented colour hands. Vertical bars represent ±1 SEM.

doi:10.1371/journal.pone.0091032.g003

A two-way ANOVA shows there is no effect of stimulus sex (F1,10 = 0.04, p = .845), no effect of presentation duration (F1,10 = 3.37, p = .096), and no significant interaction between those factors (F1,10 = 0.48, p = .506) on sensitivity. In other words, there were no differences in observers’ sensitivities for discriminating female and male hand targets. The small decline in sensitivity across the two presentation durations was not significant, and hand sex and presentation duration did not interact to affect observers’ sensitivities. So, any effect of change in stimulus ambiguity on patterns of responding when discriminating female and male hands is not attributable to changes in sensitivity.

Observer bias does change, however, and those changes can explain the male bias observed in Experiment 1. As shown in the main panel of Figure 4, observers’ bias scores when searching for female targets was unaffected by presentation duration (125 ms: c = 0.46±0.23; 1000 ms: c = 0.44±0.18). Both scores are conservative in that they show a tendency to say “no” or target not present when searching for female targets. By comparison, presentation duration did affect mean bias scores when searching for male targets. Observers were more likely to say a target was male on short presentations (c, 125 ms = −0.35±0.24) than they were on longer presentations (c, 1000 ms = −0.04±0.20).


Figure 4. Sex classification bias.

Standardised group criterion (c) scores, representing the tendency to respond ‘target absent’ (c>0) or ‘target present’ (c<0), as a function both of target sex (female and male) and presentation duration (125 ms: open circles; 1000 ms: filled circles). The inset shows the absolute mean bias score for both female and for male targets. Observers were generally male biased at both 125 ms (cdiff = 0.81±0.28) and 1000 ms (cdiff = 0.48±0.24) presentation durations. One-sample t tests indicated that the absolute mean bias was significant at the shorter (t10 = 2.92, p = .015), but not the longer (t10 = 1.99, p = .075) exposure time. Vertical bars represent ±1 SEM.

doi:10.1371/journal.pone.0091032.g004

There was a significant difference in bias scores across the sexes such that observers were more conservative in judging female hands than male hands: Observers were more willing to say a target was “male” when searching for male targets than they were to say a target was “female” when searching for female targets (F1,10 = 6.62, p = .028). That pattern did not change as a function of presentation duration (F1,10 = 2.34, p = .157) but there was a significant interaction between target sex and presentation duration (F1,10 = 5.37, p = .043). That is, observers showed no change in bias when looking for female targets at either presentation duration (c, 125 ms = 0.46±0.23; c, 1000 ms = 0.44±0.18): Observers were consistently conservative in their attribution of a stimulus as a female target. By comparison, observers were liberal in their attributions of a stimulus as a male target. At 125 ms observers were most likely to say a stimulus was a male target (c = −0.35±0.24). As presentation duration increased to 1000 ms, observers became more conservative but were still more liberal in their target ascriptions than they ever were for female targets (c = −0.04±0.20). A summary of the sensitivity and bias data is available as supplementary material (see respectively Worksheet A & B, Dataset S2).

Discussion.

The aim of Experiment 2 was to explore the perceptual mechanisms mediating the male bias observed in sex discriminations of ambiguous hands. The data reported here show observers were able reliably to distinguish female targets from “noise”, and male targets from “noise” both at short (125 ms) and long (1000 ms) presentation durations when viewing silhouette hands. Importantly, there were no differences in observers’ sensitivities for female and male hand targets. There were target sex differences in response biases however. Just as reported in Experiment 1 (using a 2AFC paradigm) the data in Experiment 2 (this time using a ‘yes/no’ signal detection paradigm) show the presence of a male bias. Under the most ambiguous conditions we tested, observers were conservative in their judgements of the presence of female targets, and that conservatism was not affected by presentation duration. When judging male hands, by comparison, observers were systematically liberal in their willingness to assign a target as “male” at shorter presentation durations, becoming less liberal as presentation duration increased.

These results confirm the reliability of hands as a sex cue available to observers, even in the absence of cultural and other features. More importantly though, they demonstrate the more specific perceptual mechanisms mediating the male bias. The sex-divergent response pattern described above suggests there are implicit differences in the cost/benefit analyses applied to the consequences of potential errors when searching for each target type. One possible interpretation of those differences is that the cost of a “miss” when searching for male targets is high compared to the cost of a false alarm [2], [4], [39]. Conversely, when searching for female targets the cost of a miss is lower perhaps than a false alarm. Whether the same ratios apply for sex discriminations from whole-body and from face cues remains to be explored, but seems likely given the apparent ubiquity of the male bias effect.

In summary, hands, like a number of other sexual dimorphisms (e.g. [17], [18], [37], [39]), elicit in observers a male bias when normally salient dimorphic cues are ambiguous. That bias is mediated both by conservative criteria for judging a target as female and liberal criteria for judging a target as male. The effect is not mediated by sensitivity differences and seems to be a real perceptual bias.

General Discussion

The aim of the experiments reported here was to explore the proposal that there exists a multi-dimensional sex-perception space built around all available sexually dimorphic cues. One characteristic of such a space is that sex perceptions should arise not independently from cue-specific or even sense-specific processes, but should include also higher-order processes that are cue-independent. It was hypothesised that if a sex-perception space exists, the male bias that manifests during discriminations of other normally dimorphic cues [17], [18], [23], [38], [39], [40]–[42], [54] would also manifest for hands.

Experiment 1 established baseline performances for observers judging sex from human hands. The data show that observers could reliably discriminate an individual’s sex from their hands when colour and texture cues were present, even in the absence of absolute size cues. Performances deteriorated when presentation durations were shorter, and when colour and texture cues were removed. Nonetheless, the data were always consistent with observers making discriminations at levels not equal to chance. In particular, as stimulus ambiguity increased, sex discrimination performances diverged such that correct sex discriminations of female hands were fewer than for male hands. That is, the data reported here show evidence of a male bias when discriminating sex from ambiguous hand cues.

Experiment 2 explored the perceptual mechanisms mediating that bias. The data show that the effect does not arise from a difference in sensitivity. Instead, it arises – at least in this case of sexually ambiguous human hands – through a combination of a conservative criterion when judging targets as female and a liberal criterion when judging targets as male. The data also show that the criterion used for judging targets as female is relatively stable, while the criterion for judging targets as male is more labile.

The significance of that result lies in its implications for sex processing models. Should the same pattern of response biases eventually be shown to manifest for sex discriminations from other cues it will be strong evidence that sex discriminations are ultimately achieved via a higher-order process that is cue-independent. That would be good evidence for a multi-dimensional sex-perception space into which all cues contribute.

Support for that possibility already exists. Johnson et al. [39] found that dimorphic cues were more often, and more quickly, categorised as “male” unless the available cues were exclusively female. That is, the performances of Johnson et al.’s observers were consistent with their applying very conservative criteria for discriminating stimuli as female and more liberal criteria for discriminating stimuli as male. Similarly, evidence exists for a male bias in auditory sex perceptions: Li et al. [54] examined listeners’ capacity to discriminate the sex of walkers from their footfalls. They reported that listeners performed the task reliably but tended to identify the most ambiguous cases as male.

Neuroimaging data are also consistent with that view. Podrebarac et al. [55] recently investigated sex discriminations using face stimuli. They found evidence that the left Fusiform Face Area (FFAl) preferentially changed its activity in response to sex repetitions but not identity repetitions. The FFA has previously been implicated in sex discriminations [56], [57], and it is likely that the FFA, probably on the left side, contains sex-tuned neurons. In interpreting their data, Podrebarac et al. focussed on the complexity of sex discriminations and speculated that whilst FFA activation is necessary for sex discriminations, categorical sex judgements also recruit higher-level structures elsewhere in the brain. Interestingly, there is strong evidence that face, body, and hand cues all mediate activity in the posterior Superior Temporal Sulcus (STSp) [58]–[66]. The STSp has also been implicated in the integration of biological visual and auditory cues [67], and is part of a larger network thought to mediate social perceptions [68], [69]. There is even evidence that the STSp is involved in judging sex from faces [70]. If sex perceptions arise from processes involving both the FFAl and the STSp, and no doubt other locations, it does seem likely that such perceptions are multi-dimensional. Todorov and colleagues [71], [72] have shown that in such spaces some dimensions are orthogonal to each other and some are not. Those relationships are yet to be mapped in the proposed sex-perception space.

One limitation of the data presented here is a lack of power to detect observer sex effects, if they exist. There are reasons to expect that observer sex differences might be found in hand sex discriminations and in sensitivity to specific hand sex cues. A number of studies now report reliable structural [73] and functional [74] differences between female and male brains, and those differences carry over to performance differences in perceptual classification tasks [75]. It may be that such differences, if they exist, will be revealed as the specific sex cues mediating the effects reported here are unpacked. If so, it seems likely those effects will reflect a stimulus sex × observer sex interaction, with observers of each sex using different features to make their discriminations [76], [77].

Conclusions

In summary, the data from Experiments 1 and 2 suggest three key findings. The first is that hands are a useful sex cue, even when degraded. The second is that the male bias appears to be a ‘real’ effect, manifesting across different stimulus types whenever normally sexually dimorphic cues are ambiguous. The third is that the male bias arises, at least for hands, from criterion differences between female and male judgements: Observers apply a conservative criterion when judging cues as signalling female, and a liberal criterion when judging cues as signalling male. Together with earlier data, these findings begin to build a picture of a multi-dimensional sex-perception space.

Supporting Information

Dataset S1.

Sex classification error rates and summary of analyses (Experiment 1).

doi:10.1371/journal.pone.0091032.s001

(XLSX)

Dataset S2.

Sex classification sensitivity and bias scores and summary of analyses (Experiment 2). The dataset contains two worksheets: Click the ‘Worksheet A’ tab to view a summary of the sensitivity data, and click the ‘Worksheet B’ tab to view a summary of the bias data.

doi:10.1371/journal.pone.0091032.s002

(XLSX)

Acknowledgments

The authors wish to thank Dr. Steve Provost for his comments on statistical methodology.

Author Contributions

Conceived and designed the experiments: JG RvdZ AB. Performed the experiments: JG. Analyzed the data: JG. Contributed reagents/materials/analysis tools: DB. Wrote the paper: JG RvdZ AB.

References

  1. Stangor C, Lynch L, Duan C, Glass B (1992) Categorization of individuals on the basis of multiple social features. J Pers Soc Psychol 62: 207–218. doi: 10.1037//0022-3514.62.2.207
  2. Brooks A, Schouten B, Troje NF, Verfaillie K, Blanke O, et al. (2008) Correlated changes in perceptions of the gender and orientation of ambiguous biological motion figures. Curr Biol 18: R728–R729. doi: 10.1016/j.cub.2008.06.054
  3. Maner JK, Gailliot MT, Rouby DA, Miller SL (2007) Can’t take my eyes off you: Attentional adhesion to mates and rivals. J Pers Soc Psychol 93: 389–401. doi: 10.1037/0022-3514.93.3.389
  4. Plant EA, Goplen J, Kunstman JW (2011) Selective responses to threat: The roles of race and gender in decisions to shoot. Pers Soc Psychol Bull 37: 1274–1281. doi: 10.1177/0146167211408617
  5. Navarrete CD, Olsson A, Ho AK, Mendes WB, Thomsen L, et al. (2009) Fear extinction to an out-group face: The role of target gender. Psychol Sci 20: 155–158. doi: 10.1111/j.1467-9280.2009.02273.x
  6. Millar JA, Accioly JM (1996) Measurement of blood pressure may be affected by an interaction between subject and observer based on gender. J Hum Hypertens 10: 449–453.
  7. Toups MA, Kitchen A, Light JE, Reed DL (2011) Origin of clothing lice indicates early clothing use by anatomically modern humans in Africa. Mol Biol Evol 28: 29–32. doi: 10.1093/molbev/msq234
  8. Bigoni L, Velemínská J, Brůžek J (2010) Three-dimensional geometric morphometric analysis of cranio-facial sexual dimorphism in a Central European sample of known sex. Homo 61: 16–32. doi: 10.1016/j.jchb.2009.09.004
  9. Burton AM, Bruce V, Dench N (1993) What’s the difference between men and women? Evidence from facial measurement. Perception 22: 153–176. doi: 10.1068/p220153
  10. Ferrario VF, Sforza C, Miani A, Tartaglia G (1993) Craniofacial morphometry by photographic evaluations. Am J Orthod Dentofacial Orthop 103: 327–337. doi: 10.1016/0889-5406(93)70013-e
  11. Velemínská J, Bigoni L, Krajicek V, Borsky J, Smahelova D, et al. (2012) Surface facial modelling and allometry in relation to sexual dimorphism. Homo 63: 81–93. doi: 10.1016/j.jchb.2012.02.002
  12. Walker PL (2008) Sexing skulls using discriminant function analysis of visually assessed traits. Am J Phys Anthropol 136: 39–50. doi: 10.1002/ajpa.20776
  13. Bachorowski JA, Owren MJ (1999) Acoustic correlates of talker sex and individual talker identity are present in a short vowel segment produced in running speech. J Acoust Soc Am 106: 1054–1063. doi: 10.1121/1.427115
  14. Penn DJ, Oberzaucher E, Grammer K, Fischer G, Soini HA, et al. (2007) Individual and gender fingerprints in human body odour. J R Soc Interface 4: 331–340. doi: 10.1098/rsif.2006.0182
  15. Barclay CD, Cutting JE, Kozlowski LT (1978) Temporal and spatial factors in gait perception that influence gender recognition. Percept Psychophys 23: 145–152. doi: 10.3758/bf03208295
  16. Bruce V, Burton AM, Hanna E, Healey P, Mason O, et al. (1993) Sex discrimination: How do we tell the difference between male and female faces? Perception 22: 131–152. doi: 10.1068/p220131
  17. Davidenko N (2007) Silhouetted face profiles: A new methodology for face perception research. J Vis 7: 1–17. doi: 10.1167/7.4.6
  18. Troje NF, Sadr J, Geyer H, Nakayama K (2006) Adaptation aftereffects in the perception of gender from biological motion. J Vis 6: 850–857. doi: 10.1167/6.8.7
  19. Yamaguchi MK, Hirukawa T, Kanazawa S (1995) Judgment of gender through facial parts. Perception 24: 563–575. doi: 10.1068/p240563
  20. Freeman JB, Ambady N (2011b) When two become one: Temporally dynamic integration of the face and voice. J Exp Soc Psychol 47: 259–263. doi: 10.1016/j.jesp.2010.08.018
  21. Hacker G, Brooks A, van der Zwan R (2013) Sex discriminations made on the basis of ambiguous visual cues can be affected by the presence of male sweat. BMC Psychology 1. doi: 10.1186/2050-7283-1-10
  22. Kovács G, Gulyás B, Savic I, Perrett DI, Cornwell RE, et al. (2004) Smelling human sex hormone-like compounds affects face gender judgment of men. Neuroreport 15: 1275–1277. doi: 10.1097/01.wnr.0000130234.51411.0e
  23. van der Zwan R, MacHatch C, Kozlowski D, Troje N, Blanke O, et al. (2009) Gender bending: auditory cues affect visual judgements of gender in biological motion displays. Exp Brain Res 198: 373–382. doi: 10.1007/s00221-009-1800-y
  24. Lippa R (1983) Sex typing and the perception of body outlines. J Pers 51: 667–682. doi: 10.1111/j.1467-6494.1983.tb00873.x
  25. Thompson SK, Bentler PM (1971) The priority of cues in sex discrimination by children and adults. Dev Psychol 5: 181–185. doi: 10.1037/h0031427
  26. Kozlowski LT, Cutting JE (1977) Recognising the sex of a walker from a dynamic point-light display. Percept Psychophys 21: 575–580. doi: 10.3758/bf03198740
  27. Mather G, Murdoch L (1994) Gender discrimination in biological motion displays. Proc R Soc B 258: 273–279. doi: 10.1098/rspb.1994.0173
  28. Troje NF (2002) Decomposing biological motion: A framework for analysis and synthesis of human gait patterns. J Vis 2: 371–387. doi: 10.1167/2.5.2
  29. Troje NF, Szabo S (2006) Why is the average walker male? J Vis 6. doi: 10.1167/6.6.1034
  30. Davidenko N, Witthoft N, Winawer J (2008) Gender aftereffects in face silhouettes reveal face-specific mechanisms. Vis Cogn 16: 99–103. doi: 10.1167/7.9.883
  31. Little AC, DeBruine LM, Jones BC (2005) Sex-contingent face after-effects suggest distinct neural populations code male and female faces. Proc R Soc B 272: 2283–2287. doi: 10.1098/rspb.2005.3220
  32. Kovács G, Zimmer M, Banko E, Harza I, Antal A, et al. (2006) Electrophysiological correlates of visual adaptation to faces and body parts in humans. Cereb Cortex 16: 742–753. doi: 10.1093/cercor/bhj020
  33. Jordan H, Fallah M, Stoner GR (2006) Adaptation of gender derived from biological motion. Nat Neurosci 9: 738–739. doi: 10.1038/nn1710
  34. Webster MA, Kaping D, Mizokami Y, Duhamel P (2004) Adaptation to natural facial categories. Nature 428: 557–561. doi: 10.1038/nature02420
  35. Davidenko N, Winawer J, Witthoft N (2006) Gender aftereffects in the perception of silhouetted face profiles. J Vis 6: 1068. doi: 10.1167/6.6.1068
  36. DeBruine LM, Welling LLM, Jones BC, Little AC (2010) Opposite effects of visual versus imagined presentation of faces on subsequent sex perception. Vis Cogn 18: 816–828. doi: 10.1080/13506281003691357
  37. Wild HA, Barrett SE, Spence MJ, O’Toole AJ, Cheng YD, et al. (2000) Recognition and sex categorization of adults’ and children’s faces: Examining performance in the absence of sex-stereotyped cues. J Exp Child Psychol 77: 269–291. doi: 10.1006/jecp.1999.2554
  38. Armann R, Bülthoff I (2012) Male and female faces are only perceived categorically when linked to familiar identities - And when in doubt, he is a male. Vision Res 63: 69–80. doi: 10.1016/j.visres.2012.05.005
  39. Johnson KL, Iida M, Tassinary LG (2012) Person (mis)perception: Functionally biased sex categorization of bodies. Proc R Soc B 279: 4982–4989. doi: 10.1098/rspb.2012.2060
  40. Nagy E, Nemeth E, Molnar P (2000) From unidentified to ‘misidentified’ newborn: Male bias in recognition of sex. Percept Mot Skills 90: 102–104. doi: 10.2466/pms.2000.90.1.102
  41. Hildebrandt KA, Fitzgerald HE (1977) Gender bias in observers’ perceptions of infants’ sex: It’s a boy most of the time. Percept Mot Skills 45: 472–474. doi: 10.2466/pms.1977.45.2.472
  42. Cellerino A, Borghetti D, Sartucci F (2004) Sex differences in face gender recognition in humans. Brain Res Bull 63: 443–449. doi: 10.1016/j.brainresbull.2004.03.010
  43. Leopold DA, O’Toole AJ, Vetter T, Blanz V (2001) Prototype-referenced shape encoding revealed by high-level aftereffects. Nat Neurosci 4: 89.
  44. Valentine T (1991) A unified account of the effects of distinctiveness, inversion, and race in face recognition. Q J Exp Psychol A 43: 161–204. doi: 10.1080/14640749108400966
  45. Rhodes G, Jaquet E (2011) Aftereffects reveal that adaptive face-coding mechanisms are selective for race and sex. In: Adams RB, Jr, Ambady N, Nakayama K, Shimojo S, editors. The Science of Social Vision. New York, NY: Oxford University Press. pp. 347–362.
  46. Thompson P, Burr D (2009) Visual aftereffects. Curr Biol 19: R11–R14. doi: 10.1016/j.cub.2008.10.014
  47. Javadi AH, Wee N (2012) Cross-category adaptation: Objects produce gender adaptation in the perception of faces. PLOS ONE 7: e46079. doi: 10.1371/journal.pone.0046079
  48. Ghuman AS (2010) Face adaptation without a face. Curr Biol 20: 32–36. doi: 10.1016/j.cub.2009.10.077
  49. Winer BJ (1962) Statistical principles in experimental design. New York, NY: McGraw-Hill. 672 p.
  50. Stanislaw H, Todorov N (1999) Calculation of signal detection theory measures. Behav Res Methods Instrum Comput 31: 137–149. doi: 10.3758/bf03207704
  51. Napier J (1993) Structure of the hand. In: Tuttle RH, editor. Hands. Princeton, NJ: Princeton University Press. pp. 13–54.
  52. Leinbach MD, Fagot BI (1993) Categorical habituation to male and female faces: Gender schematic processing in infancy. Infant Behav Dev 16: 317–332. doi: 10.1016/0163-6383(93)80038-a
  53. Agnihotri AK, Purwar B, Jeebun N, Agnihotri S (2006) Determination of sex by hand dimensions. Internet J Forensic Sci 1: 1–1. doi: 10.1016/j.jflm.2008.03.002
  54. Li X, Logan RJ, Pastore RE (1991) Perception of acoustic source characteristics: Walking sounds. J Acoust Soc Am 90: 3036–3049. doi: 10.1121/1.401778
  55. Podrebarac SK, Goodale MA, van der Zwan R, Snow JC (2013) Gender-selective neural populations: Evidence from event-related fMRI repetition suppression. Exp Brain Res 226: 241–252. doi: 10.1007/s00221-013-3429-0
  56. Freeman JB, Rule NO, Adams RB, Ambady N (2010) The neural basis of categorical face perception: Graded representations of face gender in fusiform and orbitofrontal cortices. Cereb Cortex 20: 1314–1322. doi: 10.1093/cercor/bhp195
  57. Ng M, Ciaramitaro VM, Anstis S, Boynton GM, Fine I (2006) Selectivity for the configural cues that identify the gender, ethnicity, and identity of faces in human cortex. Proc Natl Acad Sci U S A 103: 19552–19557. doi: 10.1073/pnas.0605358104
  58. Thompson JC, Hardee JE, Panayiotou A, Crewther D, Puce A (2007) Common and distinct brain activation to viewing dynamic sequences of face and hand movements. Neuroimage 37: 966–973. doi: 10.1016/j.neuroimage.2007.05.058
  59. Penton LG, Fernandez AP, Leon MAB, Ymas YA, Garcia LG, et al. (2010) Neural activation while perceiving biological motion in dynamic facial expressions and point-light body action animations. Neural Regeneration Research 5: 1076–1083.
  60. Chao LL, Martin A, Haxby JV (1999) Are face-responsive regions selective only for faces? Neuroreport 10: 2945–2950. doi: 10.1097/00001756-199909290-00013
  61. Morris JP, Pelphrey KA, McCarthy G (2006) Occipitotemporal activation evoked by the perception of human bodies is modulated by the presence or absence of the face. Neuropsychologia 44: 1919–1927. doi: 10.1016/j.neuropsychologia.2006.01.035
  62. Pelphrey KA, Morris JP, Michelich CR, Allison T, McCarthy G (2005) Functional anatomy of biological motion perception in posterior temporal cortex: An fMRI study of eye, mouth and hand movements. Cereb Cortex 15: 1866–1876. doi: 10.1093/cercor/bhi064
  63. Allison T, Puce A, McCarthy G (2000) Social perception from visual cues: role of the STS region. Trends Cogn Sci 4: 267–278. doi: 10.1016/s1364-6613(00)01501-1
  64. Grèzes J, Costes N, Decety J (1999) The effects of learning and intention on the neural network involved in the perception of meaningless actions. Brain 122: 1875–1887. doi: 10.1093/brain/122.10.1875
  65. Jellema T, Baker C, Perrett D, Wicker B (2000) Neural representation for the perception of the intentionality of hand actions. Int J Psychol 35: 205–205. doi: 10.1006/brcg.2000.1231
  66. Bonda E, Petrides M, Ostry D, Evans A (1996) Specific involvement of human parietal systems and the amygdala in the perception of biological motion. J Neurosci 16: 3737–3744.
  67. Bidet-Caulet A, Voisin J, Bertrand O, Fonlupt P (2005) Listening to a walking human activates the temporal biological motion area. Neuroimage 28: 132–139. doi: 10.1016/j.neuroimage.2005.06.018
  68. Puce A, Perrett D (2003) Electrophysiology and brain imaging of biological motion. Phil Trans R Soc B 358: 435–445. doi: 10.1098/rstb.2002.1221
  69. Atkinson AP, Adolphs R (2011) The neuropsychology of face perception: beyond simple dissociations and functional selectivity. Phil Trans R Soc B 366: 1726–1738. doi: 10.1098/rstb.2010.0349
  70. Dzhelyova MP, Ellison A, Atkinson AP (2011) Event-related repetitive TMS reveals distinct, critical roles for right OFA and bilateral posterior STS in judging the sex and trustworthiness of faces. J Cogn Neurosci 23: 2782–2796. doi: 10.1162/jocn.2011.21604
  71. Todorov A, Mende-Siedlecki P, Dotsch R (2013) Social judgments from faces. Curr Opin Neurobiol 23: 373–380. doi: 10.1016/j.conb.2012.12.010
  72. Todorov A, Said CP, Engell AD, Oosterhof NN (2008) Understanding evaluation of faces on social dimensions. Trends Cogn Sci 12: 455–460. doi: 10.1016/j.tics.2008.10.001
  73. Wang L, Shen H, Tang F, Zang Y, Hu D (2012) Combined structural and resting-state functional MRI analysis of sexual dimorphism in the young adult human brain: An MVPA approach. Neuroimage 61: 931–940. doi: 10.1016/j.neuroimage.2012.03.080
  74. Canli T, Desmond JE, Zhao Z, Gabrieli JDE (2002) Sex differences in the neural basis of emotional memories. Proc Natl Acad Sci U S A 99: 10789–10794. doi: 10.1073/pnas.162356599
  75. Schouten B, Troje NF, Brooks A, van der Zwan R, Verfaillie K (2010) The facing bias in biological motion perception: Effects of stimulus gender and observer sex. Atten Percept Psychophys 75: 1256–1260. doi: 10.3758/app.72.5.1256
  76. Hewig J, Trippe RH, Hecht H, Straube T, Miltner WHR (2008) Gender differences for specific body regions when looking at men and women. J Nonverbal Behav 32: 67–78. doi: 10.1007/s10919-007-0043-5
  77. Johnson KL, Tassinary LG (2005) Perceiving sex directly and indirectly: Meaning in motion and morphology. Psychol Sci 16: 890–897. doi: 10.1111/j.1467-9280.2005.01633.x