Neural Responses to Truth Telling and Risk Propensity under Asymmetric Information

  • Hideo Suzuki,

    Affiliation Laureate Institute for Brain Research, Tulsa, OK, United States of America

  • Masaya Misaki,

    Affiliation Laureate Institute for Brain Research, Tulsa, OK, United States of America

  • Frank Krueger,

    Affiliation Molecular Neuroscience Department, George Mason University, Fairfax, VA, United States of America

  • Jerzy Bodurka

    jbodurka@laureateinstitute.org

    Affiliations Laureate Institute for Brain Research, Tulsa, OK, United States of America, College of Engineering, University of Oklahoma, Norman, OK, United States of America

Abstract

Trust is multi-dimensional because it can be characterized by subjective trust, trust antecedents, and behavioral trust. Previous research has investigated functional brain responses to subjective trust (e.g., a judgment of trustworthiness) or behavioral trust (e.g., decisions to trust) under perfect information, where all relevant information is available to all participants. In contrast, we conducted a novel examination of the patterns of functional brain activity associated with a trust antecedent, specifically truth telling, under asymmetric information, where one individual has more information than others, with the effect of varying risk propensity. We used functional magnetic resonance imaging (fMRI) and recruited 13 adults, who played the Communication Game, in which they served as the “Sender” and chose either truth telling (true advice) or lie telling (false advice) regarding the best payment allocation for their partner. Our behavioral results revealed that subjects with high recreational risk propensity tended to choose true advice. Moreover, the fMRI results revealed that the choices of true advice were associated with increased cortical activation in the anterior rostral medial and frontopolar prefrontal cortices, middle frontal cortex, temporoparietal junction, and precuneus. Furthermore, when we specifically evaluated the bilateral amygdala as a region of interest (ROI), decreased amygdala response was associated with high risk propensity, regardless of truth telling or lying. In conclusion, our results have implications for how differential functions of these cortical areas may contribute to the neural processing of truth telling.

Introduction

Trust is characterized by three constructs: subjective trust, behavioral trust, and trust antecedents [1]. Subjective trust is an internal state of cognitive and social processing of trust (e.g., a perception/evaluation of others as trustworthy or not), which results from a trust antecedent (a psychological precursor leading to trust) and gives rise to behavioral trust (an overt action reflecting trust). The interplay of these constructs directly or indirectly optimizes group performance [2–7] and helps build social capital in practical settings, such as treatment success in psychotherapy [8, 9]; it is therefore important to study subjective trust, behavioral trust, and trust antecedents.

Subjective trust may be determined by social and cognitive processing in specific brain areas. For example, the amygdala is involved in trustworthiness judgments of neutral faces [10–15]. The anterior insular cortex and dorsal anterior cingulate cortex, contributing to the salience network [16, 17], may also support our ability to perceive salient, affective tones in communication and to form preconceptions of trustworthiness. Furthermore, the dorsolateral prefrontal cortex and posterior cingulate cortex, involved in the central-executive network [16], may alternatively serve to scrutinize trustworthiness.

Behavioral trust (e.g., decisions to trust) has also been examined in the context of economic exchange paradigms. In investment games [18], subjects show increased functional activation in the caudate head during repayments to a benevolent investor relative to a malevolent investor [19] and during monetary gain following their investment [20]. Moreover, damage to the ventromedial prefrontal cortex [21] and amygdala [22, 23] is associated with a benevolent type of investor strategy (i.e., investing more) toward the partner, suggesting important functions of the ventromedial prefrontal cortex and amygdala in social vigilance and prospection in decisions to trust. Furthermore, in voluntary trust games, the decision to trust as the first move is related to increased activation in the paracingulate cortex/anterior rostral medial prefrontal cortex (armPFC) and septal area [24]. Further analysis of the same data shows that the armPFC and anterior insular cortex are involved in shared neurocircuitry of decisions to trust and reciprocate, while the frontopolar cortex (fpPFC) and temporoparietal junction (TPJ) are exclusively associated with decisions to initiate trust [25]. These findings indicate that behavioral trust may be associated with functions of the caudate, amygdala, and several cortical areas.

While the previous literature has addressed patterns of brain activity concerning subjective trust and behavioral trust, to our knowledge, few studies have focused on neural antecedents of trust, especially under asymmetric information. Asymmetric information refers to economic/social communication in which one participant has better information than the others [26–28], as opposed to perfect information, where relevant information is equally available to all participants. The most important trust antecedent under asymmetric information is truth telling initiated by those who have better information [26]. For example, in a psychotherapeutic interaction, a mental health practitioner, who is knowledgeable about the diagnosis and treatment of mental illness, communicates with a patient, who needs professional advice and help. If the patient realizes that the practitioner tells the truth and is convinced of the appropriateness of the practitioner’s treatment, the patient will be willing to follow the practitioner’s advice and stay in treatment for a sufficiently long time, which builds therapeutic alliance and promotes the efficiency of interpersonal therapeutic treatment [8, 9, 29–31] and medication treatment [32, 33]. In this way, the well-informed participant’s truth telling under asymmetric information is a crucial precursor to trust (i.e., a trust antecedent), and the current study aimed to examine neural biomarkers for truth telling. Notably, while previous studies have described functional brain activity related to decisions to trust [24, 25], which may resemble brain activity in a less-informed participant who must decide whether or not to trust under asymmetric information, few studies have addressed brain function during a well-informed participant’s truth telling over lying. Although one relevant study has shown that cognitive functions of the anterior cingulate cortex, dorsolateral prefrontal cortex, and ventromedial prefrontal cortex are involved in lying in an impersonal context [34], functional brain activity during truth telling under asymmetric information has not been well investigated.

In addition to truth telling, risk propensity might also serve as a trust antecedent under asymmetric information, because risk propensity is essential for determining whether a well-informed participant overcomes the potential risks of being exploited [35] or betrayed [36] after telling the truth. Therefore, the present study also focused on functional brain activity in relation to risk propensity.

The aims of our study were thus to examine neuro-biomarkers for the interaction between truth telling and risk propensity under asymmetric information, which would be important for advancing our understanding of how interpersonal trust is formed. To manipulate asymmetric information, we used the Communication Game, a paradigm designed to assess truth-telling behavior under asymmetric information [26, 37, 38]. In the Communication Game, one player, called the Sender, is informed of three payoff options (one leading to a high payoff for oneself but a medium payoff for the partner, another to a medium payoff for oneself but a high payoff for the partner, and the other to low payoffs for both) and decides to tell the partner either the truth (“true advice”) or a lie (“false advice”) regarding the highest payoff for the partner. For analyzing neuro-biomarkers, we used functional magnetic resonance imaging (fMRI) and performed a whole-brain analysis. In addition to the whole-brain analysis, we specifically focused on amygdala activity because the amygdala might be active in both trust-related behavior and risk-taking behavior. That is, amygdala function contributes to the social processing of evaluating the trustworthiness of faces in healthy subjects [10–15], and amygdala activity in response to ratings of trustworthiness is reduced in patients with schizophrenia and autistic spectrum disorder [39, 40], who often show difficulty in social judgment, including judgments of trustworthiness [41–43]. Although the social evaluation of trustworthiness is not equivalent to truth telling, it would be intriguing to examine whether amygdala function extends to the psychological processing of truth telling. On the other hand, the amygdala is also associated with vigilance [12, 44, 45] and risk-taking behavior [46, 47]. For instance, patients with substance abuse [48, 49] show reduced amygdala activity during risky decision-making compared with healthy controls, suggesting that their amygdala may be insensitive to risky decision making and thereby encourage risk-taking behavior. Because of these possible relations of truth telling and risk propensity with amygdala function, the present study included the amygdala in a region-of-interest (ROI) analysis.

We hypothesized that (1) a Sender with high risk propensity would tell the truth more often than one with low risk propensity, because high risk-takers tend to behave in trusting manners [1, 50], and (2) the Sender would show functional brain changes, especially in the amygdala, during the choices of true advice (relative to false advice) under asymmetric information, with this relationship depending on the degree of risk propensity.

Methods

Participants

The study was conducted at the Laureate Institute for Brain Research. The research protocol was approved by the Western Institutional Review Board, and the human research was conducted according to the principles expressed in the Declaration of Helsinki. Fourteen healthy adults were recruited from the Tulsa metro area. Study subjects gave written informed consent to participate and received financial compensation after the study. None of them showed any clinically significant physical illness, a history of traumatic brain injury, severe vision/hearing loss, or any Axis I psychiatric disorder based on the Structured Clinical Interview for DSM-IV-TR Axis I Disorders (SCID-I/NP) [51]. One subject was excluded because of a technical problem during scanning. As a result, a total of N = 13 subjects (8 female), aged 21 to 31, were included in the data analyses. Table 1 shows background characteristics of the sample.

Table 1. Background Characteristics of the Sample (N = 13).

https://doi.org/10.1371/journal.pone.0137014.t001

Risk propensity

The Risk Taking Inventory (RTI) was used to assess the frequency of risk-taking behavior in everyday life on a five-point scale (1 = never; 5 = very often) [52]. The RTI is a self-report questionnaire measuring six dimensions of both current and past risk-taking behavior during adulthood. These dimensions include recreational risks (e.g., rock-climbing, scuba diving), health risks (e.g., smoking, poor diet, high alcohol consumption), career risks (e.g., quitting a job without another to go to), financial risks (e.g., gambling, risky investments), safety risks (e.g., fast driving, city cycling without a helmet), and social risks (e.g., standing for election, publicly challenging a rule or decision). The maximum total score and subscale score are 60 and 10, respectively, and a higher score indicates higher risk propensity.
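As a concrete illustration of this scoring scheme, the short sketch below computes the total and subscale scores from item ratings. The item layout (one current and one past rating per dimension) follows the description above, but the variable and function names are ours, not those of the published instrument.

```python
# Minimal sketch of RTI scoring as described above; item layout and names are
# illustrative assumptions, not the published instrument's exact item set.
from typing import Dict

DIMENSIONS = ["recreational", "health", "career", "financial", "safety", "social"]

def score_rti(current: Dict[str, int], past: Dict[str, int]) -> Dict[str, int]:
    """current/past map each dimension to a 1-5 rating (1 = never, 5 = very often)."""
    subscores = {d: current[d] + past[d] for d in DIMENSIONS}  # max 10 per dimension
    total = sum(subscores.values())                            # max 60 overall
    return {"total": total, **subscores}

# Example respondent with relatively high recreational risk taking
current = {"recreational": 4, "health": 2, "career": 1, "financial": 2, "safety": 3, "social": 2}
past = {"recreational": 3, "health": 2, "career": 1, "financial": 1, "safety": 2, "social": 2}
scores = score_rti(current, past)
print(scores["total"], scores["recreational"])
```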

Communication Game

As the fMRI task, the Communication Game was used to create sequential situations in which individuals faced conflicts between their own financial gain and their partner’s gain under asymmetric information [26, 37, 38]. The Communication Game was programmed using the Willow experimental economics software framework (George Mason University, Fairfax, VA). In this game, two players interacted with each other to determine the payoff to each player. In the present study, although subjects were informed that they would interact with either a human or a computer-programmed player, the partner was in fact played by a computer program on all trials; the partner was programmed to choose true/false advice or to follow/disregard advice with varying probabilities, depending on the subject’s response in the previous trial (for details about the probabilities, see S1 Table). After the MRI study, all subjects were debriefed and informed that they had in fact interacted with a computer-programmed player. Fig 1A illustrates the flow of each Communication Game trial. Each subject played the role of either the Sender or the Receiver; when the subject was assigned to one of the roles, the partner (the computer program) was automatically assigned to the other role. These roles were switched between the two players across trials.

Fig 1. Experimental design.

(A) Timeline for a single Communication Game. A subject was assigned to either the Sender or Receiver. The Sender viewed three payment pairs and was instructed to choose true advice (the pair allocating the most money to the Receiver) or false advice (the pairs allocating less money to the Receiver) for the Receiver. In contrast, the Receiver could not view the payment pairs. Instead, the Receiver was instructed to choose one pair based on the Sender’s advice. The Receiver’s choice determined the final allocation of money to each player. Finally, both the Sender and Receiver could view information about whether their partner was trustworthy or trusting. (B) An example of three payment pairs which the Sender might view during a Communication Game trial. In this example, the Sender (S) delivered true advice if she/he chose option B; the Sender delivered false advice if she/he chose options A or C. Then, the Sender’s choice of advice was presented to the Receiver (R), and the Receiver did follow (“F”) or did not follow (“NF”) the advice. The Receiver’s choice determined how much S and R gained.

https://doi.org/10.1371/journal.pone.0137014.g001

When subjects were designated as the Sender, they were presented with three payoff pairs, each of which indicated how much would be allocated to each player (see Fig 1B for an example). These pairs included (A) $0.20 given to oneself and $0.15 given to the partner, (B) $0.15 given to oneself and $0.25 given to the partner, and (C) $0.10 given to both oneself and the partner, although the pairs were shuffled every trial (e.g., the pair of $0.20 and $0.15 was assigned to option A on some trials and to option B or C on others). The Sender was instructed to choose the pair allocating the most money to the partner (i.e., the Receiver) within 6 sec (e.g., option B in Fig 1B). However, the Sender was also instructed that she/he could try to deceive the partner to gain more money for her-/himself (e.g., option A in Fig 1B) or to make an even allocation (e.g., option C in Fig 1B). That is, in such a conflict situation, the Sender could deliver true advice (i.e., telling the truth) or false advice (i.e., telling a lie) to the partner. Note that the pair of $0.10 and $0.10 was included because it rules out the possibility that the Sender tactically chose true advice, anticipating that the partner would not follow the advice, in order to gain her/his own benefit rather than choosing true advice out of honesty toward the partner [26].
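To make the Sender’s choice set concrete, the following sketch reproduces the payoff structure described above and classifies an advised option as true or false advice. It is an illustration of the trial logic only, not the Willow implementation used in the study.

```python
# Illustrative sketch of one Communication Game trial from the Sender's side,
# using the payoff values described above; option labels are shuffled per trial.
import random
from typing import Dict, NamedTuple

class Payoff(NamedTuple):
    sender: float
    receiver: float

def make_trial() -> Dict[str, Payoff]:
    pairs = [Payoff(0.20, 0.15), Payoff(0.15, 0.25), Payoff(0.10, 0.10)]
    random.shuffle(pairs)                      # labels A/B/C change across trials
    return dict(zip("ABC", pairs))

def is_true_advice(options: Dict[str, Payoff], advised: str) -> bool:
    # True advice points the Receiver to the pair paying the Receiver the most.
    best = max(options, key=lambda k: options[k].receiver)
    return advised == best

trial = make_trial()
print(trial, is_true_advice(trial, "B"))
```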

In contrast, when subjects were designated as the Receiver, they waited for 6 sec while the partner (i.e., the Sender) chose true or false advice. The Receiver was then presented with a message showing which payoff was advised by the partner. For example, if the partner selected option B, the Receiver viewed the message, “You receive the most money with B.” Note that the information about the payoff pairs was not disclosed to the Receiver. In this way, the Sender had more information (i.e., the available payoff pairs) than the Receiver, creating information asymmetry between the two players. The Receiver had to choose one option based only on the partner’s advice within 6 sec. If the Receiver trusted the partner, the Receiver was expected to choose the option according to the advice; otherwise, the Receiver would choose an option against the advice.

Importantly, the Receiver’s choice determined the final payoff. Then, both the Sender and Receiver reviewed their partner’s choice behavior. That is, the Sender was informed whether or not the Receiver followed her/his advice. On the other hand, the Receiver was informed whether she/he gained the most money. When the next trial started, a subject’s role (either Sender or Receiver) was determined randomly, and the payoff pairs were shuffled across options A, B, and C.

Although we collected data of subjects’ responses as both the Sender and Receiver, the present study focused on only subjects’ responses as the Sender. This was because the present study specifically aimed to examine functional brain responses to the choices of true advice/truth telling as a trust antecedent in asymmetric information.

Procedure and MRI data acquisition

Prior to the scans, subjects were asked to complete the RTI and were then given instructions on how to play the Communication Game with their partner. They were also told that they could earn money based on their task performance, although all subjects ultimately received $20.00 regardless of their actual performance. Some subjects met a confederate face-to-face to make them believe that a potential partner existed, whereas other subjects did not meet anyone. Although our analyses pooled these subjects, a previous study revealed that functional brain responses during economic exchange games are not affected by a prior personal encounter with a partner [53].

Neuroimaging data were acquired on a Discovery MR750 3 Tesla whole-body MRI scanner (General Electric Healthcare Technologies, Waukesha, WI) equipped with a 32-channel MRI receiver capable of conducting real-time fMRI with parallel imaging such as Sensitivity Encoding (SENSE) [54]. An 8-element receive-only head coil array was used for fMRI signal reception. During each fMRI scan, cardiac and respiratory physiological waveforms were simultaneously acquired (40 Hz sampling rate). The cardiac waveform was measured with a photoplethysmograph pad with an infra-red emitter (pulse oximetry) on the subject’s left index finger; the respiration waveform was measured with a pneumatic respiration belt. The MRI session involved a localizer scan for prescribing the subsequent anatomical and functional scans, a 5-minute anatomical scan for localizing and aligning functional scans, a 7.5-minute resting-state functional scan, and four 9-minute Communication-Game-related functional scans. For the anatomical scan, one three-dimensional T1-weighted magnetization prepared rapid gradient echo (MPRAGE) scan with SENSE (TR/TE = 5/1.92 ms, inversion/delay time TI/TD = 725/1400 ms, flip angle = 8°, FOV = 240 mm, axial slices per slab = 128, image matrix = 256 × 256, voxel volume = 0.94 × 0.94 × 1.20 mm3, acceleration factor R = 2, sampling bandwidth = 31.3 kHz) was acquired in the axial plane. For the functional scans, blood-oxygen-level-dependent (BOLD) images were acquired with a T2*-weighted single-shot gradient-recalled echo-planar imaging (EPI) sequence (TR/TE = 2000/25 ms, flip angle = 78°, FOV = 240 mm, acquisition matrix = 96 × 96 reconstructed to an image matrix of 128 × 128, voxel volume = 1.875 × 1.875 × 2.900 mm3, axial slices per volume = 34, number of volumes = 263, SENSE acceleration factor R = 2 in the phase encoding (anterior-posterior) direction, sampling bandwidth = 250 kHz).

Each Communication Game functional scan consisted of 24 trials; subjects were assigned to the Sender role in 12 trials and the Receiver role in the other 12 trials. Since our focus was on functional brain responses to the choices of true advice, our data analyses used only imaging data acquired during Communication Game trials in which subjects acted as the Sender.

The total amount of time for MRI scans was less than 2 hours. During all functional scans, simultaneous electroencephalography (EEG) recordings were additionally performed using a 32-channel MR-compatible EEG system (Brain Products GmbH), although EEG data were not used in the present study.

fMRI data preprocessing

Analysis of Functional NeuroImages (AFNI) [55] was used for fMRI data analyses. The first four volumes of each scan were excluded from the data analysis to avoid T1 equilibration effects. Physiological noise correction was conducted to suppress cardiorespiratory signal modulations by using the cardiac and respiratory waveforms recorded during the scans with the RETROICOR [56] implementation in AFNI. Slice timing correction and volume registration to the first volume were then applied. The EPI images were spatially transformed to the Talairach and Tournoux [57] template brain using Advanced Normalization Tools (ANTS, http://picsl.upenn.edu/software/ants/) with the Symmetric Normalization (SyN) method [58]; SyN computes a bi-directional diffeomorphic transformation that maximizes a cross-correlation metric. The normalized images were resampled to 1.875 mm isotropic voxels. Spatial smoothing was applied by convolution with a 4.0 mm full width at half maximum (FWHM) Gaussian kernel. The signal time course was scaled to percent change relative to the mean signal across time in each voxel. A general linear model (GLM) analysis was conducted to evaluate functional brain activation. The design matrix included modeled responses for the presentation of the role assignment image, the Sender’s decision-making, waiting as the Sender, the outcome message for the Sender, the Receiver’s decision-making, waiting as the Receiver, and the outcome message for the Receiver (see Fig 1A). The response models were constructed by convolving boxcar functions for each event time course with a Gamma function model of the hemodynamic response. In order to incorporate within-subject response variability into our group analysis, we estimated the trial-wise response of the Sender’s decision-making by modeling each trial response independently [59], which yielded a beta series of trial-wise responses of the Sender’s decision-making. In addition to these task-related regressors, six motion parameters (shifts in the x, y, and z directions and roll, pitch, and yaw rotations), their temporal derivatives, and 4th-order polynomial regressors for modeling low-frequency noise were included in the design matrix.
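For readers unfamiliar with the beta-series approach [59], the sketch below illustrates the idea of fitting one regressor per Sender decision epoch and reading off one beta per trial. It is a simplified conceptual example in Python, not the AFNI pipeline used here; the HRF parameters, epoch duration, and nuisance terms are placeholders.

```python
# Conceptual sketch of trial-wise (beta-series) regression; simplified and not
# the AFNI implementation used in the study. Timings follow TR = 2 s, 263 volumes.
import numpy as np

TR, N_VOLS = 2.0, 263

def gamma_hrf(t, shape=6.0, scale=0.9):
    # Simple gamma-variate hemodynamic response model (parameters are illustrative).
    h = (t / scale) ** (shape - 1) * np.exp(-t / scale)
    return h / h.max()

def trial_regressor(onset_s, dur_s=6.0):
    # Boxcar for one decision epoch, convolved with the HRF and truncated to scan length.
    box = np.zeros(N_VOLS)
    box[int(onset_s // TR):int((onset_s + dur_s) // TR)] = 1.0
    hrf = gamma_hrf(np.arange(0.0, 30.0, TR))
    return np.convolve(box, hrf)[:N_VOLS]

def beta_series(voxel_ts, trial_onsets, nuisance):
    """voxel_ts: (N_VOLS,) signal; nuisance: (N_VOLS, k) motion/drift regressors.
    Returns one beta per trial (the beta series for that voxel)."""
    X = np.column_stack([trial_regressor(t) for t in trial_onsets] + [nuisance])
    betas, *_ = np.linalg.lstsq(X, voxel_ts, rcond=None)
    return betas[:len(trial_onsets)]
```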

Statistical analysis

For the behavioral analysis (S1 Dataset), we first checked whether the number of true-advice choices was related to age (using Spearman correlation), gender, or educational level (using one-way ANOVA). Spearman’s rank correlation was then used to test the association between the total RTI score and the number of true-advice choices. In addition, Spearman correlations between each RTI subscore (i.e., recreational, health, career, financial, safety, and social risks) and the number of true-advice choices were computed, with Bonferroni-type adjustment for multiple comparisons.
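The following sketch illustrates this correlation analysis; the data frame and column names are placeholders rather than the actual S1 Dataset variables.

```python
# Minimal sketch of the behavioral correlations described above, assuming one
# row per subject; column names ("rti_total", "n_true_advice", subscales) are
# placeholders, not the S1 Dataset variable names.
import pandas as pd
from scipy.stats import spearmanr

SUBSCALES = ["recreational", "health", "career", "financial", "safety", "social"]

def behavioral_correlations(df: pd.DataFrame) -> pd.DataFrame:
    rows = []
    for col in ["rti_total"] + SUBSCALES:
        rho, p = spearmanr(df[col], df["n_true_advice"])
        # Bonferroni-type adjustment over the six subscale tests
        p_adj = min(p * len(SUBSCALES), 1.0) if col != "rti_total" else p
        rows.append({"predictor": col, "rho": rho, "p": p, "p_adjusted": p_adj})
    return pd.DataFrame(rows)
```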

For the fMRI analysis (S2 Dataset), we employed a linear mixed-effects (LME) model analysis [60] in the R statistical computing language and environment [61], treating subject variability as a random effect. The beta series of the Sender’s decision-making activation was entered as the dependent variable. The advice choice (true relative to false advice), the standardized RTI score, and their interaction were entered as fixed effects, and subject variability was entered as a random effect. The statistical parametric map was thresholded at voxel-wise p < 0.005 with a cluster-size threshold of ≥ 65 voxels (first-nearest neighbor clustering) for family-wise error correction. The cluster-size threshold was determined by Monte Carlo simulations using 3dClustSim in AFNI with a smoothness of 5.31 × 5.43 × 5.14 mm, estimated from a residual image of the LME analysis.
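Conceptually, the voxel-wise model corresponds to the sketch below, written with Python’s statsmodels for illustration rather than the R environment used in the study; the data frame and column names are placeholders.

```python
# Conceptual sketch of the voxel-wise mixed-effects model; `df` is assumed to
# hold one row per Sender trial per subject, with the trial-wise beta for a
# single voxel in "beta". Column names are placeholders.
import pandas as pd
import statsmodels.formula.api as smf

def fit_voxel_lme(df: pd.DataFrame):
    # advice: +1 = true advice, -1 = false advice; rti_z: standardized RTI score
    model = smf.mixedlm("beta ~ advice * rti_z", data=df, groups=df["subject"])
    result = model.fit()
    return result.params, result.pvalues

# In the actual analysis this model is fit at every voxel and the resulting map
# is thresholded at voxel-wise p < 0.005 with a cluster-size threshold
# (>= 65 voxels, from 3dClustSim) for family-wise error correction.
```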

In addition, amygdala ROIs in both hemispheres were created from the Talairach and Tournoux atlas [57]. This ROI analysis (S3 Dataset) was performed based on our a priori hypothesis that amygdala function would be associated with both truth telling and risk propensity, as discussed earlier. The ROIs were resampled to the resolution of the normalized functional images, clipping off voxels that were less than 50% occupied. The beta values within each ROI were averaged for each of the Sender’s decision-making epochs using 3dROIstats in AFNI. A three-way repeated-measures ANCOVA was then used, with the average BOLD signal within each ROI as the dependent variable, the advice choice and hemisphere as within-subject variables, and the RTI score as a between-subject covariate.
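As a rough illustration of this step, the ROI analysis can be sketched as follows. Note that this sketch approximates the repeated-measures ANCOVA with a subject-level mixed model on long-format data, which is not the exact procedure used in the study, and the column names are placeholders.

```python
# Sketch of the amygdala ROI analysis; the repeated-measures ANCOVA is
# approximated here by a mixed model on long-format data (one row per
# subject x advice choice x hemisphere). Column names are placeholders.
import pandas as pd
import statsmodels.formula.api as smf

def amygdala_ancova(df: pd.DataFrame):
    """df columns: subject, advice (+1/-1), hemisphere ('L'/'R'), rti, roi_beta."""
    model = smf.mixedlm(
        "roi_beta ~ advice * C(hemisphere) + rti",  # RTI as between-subject covariate
        data=df,
        groups=df["subject"],
    )
    return model.fit()
```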

Results

Behavioral analysis

The number of true-advice choices was not affected by age (ρ(11) = −0.05, p > 0.05), gender (F(1,11) = 1.55, p > 0.05), or educational level (F(2,10) = 2.68, p > 0.05). Moreover, Table 2 shows that the number of true-advice choices was not significantly correlated with the total RTI score. In the subscale analyses, however, the number of true-advice choices was correlated only with recreational risks; Senders with high recreational risk scores tended to deliver true advice more often than those with low recreational risk scores. The other dimensions of risk-taking behavior were not correlated with the number of true-advice choices.

Table 2. Spearman’s Rank Correlations between RTI Scores and the Number of True Advice Choices (N = 13).

https://doi.org/10.1371/journal.pone.0137014.t002

Functional MRI analysis of the whole brain

Our LME whole-brain analysis revealed main effects of both advice choice and the standardized RTI score on functional brain activity (see Table 3). Specifically, when subjects served as the Sender and chose true advice, the following brain regions showed increased hemodynamic activity: bilateral anterior rostral medial prefrontal cortex (armPFC), bilateral middle frontal cortex, right temporoparietal junction (TPJ), bilateral frontopolar prefrontal cortex (fpPFC), and right precuneus (see Fig 2). Furthermore, subjects with higher RTI scores showed decreased right middle frontal cortex activity during decision-making as the Sender (Fig 3). Finally, there was no interaction effect between advice choice and the RTI score.

Table 3. Linear Mixed-Effects Models of the Whole-Brain Analysis of BOLD Responses in Relation to the Advice Choice and Risk Propensity (N = 13).

https://doi.org/10.1371/journal.pone.0137014.t003

Fig 2. The main effects of the choices of true advice (relative to false advice) on functional brain activation when subjects played the role of the Sender.

R = right hemisphere; armPFC = anterior rostral medial prefrontal cortex; PCUN = precuneus; TPJ = temporoparietal junction; MFC = middle frontal cortex; fpPFC = frontopolar prefrontal cortex.

https://doi.org/10.1371/journal.pone.0137014.g002

Fig 3. The main effects of the total RTI score on functional brain activation when subjects played the role of the Sender.

R = right hemisphere; MFC = middle frontal cortex.

https://doi.org/10.1371/journal.pone.0137014.g003

Functional MRI analysis of the amygdala

Table 4 shows the results for amygdala activity during decision-making as the Sender. Note that two subjects were excluded from this analysis because they never chose false advice across trials. Our results revealed that the advice choice was not associated with bilateral amygdala activity. In contrast, the effect of the RTI score on amygdala activity (across hemispheres) was significant. A follow-up Spearman correlation indicated that the RTI score was negatively associated with amygdala activity (ρ(8) = −0.76, p < 0.01); subjects with higher RTI scores exhibited decreased amygdala activity during decision-making as the Sender, regardless of advice choice or hemisphere (see Fig 4).

Table 4. Repeated measures ANCOVA of the Amygdala BOLD Responses in Relation to the Advice Choice, Risk Propensity, and Hemisphere (N = 11).

https://doi.org/10.1371/journal.pone.0137014.t004

Fig 4. The effects of the RTI score on functional amygdala activation during decision-making of advice when subjects played the role of the Sender.

The y-axis represents β coefficients. R = right hemisphere.

https://doi.org/10.1371/journal.pone.0137014.g004

Discussion

We conducted a novel exploration of functional brain responses to truth telling and risk propensity under asymmetric information. We hypothesized that when subjects performed as the Sender, they would show functional brain changes, especially in the amygdala, during the choices of true advice, depending on their risk propensity. The study revealed three major findings.

First, overall risk propensity was not significantly correlated with the choices of true advice. This was somewhat surprising because previous studies have suggested a relationship between trusting behavior and risk propensity [1, 50]. Our failure to find this correlation might be due to the small sample size. However, when the different types of risk propensity were tested separately, the recreational risk-taking subscale was significantly correlated with the choices of true advice. That is, the higher individuals scored on the recreational aspect of risk propensity (e.g., rock-climbing, scuba diving), the more likely they were to tell the truth to the partner. This is reasonable because some studies have reported that individuals with high recreational risk taking are inclined to show openness and curiosity toward novel situations [62, 63], which might motivate them to pursue adventurous interactions with a stranger and to tell the truth to the stranger. Future research needs to scrutinize the relationship between the specific domain of recreational risks and truth telling.

Our second major finding was that whether subjects told the truth or lied under asymmetric information was associated not only with the behavioral factor of recreational risk propensity but also with functional cortical activation. When subjects played the role of the Sender, several cortical regions showed increased functional activation in response to the choices of true advice, specifically the bilateral armPFC, bilateral middle frontal cortex, right TPJ, bilateral fpPFC, and right precuneus. These findings suggest that the bilateral prefrontal cortex, as well as the TPJ and precuneus in the right hemisphere, may be involved in the neural processing of truth telling under asymmetric information. Consistent with these fMRI results, previous studies using voluntary trust games have shown that increased activation in the armPFC is related to both reciprocity and trust [24, 25], and that increased activation in the bilateral fpPFC and right TPJ is exclusively associated with trusting behavior [25]. Note that voluntary trust games assess trusting behavior under perfect information, where no player has more or less information than another. In contrast, our Communication Game measured truth-telling behavior under asymmetric information, where the Sender had superior information regarding the available payment pairs and served as an advisor for the partner. Despite these different relational contexts, we obtained common neuroimaging evidence across the studies. In other words, our results extend previous findings by showing that the bilateral armPFC, bilateral fpPFC, and right TPJ are involved not only in decisions to trust under perfect information but also in decisions to tell the truth under asymmetric information. This suggests the possibility that the bilateral armPFC, bilateral fpPFC, and right TPJ are general neuro-biomarkers for decisions to tell the truth in economic transactions.

The armPFC and TPJ are connected to each other anatomically [64, 65] and functionally [66], and these regions, along with the precuneus, have been considered parts of the neural network for the mentalizing system [66–68] or theory of mind (ToM) [69–71], both of which refer to the ability to attribute, reason about, and represent the mental states of another person [72]. While the neural network of the mentalizing system/ToM consists of multiple brain regions [67, 69, 70], the armPFC, TPJ, and precuneus seem to be particularly important for philosophical reasoning and trust in communication, such as true/false belief reasoning [73], beliefs in moral judgment [74], understanding and predicting other people’s intentions [75], and cooperation and deception [76, 77]. The current findings of functional responses to the choices of true advice in the armPFC, TPJ, and precuneus are consistent with the above studies and suggest that the neural mechanism underlying the mentalizing system/ToM presumably influences decisions to tell the truth under asymmetric information.

The above findings also suggest that dysfunction of the armPFC, TPJ, and/or precuneus may be a sign of high risk for social communication problems regarding honesty and deception. For example, autistic children show atypically decreased responses to ToM tasks in the medial PFC and TPJ [78, 79] and exhibit impaired ToM-related behaviors [80, 81], such as “too much honesty/truth telling” [82]. Although speculative, these studies, as well as our current results, may suggest that increased PFC-TPJ responses to truth telling play a role in executing context-appropriate moral judgments of truth telling.

In addition to the armPFC, TPJ, and precuneus, our results also identified greater functional activation in the fpPFC in both hemispheres during the choices of true advice. The fpPFC partly serves the representation of long-term, goal-oriented sequences of multiple social events and rules [83], such as subgoal processing in problem-solving (i.e., procedural planning of achieving a first subgoal before higher-order subgoals are satisfied) [84]. Thus, our results might indicate that decisions to tell the truth were motivated by the long-term prospects of the consequences and benefits of repeatedly delivering true advice, which was reflected in greater activation in the fpPFC during the choices of true advice in the present study [25, 83].

Furthermore, we found increased activity in the bilateral middle frontal cortex when subjects chose true advice. A previous study has reported that this cortical area, especially Brodmann area 8, shows greater activation as individuals experience increased uncertainty in social events [85]. Our Communication Game indeed created some degree of uncertainty in the social interactions between the Sender and the Receiver because, for example, the Sender was uncertain about whether the partner would follow her/his advice. Hence, our finding of increased activity in the middle frontal cortex in the Senders might reflect their uncertainty about the partner’s subsequent decision to trust. However, our results additionally revealed that the right middle frontal cortex showed decreased activity in individuals with high risk propensity. This might reflect that, compared with low risk-takers, high risk-takers were less concerned with uncertainty or the risk of a negative outcome in the Communication Game, such as their advice being rejected by the partner.

The third main finding was the association of the amygdala response with risk propensity, regardless of advice choice. Subjects high in risk propensity showed bilaterally decreased amygdala activity during the decision-making of advice for the partner. Although this finding was not directly related to our primary interest in truth telling, it may suggest that low risk-takers tend to have increased amygdala activity during social interactions. A relationship between the amygdala and social perception has been reported, such that increased activation in the amygdala is found in response to untrustworthy faces [11–13, 15], although the relationship may not be monotonic [14, 15]. Therefore, it is possible that, compared with high risk-takers, low risk-takers show a negative bias in social perception of the partner (e.g., perceiving the partner as more untrustworthy) during social interactions.

Alternatively, the difference in amygdala activity during the choices of advice between low and high risk-takers may reflect the intensity of their social vigilance. Previous studies have shown that increased amygdala activation is related to increased social vigilance [12, 86]. In the present study, low risk-takers showed greater amygdala activation during the choices of advice than high risk-takers, suggesting that low risk-takers possibly experienced greater vigilance toward the partner, whereas high risk-takers may have shown a blunted amygdala response to such social vigilance. Future research needs to examine the relationship between risk propensity, social perception/vigilance, and functional amygdala activity during social interactions.

In sum, our neuroimaging results revealed that truth telling was associated with increased activation in the armPFC, TPJ, precuneus, middle frontal cortex, and fpPFC. We suggest that these identified brain regions may be biomarkers for truth telling under asymmetric information, which may be applicable to practical settings. For instance, trust in the therapeutic relationship between a mental health practitioner and a patient, called therapeutic alliance, is critical for predicting treatment success [8, 9, 29–33]. To build a therapeutic alliance, both the practitioner and the patient are required to tell the truth under asymmetric information. That is, while the practitioner needs to tell the truth about the diagnosis, prognosis, and/or treatment, the patient also needs to tell the truth about her/his symptoms. By measuring and presenting the brain activity identified in the present study (especially the practitioner’s activity), it may be possible to overcome a patient’s skepticism and to convince the patient to follow the practitioner’s advice, which would contribute to the development of therapeutic alliance.

Although our results have novel and significant implications for the neural activity underlying truth telling in transactions with asymmetric information, the current study had limitations. Our data were based on N = 13 subjects, which is not a large sample. Thus, it may be necessary to confirm our findings with a larger sample, although the number of false positives should not be affected by a small sample size [87]. Another limitation was that most subjects delivered true advice more often than false advice (see Table 1), which might slightly bias the BOLD signal contrasts between the two conditions; this biased behavioral pattern is, however, consistent with previous findings [26]. Similarly, when subjects acted as the Receiver, the majority overwhelmingly followed the advice. Consequently, it was difficult to obtain BOLD contrasts between following advice (86.1% of all trials on average) and not following advice (12.6% of all trials on average), although this analysis would also be of interest. Future studies need to increase behavioral variance to balance the frequency of responses as the Sender and the Receiver, as well as to examine functional brain activity during performance as the Receiver.

In conclusion, the present study examined brain activity in relation to truth telling and risk propensity under asymmetric information. We found that an increased frequency of truth telling was associated with higher recreational risk propensity. In addition, our LME whole-brain analysis revealed that truth telling led to greater functional activation in the bilateral armPFC, bilateral fpPFC, bilateral middle frontal cortex, right TPJ, and right precuneus, while right middle frontal cortex activity was additionally influenced by risk propensity. Finally, there was a significant effect of risk propensity on the amygdala response during the decision-making of advice, such that low risk-takers showed an elevated amygdala response. This study provides social implications regarding the neural system for truth telling under asymmetric information.

Supporting Information

S1 Dataset. Demographic and Behavioral Data.

The variable for “sex” is coded as 1 = female and 2 = male. The variable for “education” is coded as 5 = Some college or technical school (at least one year), 6 = College graduate, and 7 = Graduate professional training (Masters or above).

https://doi.org/10.1371/journal.pone.0137014.s001

(XLSX)

S2 Dataset. Functional Whole Brain Data.

The variable for “advice selection” is coded as -1 = false advice and 1 = true advice. All values of functional brain activity indicate average β of BOLD signals at the peak voxel within each identified region.

https://doi.org/10.1371/journal.pone.0137014.s002

(XLSX)

S3 Dataset. Functional Amygdala Data.

The variable for “advice selection” is coded as -1 = false advice and 1 = true advice. All values of functional amygdala activity indicate average β of BOLD signals at the peak voxel within each hemisphere of the amygdala.

https://doi.org/10.1371/journal.pone.0137014.s003

(XLSX)

S1 Table. Probabilities of the Computer-Programmed Partner’s Responses.

In the present study, the partner was played by a computer program on all trials; the partner was programmed to choose true/false advice or to follow/disregard advice with varying probabilities, depending on the subject’s response in the previous trial, with details presented in this table.

https://doi.org/10.1371/journal.pone.0137014.s004

(DOCX)

Acknowledgments

This research was supported by the Laureate Institute for Brain Research and the William K. Warren Foundation. The funders had no role in the study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Author Contributions

Conceived and designed the experiments: JB FK HS MM. Performed the experiments: HS MM JB. Analyzed the data: HS MM. Contributed reagents/materials/analysis tools: MM FK JB. Wrote the paper: HS JB FK MM. Contributed during the whole research project, discussed research concept and design, data acquisition, and data processing strategies: HS MM FK JB.

References

  1. Das TK, Teng B-S. The risk-based view of trust: A conceptual framework. Journal of Business and Psychology. 2004;19(1):85–116.
  2. De Jong BA, Dirks KT. Beyond shared perceptions of trust and monitoring in teams: Implications of asymmetry and dissensus. Journal of Applied Psychology. 2012;97(2):391–406. pmid:22181679
  3. Lusher D, Kremer P, Robins G. Cooperative and competitive structures of trust relations in teams. Small Group Research. 2014;45(1):3–36.
  4. La Porta R, Lopez-de-Silanes F, Shleifer A, Vishny R. Trust in large organizations. American Economic Review Papers and Proceedings. 1997;87(2):333–8.
  5. Hansen MH, Morrow JL Jr, Batista JC. The impact of trust on cooperative membership retention, performance, and satisfaction: An exploratory study. International Food and Agribusiness Management Review. 2002;5:41–59.
  6. Staples DS, Webster J. Exploring the effects of trust, task interdependence and virtualness on knowledge sharing in teams. Information Systems Journal. 2008;18(6):617–40.
  7. Dirks KT. The effects of interpersonal trust on work group performance. The Journal of applied psychology. 1999;84(3):445–55. pmid:10380424.
  8. Lambert MJ, Bergin AF. The effectiveness of psychotherapy. In: Bergin AE, Garfield SL, editors. Handbook of psychotherapy and behavior change. 4th ed. Oxford, UK: Wiley; 1994. p. 143–89.
  9. Ledley DR, Marx BP, Heimberg RG. Making cognitive-behavioral theory work: Clinical process for new practitioners. 2nd ed. New York, NY: Guilford Press; 2010.
  10. Winston JS, Strange BA, O'Doherty J, Dolan RJ. Automatic and intentional brain responses during evaluation of trustworthiness of faces. Nature neuroscience. 2002;5(3):277–83. pmid:11850635.
  11. Engell AD, Haxby JV, Todorov A. Implicit trustworthiness decisions: automatic coding of face properties in the human amygdala. Journal of cognitive neuroscience. 2007;19(9):1508–19. pmid:17714012.
  12. Todorov A, Engell AD. The role of the amygdala in implicit evaluation of emotionally neutral faces. Social cognitive and affective neuroscience. 2008;3(4):303–12. pmid:19015082; PubMed Central PMCID: PMC2607057.
  13. Platek SM, Krill AL, Wilson B. Implicit trustworthiness ratings of self-resembling faces activate brain centers involved in reward. Neuropsychologia. 2009;47(1):289–93. pmid:18761362.
  14. Said CP, Baron SG, Todorov A. Nonlinear amygdala response to face trustworthiness: contributions of high and low spatial frequency information. Journal of cognitive neuroscience. 2009;21(3):519–28. pmid:18564045.
  15. Freeman JB, Stolier RM, Ingbretsen ZA, Hehman EA. Amygdala responsivity to high-level social information from unseen faces. The Journal of neuroscience: the official journal of the Society for Neuroscience. 2014;34(32):10573–81. pmid:25100591.
  16. Seeley WW, Menon V, Schatzberg AF, Keller J, Glover GH, Kenna H, et al. Dissociable intrinsic connectivity networks for salience processing and executive control. The Journal of neuroscience: the official journal of the Society for Neuroscience. 2007;27(9):2349–56. pmid:17329432; PubMed Central PMCID: PMC2680293.
  17. Eckert MA, Menon V, Walczak A, Ahlstrom J, Denslow S, Horwitz A, et al. At the heart of the ventral attention system: the right anterior insula. Human brain mapping. 2009;30(8):2530–41. pmid:19072895; PubMed Central PMCID: PMC2712290.
  18. Berg J, Dickhaut J, McCabe K. Trust, reciprocity, and social history. Games and Economic Behavior. 1995;10(1):122–42.
  19. King-Casas B, Tomlin D, Anen C, Camerer CF, Quartz SR, Montague PR. Getting to know you: reputation and trust in a two-person economic exchange. Science. 2005;308(5718):78–83. pmid:15802598.
  20. Delgado MR, Frank RH, Phelps EA. Perceptions of moral character modulate the neural systems of reward during the trust game. Nature neuroscience. 2005;8(11):1611–8. pmid:16222226.
  21. Moretto G, Sellitto M, di Pellegrino G. Investment and repayment in a trust game after ventromedial prefrontal damage. Frontiers in human neuroscience. 2013;7:593. pmid:24093013; PubMed Central PMCID: PMC3782646.
  22. Koscik TR, Tranel D. The human amygdala is necessary for developing and expressing normal interpersonal trust. Neuropsychologia. 2011;49(4):602–11. pmid:20920512; PubMed Central PMCID: PMC3056169.
  23. van Honk J, Eisenegger C, Terburg D, Stein DJ, Morgan B. Generous economic investments after basolateral amygdala damage. Proceedings of the National Academy of Sciences of the United States of America. 2013;110(7):2506–10. pmid:23341614; PubMed Central PMCID: PMC3574920.
  24. Krueger F, McCabe K, Moll J, Kriegeskorte N, Zahn R, Strenziok M, et al. Neural correlates of trust. Proceedings of the National Academy of Sciences of the United States of America. 2007;104(50):20084–9. pmid:18056800; PubMed Central PMCID: PMC2148426.
  25. Krueger F, Grafman J, McCabe K. Neural correlates of economic game playing. Philosophical transactions of the Royal Society of London Series B, Biological sciences. 2008;363(1511):3859–74. pmid:18829425; PubMed Central PMCID: PMC2581786.
  26. Rode J. Truth and trust in communication: Experiments on the effect of a competitive context. Games and Economic Behavior. 2010;68(1):325–38.
  27. Akerlof G. The market for "Lemons": Quality uncertainty and the market mechanism. Quarterly Journal of Economics. 1970;84(3):488–500.
  28. Nayyar PR. Information asymmetries: A source of competitive advantage for diversified service firms. Strategic Management Journal. 1990;11:513–9.
  29. Safran J, Muran JC. Negotiating the therapeutic alliance. New York: The Guilford Press; 2000.
  30. Ahn H, Wampold B. Where oh where are the specific ingredients?: A meta-analysis of component studies in counseling and psychotherapy. Journal of Counseling Psychology. 2001;48(3):251–7.
  31. Luborsky L, Barber JP, Siqueland L, McLellan AT, Woody G. Establishing a therapeutic alliance with substance abusers. NIDA research monograph. 1997;165:233–44. pmid:9243553.
  32. Krupnick JL, Sotsky SM, Simmens S, Moyer J, Elkin I, Watkins J, et al. The role of the therapeutic alliance in psychotherapy and pharmacotherapy outcome: findings in the National Institute of Mental Health Treatment of Depression Collaborative Research Program. Journal of consulting and clinical psychology. 1996;64(3):532–9. pmid:8698947.
  33. Joe GW, Simpson DD, Dansereau DF, Rowan-Szal GA. Relationships between counseling rapport and drug abuse treatment outcomes. Psychiatric services. 2001;52(9):1223–9. pmid:11533397.
  34. Greene JD, Paxton JM. Patterns of neural activity associated with honest and dishonest moral decisions. Proceedings of the National Academy of Sciences of the United States of America. 2009;106(30):12506–11. pmid:19622733; PubMed Central PMCID: PMC2718383.
  35. Parks CD, Hulbert LG. High and low trusters' responses to fear in a payoff matrix. Journal of Conflict Resolution. 1995;39:718–30.
  36. Bohnet I, Zeckhauser R. Trust, risk and betrayal. Journal of Economic Behavior & Organization. 2004;55:467–84.
  37. Crawford V, Sobel J. Strategic information transmission. Econometrica. 1982;50(6):1431–51.
  38. Gneezy U. Deception: The role of consequences. American Economic Review. 2005;95(1):384–94.
  39. Baas D, Aleman A, Vink M, Ramsey NF, de Haan EH, Kahn RS. Evidence of altered cortical and amygdala activation during social decision-making in schizophrenia. NeuroImage. 2008;40(2):719–27. pmid:18261933.
  40. Pinkham AE, Hopfinger JB, Pelphrey KA, Piven J, Penn DL. Neural bases for impaired social cognition in schizophrenia and autism spectrum disorders. Schizophrenia research. 2008;99(1–3):164–75. pmid:18053686; PubMed Central PMCID: PMC2740744.
  41. Couture SM, Penn DL, Roberts DL. The functional significance of social cognition in schizophrenia: a review. Schizophrenia bulletin. 2006;32 Suppl 1:S44–63. pmid:16916889; PubMed Central PMCID: PMC2632537.
  42. Adolphs R, Sears L, Piven J. Abnormal processing of social information from faces in autism. Journal of cognitive neuroscience. 2001;13(2):232–40. pmid:11244548.
  43. Hooker CI, Tully LM, Verosky SC, Fisher M, Holland C, Vinogradov S. Can I trust you? Negative affective priming influences social judgments in schizophrenia. Journal of abnormal psychology. 2011;120(1):98–107. pmid:20919787; PubMed Central PMCID: PMC3170843.
  44. Davis M, Whalen PJ. The amygdala: vigilance and emotion. Molecular psychiatry. 2001;6(1):13–34. pmid:11244481.
  45. Whalen PJ. The uncertainty of it all. Trends in cognitive sciences. 2007;11(12):499–500. pmid:18024182.
  46. Yan P, Li CS. Decreased amygdala activation during risk taking in non-dependent habitual alcohol users: A preliminary fMRI study of the stop signal task. The American journal of drug and alcohol abuse. 2009;35(5):284–9. pmid:19579091.
  47. Orsini CA, Trotta RT, Bizon JL, Setlow B. Dissociable roles for the basolateral amygdala and orbitofrontal cortex in decision-making under risk of punishment. The Journal of neuroscience: the official journal of the Society for Neuroscience. 2015;35(4):1368–79. pmid:25632115; PubMed Central PMCID: PMC4308589.
  48. Crowley TJ, Dalwani MS, Mikulich-Gilbertson SK, Du YP, Lejuez CW, Raymond KM, et al. Risky decisions and their consequences: neural processing by boys with Antisocial Substance Disorder. PloS one. 2010;5(9):e12835. pmid:20877644; PubMed Central PMCID: PMC2943904.
  49. Li CS, Luo X, Yan P, Bergquist K, Sinha R. Altered impulse control in alcohol dependence: neural measures of stop signal performance. Alcoholism, clinical and experimental research. 2009;33(4):740–50. pmid:19170662; PubMed Central PMCID: PMC2697053.
  50. Cook KS, Yamagishi T, Cheshire C, Cooper R, Matsuda M, Mashima R. Trust building via risk taking: A cross-societal experiment. Social Psychology Quarterly. 2005;68(2):121–42.
  51. First M. Structured Clinical Interview for DSM-IV-TR Axis I Disorders—Non-Patient Edition (SCID-I/NP). New York, NY: New York State Psychiatric Institute; January, 2010.
  52. Holt C, Laury S. Risk aversion and incentive effects. The American Economic Review. 2002;92(5):1644–55.
  53. Tomlin D, Kayali MA, King-Casas B, Anen C, Camerer CF, Quartz SR, et al. Agent-specific responses in the cingulate cortex during economic exchanges. Science. 2006;312(5776):1047–50. pmid:16709783.
  54. Bodurka J, Ledden PJ, van Gelderen P, Chu R, de Zwart JA, Morris D, et al. Scalable multichannel MRI data acquisition system. Magnetic resonance in medicine. 2004;51(1):165–71. pmid:14705057.
  55. Cox RW. AFNI: software for analysis and visualization of functional magnetic resonance neuroimages. Computers and biomedical research, an international journal. 1996;29(3):162–73. pmid:8812068.
  56. Glover GH, Li TQ, Ress D. Image-based method for retrospective correction of physiological motion effects in fMRI: RETROICOR. Magnetic resonance in medicine. 2000;44(1):162–7. pmid:10893535.
  57. Talairach J, Tournoux P. Co-planar stereotaxic atlas of the human brain: 3-dimensional proportional system: an approach to cerebral imaging. Stuttgart: Georg Thieme; 1988. 122 p.
  58. Avants BB, Epstein CL, Grossman M, Gee JC. Symmetric diffeomorphic image registration with cross-correlation: evaluating automated labeling of elderly and neurodegenerative brain. Medical image analysis. 2008;12(1):26–41. pmid:17659998; PubMed Central PMCID: PMC2276735.
  59. Rissman J, Gazzaley A, D'Esposito M. Measuring functional connectivity during distinct stages of a cognitive task. NeuroImage. 2004;23(2):752–63. pmid:15488425.
  60. Pinheiro JC, Bates DM. Mixed-effects models in S and S-PLUS. Chambers J, Eddy W, Hardle W, Sheather S, Tierney L, editors. New York: Springer; 2000. 528 p.
  61. R Development Core Team. R: A language and environment for statistical computing. Vienna, Austria: The R Foundation for Statistical Computing; 2011.
  62. Weller JA, Tikir A. Predicting domain-specific risk taking with the HEXACO personality structure. Journal of Behavioral Decision Making. 2011;24:180–201.
  63. Soane E, Dewberry C, Narendran S. The role of perceived costs and perceived benefits in the relationship between personality and risk-related choices. Journal of Risk Research. 2010;13(3):303–18.
  64. Barbas H, Ghashghaei H, Dombrowski SM, Rempel-Clower NL. Medial prefrontal cortices are unified by common connections with superior temporal cortices and distinguished by input from memory-related areas in the rhesus monkey. The Journal of comparative neurology. 1999;410(3):343–67. pmid:10404405.
  65. Bachevalier J, Meunier M, Lu MX, Ungerleider LG. Thalamic and temporal cortex input to medial prefrontal cortex in rhesus monkeys. Experimental brain research. 1997;115(3):430–44. pmid:9262198.
  66. Burnett S, Blakemore SJ. Functional connectivity during a social emotion task in adolescents and in adults. The European journal of neuroscience. 2009;29(6):1294–301. pmid:19302165; PubMed Central PMCID: PMC2695858.
  67. Frith U, Frith CD. Development and neurophysiology of mentalizing. Philosophical transactions of the Royal Society of London Series B, Biological sciences. 2003;358(1431):459–73. pmid:12689373; PubMed Central PMCID: PMC1693139.
  68. Burnett S, Bird G, Moll J, Frith C, Blakemore SJ. Development during adolescence of the neural processing of social emotion. Journal of cognitive neuroscience. 2009;21(9):1736–50. pmid:18823226.
  69. Mahy CE, Moses LJ, Pfeifer JH. How and where: theory-of-mind in the brain. Developmental cognitive neuroscience. 2014;9:68–81. pmid:24552989.
  70. Carrington SJ, Bailey AJ. Are there theory of mind regions in the brain? A review of the neuroimaging literature. Human brain mapping. 2009;30(8):2313–35. pmid:19034900.
  71. Gobbini MI, Koralek AC, Bryan RE, Montgomery KJ, Haxby JV. Two takes on the social brain: a comparison of theory of mind tasks. Journal of cognitive neuroscience. 2007;19(11):1803–14. pmid:17958483.
  72. Amodio DM, Frith CD. Meeting of minds: the medial frontal cortex and social cognition. Nature reviews Neuroscience. 2006;7(4):268–77. pmid:16552413.
  73. Sommer M, Dohnel K, Sodian B, Meinhardt J, Thoermer C, Hajak G. Neural correlates of true and false belief reasoning. NeuroImage. 2007;35(3):1378–84. pmid:17376703.
  74. Young L, Saxe R. The neural basis of belief encoding and integration in moral judgment. NeuroImage. 2008;40(4):1912–20. pmid:18342544.
  75. Ciaramidaro A, Adenzato M, Enrici I, Erk S, Pia L, Bara BG, et al. The intentional network: how the brain reads varieties of intentions. Neuropsychologia. 2007;45(13):3105–13. pmid:17669444.
  76. Lissek S, Peters S, Fuchs N, Witthaus H, Nicolas V, Tegenthoff M, et al. Cooperation and deception recruit different subsets of the theory-of-mind network. PloS one. 2008;3(4):e2023. pmid:18431500; PubMed Central PMCID: PMC2295259.
  77. Ganis G, Kosslyn SM, Stose S, Thompson WL, Yurgelun-Todd DA. Neural correlates of different types of deception: an fMRI investigation. Cerebral cortex. 2003;13(8):830–6. pmid:12853369.
  78. O'Nions E, Sebastian CL, McCrory E, Chantiluke K, Happe F, Viding E. Neural bases of Theory of Mind in children with autism spectrum disorders and children with conduct problems and callous-unemotional traits. Developmental science. 2014;17(5):786–96. pmid:24636205; PubMed Central PMCID: PMC4316185.
  79. Kana RK, Keller TA, Cherkassky VL, Minshew NJ, Just MA. Atypical frontal-posterior synchronization of Theory of Mind regions in autism during mental state attribution. Social neuroscience. 2009;4(2):135–52. pmid:18633829; PubMed Central PMCID: PMC3086301.
  80. Baron-Cohen S, Leslie AM, Frith U. Does the autistic child have a "theory of mind"? Cognition. 1985;21(1):37–46. pmid:2934210.
  81. Broekhof E, Ketelaar L, Stockmann L, van Zijp A, Bos MG, Rieffe C. The understanding of intentions, desires and beliefs in young children with autism spectrum disorder. Journal of autism and developmental disorders. 2015. pmid:25636676.
  82. Jaarsma P, Gelhaus P, Welin S. Living the categorical imperative: autistic perspectives on lying and truth telling-between Kant and care ethics. Medicine, health care, and philosophy. 2012;15(3):271–7. pmid:22065242.
  83. Wood JN, Grafman J. Human prefrontal cortex: processing and representational perspectives. Nature reviews Neuroscience. 2003;4(2):139–47. pmid:12563285.
  84. Braver TS, Bongiolatti SR. The role of frontopolar cortex in subgoal processing during working memory. NeuroImage. 2002;15(3):523–36. pmid:11848695.
  85. Volz KG, Schubotz RI, von Cramon DY. Variants of uncertainty in decision-making and their neural correlates. Brain research bulletin. 2005;67(5):403–12. pmid:16216687.
  86. Whalen PJ, Shin LM, McInerney SC, Fischer H, Wright CI, Rauch SL. A functional MRI study of human amygdala responses to facial expressions of fear versus anger. Emotion. 2001;1(1):70–83. pmid:12894812.
  87. Friston K. Ten ironic rules for non-statistical reviewers. NeuroImage. 2012;61(4):1300–10. pmid:22521475.