
Loneliness and Hypervigilance to Social Cues in Females: An Eye-Tracking Study

  • Gerine M. A. Lodder,

    g.lodder@pwo.ru.nl

    Affiliation Behavioural Science Institute, Radboud University, Nijmegen, The Netherlands

  • Ron H. J. Scholte,

    Current address: Praktikon, Nijmegen, The Netherlands

    Affiliation Behavioural Science Institute, Radboud University, Nijmegen, The Netherlands

  • Ivar A. H. Clemens,

    Affiliation Donders Institute for Brain, Cognition, and Behaviour, Radboud University, Nijmegen, The Netherlands

  • Rutger C. M. E. Engels,

    Current address: The Trimbos Institute, Netherlands Institute of Mental Health and Addiction, Utrecht, The Netherlands

    Affiliation Behavioural Science Institute, Radboud University, Nijmegen, The Netherlands

  • Luc Goossens,

    Affiliation Research Group School Psychology and Child and Adolescent Development, KU Leuven, Leuven, Belgium

  • Maaike Verhagen

    Affiliation Behavioural Science Institute, Radboud University, Nijmegen, The Netherlands

Abstract

The goal of the present study was to examine whether lonely individuals differ from nonlonely individuals in their overt visual attention to social cues. Previous studies showed that loneliness was related to biased post-attentive processing of social cues (e.g., negative interpretation bias), but research on whether lonely and nonlonely individuals also differ in an earlier information processing stage (gazing behavior) is very limited. A sample of 25 lonely and 25 nonlonely students took part in an eye-tracking study consisting of four tasks. We measured gazing (duration, number of fixations, and first fixation) at the eyes, nose, and mouth region of faces expressing emotions (Task 1), at emotion quadrants (anger, fear, happiness, and a neutral expression) (Task 2), at quadrants with positive and negative social and nonsocial images (Task 3), and at the facial area of actors in video clips with positive and negative content (Task 4). In general, participants tended to gaze most often and longest at areas that conveyed most social information, such as the eye region of the face (Task 1) and social images (Task 3). Participants gazed most often and longest at happy faces in still images (Task 2), and more often and longer at the facial area in negative than in positive video clips (Task 4). No differences emerged between lonely and nonlonely participants in gaze duration, number of fixations, or first fixations on social cues in any of the four tasks. Based on this study, we found no evidence that overt visual attention to social cues differs between lonely and nonlonely individuals. This implies that biases in social information processing of lonely individuals may be limited to other phases of social information processing. Alternatively, biased overt attention to social cues may only occur under specific conditions, for specific stimuli, or for specific lonely individuals.

Introduction

Loneliness, defined as a negative emotional response to a discrepancy between the desired and actual quality or quantity of interpersonal relationships [1], can have severe consequences for physical and mental health, including higher morbidity and mortality [2]. Earlier research suggested that loneliness may be related to heightened attention for social cues in the environment, which can be used to prevent social rejection and promote opportunity for inclusion [3,4,5]. Various types of psychopathology related to loneliness such as depression and social anxiety have also been linked to biased processing of social information. Depression is mainly linked to biases in post-attentive processing such as remembering negative events more clearly, and social anxiety is primarily linked to selective visual attention to negative social cues (i.e., hypervigilance for negative social cues) [6]. So far, research on the link between loneliness and processing of social cues has predominantly focused on the post-attentive processing stage. For instance, loneliness has been linked to negative interpretation of social cues and enhanced memory for social cues [4,7]. However, research on the relation between loneliness and visual attention (i.e., gazing at social cues), which is an earlier step of social information processing, is very limited. It is thus unclear whether biased perception of social cues in lonely individuals is merely related to biased post-attentive processing of (negative) social cues, similar to depression, or also to biased visual attention to negative social cues, similar to social anxiety [8]. In order to gain insight into this stage of social information processing, the goal of the present study was to examine whether lonely individuals show hypervigilance to negative social cues in terms of increased overt visual attention to social cues.

Social information processing consists of several steps that are interrelated [9]. Roughly, a distinction can be made between pre-attentive evaluation of cues (e.g., automatic detection of threatening cues in the peripheral vision), the allocation of attention to cues (e.g., overt visual attention or gazing at cues, and covert attention to cues), post-attentive evaluation of cues (e.g., comparing information with memory and interpretation), and the response to cues (e.g., sustained attention to the cue) [8,9]. Biased processing of social information can occur at each stage of social processing. Biases in different stages of social information processing have been linked to distinct forms of psychopathology, indicating that different mechanisms of biased information processing may underlie each of these forms of psychopathology [6,8]. This distinction is important, because knowledge of the stages in which biased processing of social cues occurs may lead to intervention opportunities. For example, if loneliness is indeed related to biased visual attention to negative social information, training lonely individuals to attend to these cues less (i.e., attention modification training) [10] may be a useful tool, whereas this is not the case if loneliness is not related to biased attention to social cues.

Social needs models of loneliness indicate that the need to belong, a fundamental need to initiate and maintain close relationships with others, is at the core of loneliness, because loneliness arises when this need is not met [4,11–13]. Feelings of unsatisfactory social connections may instigate the tendency to restore the level of belongingness [14]. One way in which individuals can restore their level of belonging is by changing the way they attend to, perceive, and react to social cues in their environment [4,11]. Social cues can inform an individual about potential rejection or acceptance. According to the hypervigilance theory, lonely individuals may have a heightened focus specifically on negative social cues, because these cues signal rejection [3,11,15,16]. Research has provided ample evidence that loneliness is related to biases in post-attentive processing of social cues. For instance, loneliness was related to increased memory for both positive and negative social events [4], more negative expectations about the way one is perceived [17,18] and more negative perceptions of social interactions [19,20]. Research on one of the first steps of social processing, overt visual attention to (i.e., gazing at) social cues, is very limited [5]. It is therefore unclear whether loneliness is exclusively related to enhanced memory for and negative interpretation of social cues, or also to hypervigilance to negative social information (i.e., biased visual attention to negative social cues).

On the one hand, based on the hypervigilance theory for loneliness, we could assume that loneliness is indeed related to hypervigilance for negative social cues [3,11]. Biases in post-attentive cognitive processing of negative social information that are apparent in lonely individuals may in fact be the result of increased visual attention to these negative cues. Earlier research showed that visual stimuli that are gazed at receive increased processing compared to stimuli that are not gazed at [21]. Thus, lonely individuals might give more overt visual attention to negative social cues, which would elicit increased processing and may in turn explain the negative interpretation of, and increased memory for, these cues by lonely individuals [4]. In line with this, a study examining gazing behavior at playground video scenes showed that higher loneliness was related to a higher likelihood of fixating first on rejection cues rather than other cues, especially for the loneliest individuals [5]. Children who were extremely lonely were found to have difficulties disengaging from rejection cues [7]. This suggests that loneliness may be related to biased overt visual attention to negative social information, in addition to biases in post-attentive processing.

On the other hand, biased processing of social cues in post-attentive processing by lonely individuals may not be the result of increased overt visual attention to social cues, but rather originate in later steps of information processing. Although the location the eyes are directed at is certainly related to what receives conscious attention [22], shifts in attention are not necessarily accompanied by shifts in gaze [23]. Earlier research showed that differences in overt visual attention could not explain enhanced memory for emotional cues compared to neutral cues [24]. In addition, threatening cues in the environment seem to be processed by the amygdala independent of whether overt visual attention was given to these cues or not [21]. This indicates that biased post-attentive processing of certain cues is not necessarily the result of biased visual attention to these cues. Indeed, interpretation of social cues is influenced not only by the amount of overt visual attention that was given to these cues, but also by expectancies and attitudes of the perceiver [9]. Although little is known about biases in visual attention to social cues by lonely individuals, earlier studies showed that, for instance in depression and borderline personality disorder, biases only occur in later stages of social information processing [6,25]. The same could be true for lonely individuals. Thus, biased processing of social cues in lonely individuals may not be due to hypervigilance for these cues, but rather originate in post-attentive processing.

The present study

In the present study, we examined whether lonely and nonlonely individuals differ in their hypervigilance to negative social cues. We used eye-tracking equipment to measure participants' gazing behavior at social cues. Eye-tracking is highly suitable to objectively assess the ways in which individuals look at—rather than interpret and represent—social cues [26]. We mainly focused on overt visual attention to facial expressions. Emotions, or more specifically facial expressions, reflect the state of mind of the person expressing them, and may thus provide information about opportunities for inclusion (for emotions with a positive valence) or provide a warning for rejection (for emotions with a negative valence) [27,28]. The face conveys these emotions and is therefore considered to be the most important source of social information [29,30].

We used four tasks to examine differences in overt visual attention to social cues between lonely and non-lonely participants. In the first task, the Face Task, we presented participants with images of neutral faces and emotional faces. Non-clinical samples of adults tend to spend most time gazing at the eyes, nose, and mouth region of the face, of which the eyes seem to be the most important for conveying social information [30–32]. As yet, it is unknown whether lonely individuals focus on different aspects of faces than nonlonely individuals. We examined whether gazing at the eye, nose, and mouth region differed between lonely and non-lonely participants. Increased attention to emotion-rich areas of the face, such as the eye region, could be a sign of hypervigilance for social cues [33].

In the second task, the Emotion Array Task, we simultaneously showed participants an array of four faces, expressing anger, fear, happiness, and a neutral expression. We examined whether participants differed in their overt visual attention to each of these emotions and the neutral expression. Increased visual attention to angry and fearful faces, which are considered to be threatening social cues [30], could be an indication of hypervigilance for threatening social information. In the third task, the Social and Nonsocial Array Task, we used an array of four different types of images, namely positive-social images, negative-social images, positive-nonsocial images, and negative-nonsocial images. Earlier research using comparable stimuli showed that the visual cortex was more strongly activated in lonely than in nonlonely people when viewing negative social images [3]. We extended this research by examining whether lonely individuals show increased overt visual attention to these cues as well, which would be a sign of hypervigilance for negative social information.

In the fourth and final task, the Video Task, we showed participants a range of video clips with a positive or negative valence, involving interactions between people. We examined whether loneliness was related to the degree to which participants gazed at the facial area of the actors in the video clips, which would be a sign of hypervigilance to social cues. Additionally, we examined whether gazing behavior differed between videos with positive or negative content.

Because depression and social anxiety are highly correlated with loneliness, and are related to biased processing of social information, we controlled for these constructs in all analyses [6,34]. We had no theoretical reason to assume that the relationship between loneliness and overt visual attention to social cues may differ between males and females [11]. In addition, most studies find no relation between loneliness and gender [35]. To increase homogeneity in the sample, we therefore used a sample consisting of females only.

Methods

Participants

Participants were recruited from a pool of college students who completed an online questionnaire that was designed as a selection questionnaire for multiple studies. The questionnaire was filled out by 515 students in exchange for course credit. The eye-tracking study took place approximately 2 months after completion of the prescreen questionnaire. In total, 25 nonlonely participants (scoring within the lowest 13% of our sample) and 26 lonely participants (scoring within the highest 10% of our sample) agreed to participate. The sample comprised female students, mainly from the social sciences, from Radboud University Nijmegen (The Netherlands) with normal or corrected-to-normal vision. Calibration could not be completed for one lonely participant; therefore, the final sample consisted of 25 lonely and 25 nonlonely participants. Due to problems in calibration and time constraints, one lonely and one nonlonely participant were unable to take part in the fourth task (Video Task). Age ranged from 18 to 24 years (M = 19.88, SD = 1.41). Lonely and nonlonely participants did not differ in age.

Procedure

We invited participants into a laboratory at the research institute, where they were seated in front of a computer screen. Participants completed four eye-tracking tasks. Before each task, we ran a 13-point calibration procedure, with the calibration points presented in random order. The four tasks were always presented in the same order. Stimuli within tasks were presented in a random order, which was determined using the Random Number Generator in SPSS. After calibration, we instructed the participants to “look at the images comfortably, as if you were watching TV”. The four eye-tracking tasks were separated by distraction tasks in which we asked participants to choose between different types of candy and interior designs. In addition, participants were allowed to walk around the room after each task was completed. After the eye-tracking tasks were completed, which took approximately 30 minutes in total, participants completed a questionnaire.

We obtained written informed consent from all participants involved in the study. The study was approved by the Radboud University’s IRB (Ethics Committee Social Sciences).

Task 1 (Face Task).

In the first task, participants viewed 50 images from the Radboud Faces Database [36] in order to examine what area of the face participants gave most visual attention to. These images portrayed 10 individuals (5 males and 5 females), each displaying 4 basic emotions (i.e., happiness, fear, anger, and sadness) and a neutral face. We used 30 additional photos from the same actors (displaying disgust, contempt and surprise) as filler items [37]. All models in the photos were Caucasian, wore a black shirt, and had their hair pulled back. The images were shown for 5 seconds, preceded by a fixation cross that was shown for 1 second. We instructed participants to move their eyes to this fixation cross whenever it appeared.

Task 2 (Emotion Array Task).

In the second task, we compared overt visual attention to several emotions. Participants viewed 22 arrays of 4 images from the Radboud Faces Database [36]. Each array consisted of 4 images of the same actor displaying 3 emotions (anger, fear, and happiness) and a neutral face. We selected different actors (11 male, 11 female) than those used in Task 1 (Face Task). The arrays were presented for 8 seconds, and were preceded by a fixation cross for 1 second.

Task 3 (Social and Nonsocial Array Task).

In the third task, we examined preference for positive and negative, social and nonsocial images. Participants viewed 20 arrays of 4 images from the International Affective Picture System (IAPS) [38]. All arrays contained a positive social image (e.g., playing children), a negative social image (e.g., a robbery), a positive nonsocial image (e.g., a cake), and a negative nonsocial image (e.g., a dirty bucket). Images were considered social if they contained at least 2 living humans and were not sexually arousing in nature. Images were considered nonsocial if they contained no humans and no animals in social interaction (e.g., a polar bear with a cub). We selected images based on the valence and arousal norms of a previous study [38]. Images were included if their arousal ratings by females fell within 1.5 SD of the mean, and if their valence ratings by females were in the top 33.3% (positive) or bottom 33.3% (negative) [38]. Slide numbers for the included images are listed in S1 Table. Arrays were shown for 10 seconds, separated by a fixation cross shown for 1 second.
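
For illustration, this selection rule can be expressed as a simple filter over the published IAPS norms. The sketch below (Python) uses hypothetical column names ('arousal_female', 'valence_female') rather than the actual norm-file format, and is not the authors' selection script.

```python
import numpy as np
import pandas as pd

def select_iaps_images(norms: pd.DataFrame) -> pd.DataFrame:
    """Keep images with typical arousal and clearly positive or negative valence,
    following the selection rule described in the text (column names assumed)."""
    arousal_mean = norms["arousal_female"].mean()
    arousal_sd = norms["arousal_female"].std()

    # Arousal within 1.5 SD of the mean of female arousal ratings
    typical_arousal = norms["arousal_female"].between(
        arousal_mean - 1.5 * arousal_sd, arousal_mean + 1.5 * arousal_sd
    )

    # Valence in the top third (positive) or bottom third (negative) of female ratings
    lower_cut = norms["valence_female"].quantile(1 / 3)
    upper_cut = norms["valence_female"].quantile(2 / 3)
    positive = norms["valence_female"] >= upper_cut
    negative = norms["valence_female"] <= lower_cut

    selected = norms.loc[typical_arousal & (positive | negative)].copy()
    selected["valence_category"] = np.where(
        selected["valence_female"] >= upper_cut, "positive", "negative")
    return selected
```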

Task 4 (Video Task).

In the fourth task, we examined overt visual attention to social cues in dynamic images. Participants viewed 10 positive and 10 negative fragments of English-language television shows that had never been broadcast, or had not yet been broadcast at the time, in the Netherlands (e.g., Make It or Break It). Each fragment lasted between 29.71 and 32.61 seconds (M = 30.36; SD = .62) and consisted of 890 to 977 frames (M = 907.05; SD = 18.68). The video clips included sound and showed two or more actors having a conversation with positive or negative content. The fragments were selected from 53 scenes (24 positive and 29 negative) that were independently rated by three observers on valence (content, tone of voice, and facial expressions) and the absence of distracting elements (e.g., cleavage or opening credits). The most positive and negative fragments without distracting elements were selected for the study.

Apparatus

Participants’ heads were secured so that they would not move during the eye-tracking procedure, and their eye position was fixed at 50 cm from the computer screen. Stimuli were shown on a computer screen with a resolution of 1024 x 768 pixels. Participants’ eye-movements were recorded using an Iview X Hi-Speed 500/1250 eye tracker (SMI, Teltow, Germany). Measures were taken at a 500 Hz rate, by measuring the position of the pupil relative to the corneal reflection.

Eye-position samples in which eye velocity exceeded a 45°/s threshold were marked as saccades [39]. In order to include the on- and offset of a saccade, we also marked samples where eye velocity was increasing (i.e., acceleration was positive) before and after every marked region. Stable gaze intervals between the saccades that lasted longer than 100 ms defined a fixation and thus served as input for the analyses of visual attention. Saccades were only detected in order to identify fixations, and were not analyzed further. Furthermore, we did not analyze the first 150 ms following each new stimulus or camera switch (Task 4, Video Task). Earlier research showed that it takes approximately 150 ms to shift attention from one spatial location to another [40]. Because participants’ eyes were still at the location of the fixation cross when a new stimulus appeared, or at the location of a previous shot following a camera switch, we did not analyze these first 150 ms.
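
A minimal sketch of this fixation-identification logic, assuming one-dimensional gaze angles sampled at 500 Hz; the actual processing used two-dimensional gaze positions and the manufacturer's software, so the function below is an approximation for illustration, not the authors' pipeline.

```python
import numpy as np

SAMPLE_RATE_HZ = 500        # sampling rate of the eye tracker
SACCADE_VELOCITY = 45.0     # deg/s velocity threshold for saccades
MIN_FIXATION_MS = 100       # minimum stable interval to count as a fixation
SKIP_AFTER_ONSET_MS = 150   # discard this period after each stimulus onset / camera switch

def detect_fixations(gaze_deg: np.ndarray) -> list:
    """Return (start, end) sample indices of fixations in one trial.
    gaze_deg: 1-D gaze angle in degrees (a simplification of 2-D gaze data)."""
    dt = 1.0 / SAMPLE_RATE_HZ
    velocity = np.abs(np.gradient(gaze_deg, dt))     # deg/s
    acceleration = np.gradient(velocity, dt)

    saccade = velocity > SACCADE_VELOCITY
    # Crude one-sample approximation of the on-/offset extension: also mark
    # samples with rising velocity (positive acceleration) that border a
    # sample already marked as saccade.
    padded = np.concatenate(([False], saccade, [False]))
    neighbours = padded[:-2] | padded[2:]
    saccade = saccade | (neighbours & (acceleration > 0))

    min_len = int(MIN_FIXATION_MS * SAMPLE_RATE_HZ / 1000)
    skip = int(SKIP_AFTER_ONSET_MS * SAMPLE_RATE_HZ / 1000)

    fixations, start = [], None
    for i in range(skip, len(saccade)):
        if not saccade[i]:
            if start is None:
                start = i                 # start of a stable interval
        else:
            if start is not None and i - start >= min_len:
                fixations.append((start, i))
            start = None
    if start is not None and len(saccade) - start >= min_len:
        fixations.append((start, len(saccade)))
    return fixations
```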

Measures

Loneliness.

Loneliness was measured during the selection procedure and again after the eye-tracking tasks using a Dutch translation of the UCLA Loneliness Scale Version 3 [41], which consists of 20 items that measure feelings of loneliness and connectedness (e.g., “How often do you feel left out?”). Participants rated every item on a 4-point scale (1 = never to 4 = always), with higher scores reflecting stronger feelings of loneliness. Nine items were reverse coded. Reliability was high both in the selection questionnaire (α = .90) and after the eye-tracking tasks (α = .94).
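
Scoring the scale amounts to reverse coding nine items and summing; a brief sketch follows, in which the indices of the reverse-coded items are placeholders rather than the official scoring key.

```python
import numpy as np

# Hypothetical indices of the 9 reverse-coded items (not the official key)
REVERSE_ITEMS = [0, 4, 5, 8, 9, 14, 15, 18, 19]

def score_ucla(responses: np.ndarray) -> np.ndarray:
    """responses: (n_participants, 20) ratings from 1 (never) to 4 (always).
    Returns total loneliness scores; higher = lonelier (possible range 20-80)."""
    scored = responses.astype(float).copy()
    scored[:, REVERSE_ITEMS] = 5 - scored[:, REVERSE_ITEMS]   # reverse code: 1<->4, 2<->3
    return scored.sum(axis=1)
```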

Depression.

To measure depressive symptoms, a Dutch translation of the Center for Epidemiologic Studies Depression Scale (CES-D) was used [42]. The CES-D consists of 20 items that measure the frequency of depressive symptoms during the past week (e.g., “In the last week I felt that everything I did was an effort”). The items were measured on a 5-point scale ranging from 0 (rarely or none of the time = less than 1 day) to 4 (most or all of the time = 5–7 days), with higher scores reflecting more feelings of depression. Four items were reverse coded (α = .90). One item in the CES-D describes feelings of loneliness (i.e., “I felt lonely”). Because the overlap between the scale with and without this item was extremely high (r = .999, p < .001), we ran the analyses with the original CES-D scale.
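
The overlap check reported above corresponds to correlating the full CES-D total with a total that omits the loneliness item; a brief sketch (the item index is a placeholder, not the actual position of “I felt lonely” in the questionnaire).

```python
import numpy as np
from scipy.stats import pearsonr

def cesd_overlap_check(items: np.ndarray, loneliness_item: int = 13) -> float:
    """items: (n_participants, 20) CES-D item scores after reverse coding.
    Correlates the full-scale total with the total omitting the loneliness item."""
    full_scale = items.sum(axis=1)
    reduced_scale = np.delete(items, loneliness_item, axis=1).sum(axis=1)
    r, _p = pearsonr(full_scale, reduced_scale)
    return r
```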

Social anxiety.

Social anxiety was measured using the Social Phobia Inventory (SPIN) [43]. The SPIN consists of 17 items that measure characteristics of social anxiety consisting of fear, avoidance, and physical reactions (e.g., “I avoid talking to people I don’t know”). Participants indicated how often they felt or behaved in a certain way during the past week on a scale ranging from 0 (not at all) to 4 (extremely), with higher scores reflecting more feelings of social anxiety (α = .92).

Eye-tracking measures.

In Task 1 (Face Task), areas of interest were the eye region of the face (including eyebrows), the nose, and the mouth, because these areas convey most social information [32]. In Task 2 (Emotion Array Task) and Task 3 (Social and Nonsocial Array Task), each image of each quadrant was an area of interest (AOI; e.g., in Task 2 (Emotion Array Task), the AOI “happy” was the entire image showing an actor with a happy expression). In Task 4 (Video Task), areas of interest were determined for each frame (18,141 frames in total). We used a bounding box around the facial area as an area of interest. Gazing behavior was only identified for frames in which an area of interest was visible.

We calculated two eye-tracking measures. Total fixation duration is the total time, in ms, that a participant gazed at a specific AOI of a stimulus. Number of fixations is the number of times a participant fixated on a specific AOI of a single stimulus. In Task 1 (Face Task), we calculated both eye-tracking measures separately for each emotional expression and aggregated them across trials. Eye-tracking measures were corrected for the total size of each area of interest (e.g., the eye region was larger than the nose region; all measures reported are corrected for these differences). In Tasks 2 (Emotion Array Task) and 3 (Social and Nonsocial Array Task), measures were aggregated across trials for each type of image (i.e., happy, fearful, angry, and neutral for Task 2, and positive-social, negative-social, positive-nonsocial, and negative-nonsocial for Task 3). We also calculated first fixation for these tasks, indicating the percentage of trials in which participants’ first fixation was on each of the image types. In Task 4 (Video Task), we aggregated fixations across all positive fragments and across all negative fragments.
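
These measures can be computed from a table of detected fixations roughly as follows; the column names and the per-pixel area correction are assumptions for illustration, not the authors' actual processing code.

```python
import pandas as pd

def gaze_measures(fix: pd.DataFrame, aoi_area_px: dict) -> pd.DataFrame:
    """fix: one row per detected fixation with (assumed) columns
    ['participant', 'trial', 'aoi', 'onset_ms', 'duration_ms']."""
    # Total fixation duration and number of fixations per participant x trial x AOI
    per_trial = (fix.groupby(["participant", "trial", "aoi"])
                    .agg(duration_ms=("duration_ms", "sum"),
                         n_fixations=("duration_ms", "size"))
                    .reset_index())

    # Correct gaze duration for AOI size (e.g., the eye region is larger than the nose region)
    per_trial["duration_corrected"] = per_trial["duration_ms"] / per_trial["aoi"].map(aoi_area_px)

    # Aggregate across trials for each participant and AOI
    return (per_trial.drop(columns="trial")
                     .groupby(["participant", "aoi"])
                     .mean(numeric_only=True)
                     .reset_index())

def first_fixation_pct(fix: pd.DataFrame) -> pd.Series:
    """Percentage of trials in which the first fixation fell on each AOI."""
    first = (fix.sort_values("onset_ms")
                .groupby(["participant", "trial"], as_index=False)
                .first())
    counts = first.groupby(["participant", "aoi"]).size()
    return counts / counts.groupby("participant").transform("sum") * 100
```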

Analyses

We used a Repeated Measures ANOVA design to analyze all tasks. We analyzed each task separately, with stimulus (e.g., emotion) as a within-subject factor and loneliness as a between-subject factor. Besides descriptive statistics, results for covariates (depressive symptoms and social anxiety symptoms, both standardized) are included in S2 Table. Results were similar when we did not include these covariates, with the one exception reported in the text. In addition, this table shows the error variances for all measures.

If Mauchly’s test of sphericity yielded a significant result, the results are reported with Greenhouse-Geisser correction for ε < .75, and with Huynh-Feldt correction for ε ≥ .75 (cf. Field, 2009). For all significant main and interaction effects, we rank ordered factors according to their means and used repeated contrasts to establish which elements differed. For instance, in Task 1 (Face Task) we rank ordered emotions by gaze duration, and contrasted the emotion that was gazed at most often to the emotion that was gazed at second most often and so forth [44].
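
A sketch of the core within-by-between model for one task, using the pingouin package with a single within-subject factor (a simplification for Task 1, which crossed Emotion and AOI); the depression and social anxiety covariates and the ε-based choice between Greenhouse-Geisser and Huynh-Feldt corrections are not reproduced here, and this is not the authors' SPSS procedure.

```python
import pingouin as pg

def analyze_task(df):
    """df: long format with (assumed) columns
    ['participant', 'group', 'stimulus', 'duration_ms']."""
    # Mauchly's test of sphericity for the within-subject factor
    spher = pg.sphericity(df, dv="duration_ms", within="stimulus", subject="participant")

    # Mixed (within x between) ANOVA; correction=True requests sphericity-corrected p-values
    aov = pg.mixed_anova(data=df, dv="duration_ms", within="stimulus",
                         subject="participant", between="group", correction=True)
    return spher, aov
```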

Results

Descriptive Statistics

Tables 1 and 2 depict means and standard deviations for number of fixations and fixation duration for lonely and nonlonely participants in each task. T-tests showed that lonely participants had higher social anxiety scores (t(45) = 4.54, p < .001) and higher depression scores (t(45) = 5.07, p < .001) than nonlonely participants. We therefore controlled for depression and social anxiety in all further analyses by adding them as covariates. The loneliness scores that were collected after the eye-tracking tasks correlated highly with the loneliness scores measured for selection purposes (r = .93, p < .001), indicating that rank-order stability of loneliness across the 2-month period was relatively high. Loneliness measured after the eye-tracking tasks differed significantly between lonely and non-lonely individuals (t(34.47) = 17.44, p < .001). Means and SDs for first fixations in Task 2 (Emotion Array Task) and Task 3 (Social and Nonsocial Array Task) are included in S3 Table.
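
For illustration, these group comparisons and the stability check correspond to the following computations (column names are assumptions; the Welch correction for the loneliness comparison is inferred from the fractional degrees of freedom reported).

```python
import pandas as pd
from scipy.stats import pearsonr, ttest_ind

def group_checks(df: pd.DataFrame) -> dict:
    """df: one row per participant with (assumed) columns 'group' ('lonely'/'nonlonely'),
    'social_anxiety', 'depression', 'loneliness_pre', 'loneliness_post'."""
    lonely = df[df["group"] == "lonely"]
    nonlonely = df[df["group"] == "nonlonely"]
    return {
        "anxiety_t": ttest_ind(lonely["social_anxiety"], nonlonely["social_anxiety"]),
        "depression_t": ttest_ind(lonely["depression"], nonlonely["depression"]),
        # Welch's test (unequal variances), consistent with the fractional df reported
        "loneliness_t": ttest_ind(lonely["loneliness_post"], nonlonely["loneliness_post"],
                                  equal_var=False),
        # Rank-order stability of loneliness across the ~2-month interval
        "stability_r": pearsonr(df["loneliness_pre"], df["loneliness_post"]),
    }
```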

Table 1. Means and Standard Deviations for Gaze Duration in ms and Number of Fixations for Total Sample, Lonely and Nonlonely Participants for Task 1.

https://doi.org/10.1371/journal.pone.0125141.t001

Table 2. Means and Standard Deviations for Gaze Duration in ms and Number of Fixations for Total Sample, Lonely and Nonlonely Participants for Tasks 2, 3 and 4.

https://doi.org/10.1371/journal.pone.0125141.t002

Task 1 (Face Task)

Table 3 shows the results of a 5 (Within Subjects; Emotion; anger, fear, happy, sad, and neutral) by 3 (Within Subjects; AOI; mouth, nose, eyes) by 2 (Between Subjects; lonely vs. nonlonely) Repeated Measures ANOVA. This analysis was performed to examine whether gazing behavior differed between emotions, AOIs, and loneliness groups. Results revealed similar findings for gaze duration and number of fixations. Participants gazed longer and more often at some AOIs than at others (eyes > nose > mouth) (see S1 Fig). Differences in gazing behavior were also found between emotions (anger > [sad = neutral = happy] > fear). The AOI by Emotion interaction showed that differences in gazing behavior between areas of interest were not equal across emotions. The difference in gazing behavior between the eyes and the nose was larger for angry faces than for the other emotions (anger > [sad = neutral = happy = fear]). Differences in gazing behavior between nose and mouth were equal across all emotions. There were no main effects or interaction effects for loneliness group. This indicates that, although there were differences in gazing behavior between emotions and AOIs for the entire sample, gazing behavior did not differ between lonely and nonlonely participants. Fig 1 shows an example of gazing behavior at four emotions. Results were similar when we included the filler items (i.e., contempt, disgust, and surprise) in the analyses.

Table 3. Repeated Measures ANOVA Results for Gaze Duration in ms and Number of Fixations for All Tasks.

https://doi.org/10.1371/journal.pone.0125141.t003

Fig 1. Heatmaps for gazing behavior across all participants for four emotions.

Example heatmap for gazing behavior at different emotional expressions, namely (A) Angry, (B) Fearful, (C) Happy, and (D) Sad.

https://doi.org/10.1371/journal.pone.0125141.g001

Task 2 (Emotion Array Task)

Table 3 shows the results of a 4 (Within Subjects; anger, fear, happiness, and neutral) by 2 (Between Subjects; lonely vs. nonlonely) Repeated Measures ANOVA. Repeated contrasts showed that participants’ gaze duration differed between emotions (happiness > [fear = neutral] > anger) (see S2 Fig). For the number of fixations, we also found differences between emotions ([happiness = fear] > neutral > anger). Again, there were no main effects or interaction effects for loneliness, indicating that lonely and nonlonely participants showed similar gazing behavior at the different emotions in terms of both gaze duration and number of fixations. In addition, we compared the number of times participants had their first fixation on each of the emotions (see Table 4). Results showed a small but significant effect, indicating that participants were less likely to fixate first on happy faces than on all other faces ([fear = neutral = angry] > happiness). There were no differences between groups, indicating that lonely and non-lonely participants had similar first fixations.

Table 4. Repeated Measures ANOVA Results for Percentage of First Fixations for Task 2 (Emotional Array Task) and 3 (Social and Nonsocial Array Task).

https://doi.org/10.1371/journal.pone.0125141.t004

Task 3 (Social and Nonsocial Array Task)

Table 3 displays the results of a 4 (Within Subjects; Image; positive-social, positive-nonsocial, negative-social, and negative-nonsocial) by 2 (Between Subjects; lonely vs. nonlonely) Repeated Measures ANOVA. Results indicate that participants showed different gazing behavior at positive, negative, social, and nonsocial images ([positive social = negative social] > positive nonsocial > negative nonsocial), for both gaze duration and number of fixations (see S3 Fig). Main effects and interaction effects for loneliness did not reach significance, indicating that lonely and nonlonely participants had similar gazing patterns for the different types of images. Additionally, we looked at differences between lonely and non-lonely participants in first fixations on each of the images (see Table 4). Results showed a main effect for type of image (positive social > negative social > positive nonsocial > negative nonsocial). No differences occurred between lonely and non-lonely participants. Thus, participants’ first fixations in both the lonely and nonlonely group were most often on the positive social images.

Task 4 (Video Task)

Table 3 shows the results of a 2 (Within Subjects; positive vs. negative valence) by 2 (Between Subjects; lonely vs. nonlonely) Repeated Measures ANOVA. Results indicate that there was a main effect of valence for both gaze duration and number of fixations (see S4 Fig). Participants gazed longer and more often at AOIs within negative video clips than within positive video clips. In addition, there was a Loneliness by Valence interaction for the number of fixations. As both loneliness and valence had only two levels, we could not use post-hoc comparisons to establish whether the difference in valence held for both groups. Therefore, we ran a Repeated Measures ANOVA with two levels (Valence; positive, negative) for the two loneliness groups separately. Results indicated that participants fixated on the AOIs of the negative video clips more often than on those of the positive video clips in both the lonely (F(1,21) = 8.82, p = .007, part. η2 = .30) and the nonlonely group (F(1,21) = 24.74, p < .001, part. η2 = .54), but this effect was stronger for the nonlonely group. Thus, all participants tended to fixate more often on the facial area of actors in video clips with negative content than in video clips with positive content, and this difference between positive and negative video clips was stronger for nonlonely than for lonely participants. When we did not control for depression and social anxiety, we no longer found a difference between lonely and nonlonely participants.
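
The follow-up decomposition of this interaction corresponds to running a one-way repeated measures ANOVA on valence within each loneliness group; a sketch with the pingouin package follows (column names are assumptions).

```python
import pingouin as pg

def valence_effect_per_group(df):
    """df: long format with (assumed) columns
    ['participant', 'group', 'valence', 'n_fixations']."""
    results = {}
    for group, sub in df.groupby("group"):
        # One-way repeated-measures ANOVA (valence: positive vs. negative) within each group
        results[group] = pg.rm_anova(data=sub, dv="n_fixations",
                                     within="valence", subject="participant")
    return results
```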

Curve Estimations

To examine whether loneliness may only be related to overt visual attention to social cues in the top range of loneliness scores, we used quadratic regression [5]. Within the lonely group, we predicted gazing duration and number of fixations from loneliness scores. None of the quadratic effects were significant, indicating that there was also no relation between loneliness and overt visual attention to social cues for extremely lonely participants.
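
A sketch of this curve estimation, regressing a gaze measure on loneliness and its square within the lonely group (statsmodels assumed; this is not the authors' SPSS Curve Estimation procedure).

```python
import numpy as np
import statsmodels.api as sm

def quadratic_curve_estimation(loneliness: np.ndarray, gaze_measure: np.ndarray):
    """Regress a gaze measure on loneliness and loneliness squared (lonely group only)."""
    X = sm.add_constant(np.column_stack([loneliness, loneliness ** 2]))
    model = sm.OLS(gaze_measure, X).fit()
    # model.pvalues[2] is the p-value of the quadratic term
    return model.params, model.pvalues
```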

Discussion

The goal of the present study was to examine whether lonely individuals show signs of hypervigilance for negative social information. We measured differences between lonely and nonlonely individuals in their overt visual attention to social cues. Results indicated that lonely participants did not differ from non-lonely participants in their overt visual attention to still images. Both lonely and nonlonely participants gazed longer and more often at the eye region of the face than at the mouth and nose regions (Task 1, Face Task), especially in angry faces. When images of actors expressing happiness, fear, anger, and a neutral expression were presented simultaneously, participants gazed longer and more often at happy images than at other images, and gazed least at the angry images (Task 2, Emotion Array Task). In the third task (Social and Nonsocial Array Task), we found that participants looked longer and more often at social images than at nonsocial images, and longer and more often at positive images than at negative images. When participants viewed video clips of positive and negative conversations (Task 4, Video Task), they fixated longer and more often on the facial area of the actors in negative video clips than in positive video clips. For the number of fixations, this effect was stronger for non-lonely participants than for lonely participants.

These findings seem to imply that lonely individuals do not differ from nonlonely individuals in one of the first steps of processing social cues, namely overt visual attention to social cues. The results of the present study are in contrast with findings in social anxiety research, which provide broad evidence that socially anxious individuals are hypervigilant for social threat. A review article revealed that, with very few exceptions, social anxiety is related to hypervigilance for socially threatening information (specifically facial expressions) in terms of heightened emotion recognition, interpretation, memory, and overt and covert attention to threatening information, across a variety of stimuli and paradigms [45]. In social anxiety, evidence has been found both for an overall hypervigilance for socially threatening information and for initial hypervigilance followed by avoidance of these cues [46]. In the two studies that have been conducted on the relation between loneliness and visual attention to socially threatening information, both patterns have been found. Lonely children seemed to show difficulty disengaging from socially threatening information, whereas lonely adults seemed to show initial hypervigilance followed by avoidance of socially threatening cues [5,7]. In the current study, we did not find any link between loneliness and visual attention to social cues, either for positive or negative cues, across a variety of tasks. This could indicate that the link with overt attention to social cues is less clear in loneliness than in social anxiety.

This could indicate that biased processing of social information in lonely individuals emerges in later stages of social information processing, as seems to be the case in, for instance, depression and borderline personality disorder [25]. For instance, threatening images may be prioritized in encoding irrespective of the visual attention that is given to these cues [47]. In line with this, earlier research showed that lonely individuals seem to have greater activation of the visual cortex in response to negative social images compared to negative images of objects, which indicates that lonely individuals may be more sensitive to these negative social cues [48]. Thus, although lonely individuals did not seem to give more overt attention to negative social cues in the present study, their covert attention to these cues may still differ. Processing of these cues might be prioritized over other cues, and memory for these cues may be increased. Regarding our research, lonely participants may have interpreted the cues we presented to them more negatively than the nonlonely participants did, may have had biased memory for these cues, or may have shown a different behavioral response to these cues. Indeed, earlier research shows ample evidence for differences such as these in later stages of social information processing, such as a hostile attribution bias, withdrawn behavior, and increased memory for social cues [4,7,49].

Thus, one possible explanation of our findings is that lonely individuals indeed do not show hypervigilance to social cues, but show only biases in other stages of information processing. Alternatively, lonely individuals may in fact be hypervigilant to social cues, but only under specific conditions. First, lonely individuals may only show increased overt attention to specific social cues that convey rejection. Indeed, earlier research indicated that for different forms of psychopathology, different types of stimuli may be relevant [50]. In the few other studies that have looked at the relation between loneliness and overt visual attention to social cues [5,7], the negative cues that were used were videotaped in schoolyards and depicted clear situations of rejection. In contrast, in our study we used a broad body of stimuli that included general facial expressions (Task 1—Face Task and 2—Emotion Array Task), negative social images such as violent behaviors (Task 3—Social and Nonsocial Array Task) and sad or angry conversations (Task 4—Video Task). Hence, these negative cues did not explicitly depict situations of rejection. Possibly, loneliness is only related to hypervigilance towards social threat in terms of rejection.

Second, lonely individuals may only show hypervigilance for negative social cues in situations and towards stimuli in which actual rejection or acceptance is at stake. Earlier research on social anxiety indicated that especially socially threatening information that was self-relevant seemed to activate biased processing of these cues [50]. Indeed, the stimulus material used by Qualter et al. [7] and Bangee et al. [5] showed scenes from a playground. These video clips may have been more ecologically valid than the images and videos used in the present study. Participants may have been more engaged with these video clips, leading to higher identification with the children displayed in them. Thus, there may in fact be a difference between lonely and nonlonely individuals in their overt visual attention to social cues that we were unable to detect, because our tasks may not have been socially relevant enough to trigger hypervigilance for negative social cues. Some researchers argue that the study of social gaze behavior could benefit from designs using active participation in social interaction, as processing of more life-like social information could differ from processing of information presented in a lab environment [51]. Future research is needed to examine whether or not loneliness is related to differential overt visual attention to social cues in real-life interactions.

Third, loneliness may only be related to hypervigilance to negative social information for certain lonely individuals. For instance, it is possible that loneliness is only related to hypervigilance for negative social cues in certain age groups whose social and emotional skills are still developing. Earlier research showed differences in patterns of overt visual attention to rejection cues between lonely children and older adolescents. Biases in overt visual attention to social cues may only arise in age groups that were not included in the present study, such as children or older adolescents [5,7]. In addition, hypervigilance to negative social information may only arise in certain types of lonely individuals. For instance, possibly everyone who experiences some threat to the belonging regulation system shows increased attention to social cues in general, but this may shift to hypervigilance for negative social cues only for chronically lonely individuals [3,4,16,52]. In the present study, we only included young adults, and due to the correlational nature of our data, we were unable to distinguish temporarily lonely from chronically lonely individuals. Future research could explore the possibility that only certain types of lonely individuals may be hypervigilant for negative social cues.

One of the strong points of the present study is the use of four different eye-tracking tasks, which allowed us to do a within-sample replication of our finding that loneliness does not seem to be related to overt visual attention to social cues. The overall main effects (i.e., gazing behavior of the entire sample) were in the direction expected for a healthy population. For instance, in accordance with earlier research, we found the most gazing at the eye region of the face in Task 1 (Face Task), and a preference for social images over nonsocial images in Task 3 (Social and Nonsocial Array Task) [53,54]. Thus, we think that our tasks were well designed and suitable to detect differences in overt attention to negative social cues. Of course, because we only used one sample for all four tasks, there is still a possibility that our findings are specific to our sample, although we have no reason to assume that our sample of lonely individuals was not comparable to other samples.

One of the limitations of the present study is that we used only highly educated female participants. Although we have no reason to assume that visual attention to social cues may be different for males, or individuals from other educational backgrounds, the sample in the present study may not be generalizable to the general population. Therefore, future research should explore the relation between loneliness and visual attention in different samples such as males or individuals from a lower educational background. Another limitation of the present study is the use of non-lonely participants as a contrast group. Research on loneliness thus far has been aimed at the most lonely people. As a result, we cannot draw conclusions about possible differences between people who experience average, high or low feelings of loneliness. Future research might address the issue of whether nonlonely participants may show certain biases as well.

Future research should explore the possibility that differences in processing of social information may occur in later stages of social information processing. Moreover, in a real-life interaction task, lonely and nonlonely participants could be exposed to a confederate displaying negative or positive social cues. In such an experiment, behavioral responses of lonely and nonlonely participants could be measured. Future research could also benefit from combining an eye-tracking approach with an EEG approach. By combining these approaches, future research could reveal differences in overt and covert processing of social information by lonely and nonlonely individuals. Additionally, future research could examine whether loneliness might be related to hypervigilance to negative social cues, but only under specific conditions.

Conclusion

All in all, we did not find evidence that lonely individuals are hypervigilant for negative social cues. Both lonely and nonlonely individuals tend to gaze longer and more often at areas that convey most social information, such as the facial area and specifically the eye region. This could indicate that biased processing of social information in lonely individuals emerges in a later stage of social information processing. Alternatively, hypervigilance for negative social cues may only become apparent for specific cues (e.g., rejection cues), in specific situations (e.g., situations in which actual rejection or acceptance is at stake), or for certain lonely individuals (e.g., chronically lonely individuals). Notwithstanding alternative explanations, based on our research, we have no evidence for a relation between loneliness and hypervigilance for negative social cues.

Supporting Information

S1 Table. Slide Numbers for IAPS Images Task 3.

https://doi.org/10.1371/journal.pone.0125141.s001

(DOCX)

S2 Table. Mixed Model ANOVA Results for Gaze Duration in ms and Number of Fixations for All Tasks Including Covariates and Errors.

https://doi.org/10.1371/journal.pone.0125141.s002

(DOCX)

S3 Table. Means and Standard Deviations for Percentage of First Fixations for Total Sample, Lonely and Nonlonely Participants for Task 2 (Emotional Array Task) and Task 3.

https://doi.org/10.1371/journal.pone.0125141.s003

(DOCX)

S1 Fig. Corrected Total Fixation Duration at Eyes, Nose and Mouth in Task 1 (Face Task) for Lonely and Nonlonely Participants.

https://doi.org/10.1371/journal.pone.0125141.s004

(TIF)

S2 Fig. Total Fixation Duration at Emotions in Task 2 (Emotion Array Task) for Lonely and Nonlonely Participants.

https://doi.org/10.1371/journal.pone.0125141.s005

(TIF)

S3 Fig. Total Fixation Duration at Different Images in Task 3 (Social and Nonsocial Array Task) for Lonely and Nonlonely Participants.

https://doi.org/10.1371/journal.pone.0125141.s006

(TIF)

S4 Fig. Total Fixation Duration at AOIs in Task 4 (Video Task) for Lonely and Nonlonely Participants.

https://doi.org/10.1371/journal.pone.0125141.s007

(TIF)

Acknowledgments

We thank Hubert Voogd, Technical Support Group, Radboud University Nijmegen, The Netherlands, for his help in programming the eye-tracking tasks, and setting up the eye-tracking equipment.

Author Contributions

Conceived and designed the experiments: GL RS RE LG MV. Performed the experiments: GL. Analyzed the data: GL IC MV. Contributed reagents/materials/analysis tools: GL IC. Wrote the paper: GL RS IC RE LG MV. Designed software used in analysis: IC.

References

  1. Perlman D, Peplau LA. Toward a social psychology of loneliness. In: Gillmour R, Duck S, editors. Personal relationships 3: Personal relationships in disorder. London: Academic Press; 1991. pp. 31–56.
  2. Hawkley LC, Cacioppo JT. Loneliness matters: A theoretical and empirical review of consequences and mechanisms. Behav Med. 2010;40: 218–227. pmid:20652462
  3. Cacioppo JT, Norris CJ, Decety J, Monteleone G, Nusbaum H. In the eye of the beholder: Individual differences in perceived social isolation predict regional brain activation to social stimuli. J Cognit Neurosci. 2009;21: 83–92.
  4. Gardner WL, Pickett CL, Jefferis V, Knowles M. On the outside looking in: Loneliness and social monitoring. Pers Soc Psychol Bull. 2005;31: 1549–1560. pmid:16207773
  5. Bangee M, Harris RA, Bridges N, Rotenberg KJ, Qualter P. Loneliness and attention to social threat in young adults: Findings from an eye tracker study. Pers Individ Dif. 2014;63: 16–23.
  6. Mogg K, Bradley BP. A cognitive-motivational analysis of anxiety. Behav Res Ther. 1998;36: 809–848. pmid:9701859
  7. Qualter P, Rotenberg K, Barrett L, Henzi P, Barlow A, Stylianou M, et al. Investigating hypervigilance for social threat of lonely children. J Abnorm Child Psychol. 2013;41: 325–338. pmid:22956297
  8. Bar-Haim Y, Lamy D, Pergamin L, Bakermans-Kranenburg MJ, van Ijzendoorn MH. Threat-related attentional bias in anxious and nonanxious individuals: A meta-analytic study. Psychol Bull. 2007;133: 1–24.
  9. Bodenhausen GV, Hugenberg K. Attention, perception, and social cognition. In: Strack F, Förster J, editors. Social cognition: The basis of human interaction. Philadelphia: Psychology Press; 2009. pp. 1–22.
  10. Amir N, Beard C, Burns M, Bomyea J. Attention modification program in individuals with generalized anxiety disorder. J Abnorm Psychol. 2009;118: 28–33.
  11. Cacioppo JT, Hawkley LC, Ernst JM, Burleson M, Berntson GG, Nouriani B, et al. Loneliness within a nomological net: An evolutionary perspective. J Res Pers. 2006;40: 1054–1085.
  12. Weiss RS. Loneliness: The experience of emotional and social isolation. Cambridge, MA: MIT Press; 1979.
  13. Baumeister RF, Leary MR. The need to belong: Desire for interpersonal attachments as a fundamental human motivation. Psychol Bull. 1995;117: 497–529.
  14. Maner JK, DeWall CN, Baumeister RF, Schaller M. Does social exclusion motivate interpersonal reconnection? Resolving the "porcupine problem". J Pers Soc Psychol. 2007;92: 42–55.
  15. Gardner WL, Pickett CL, Brewer MB. Social exclusion and selective memory: How the need to belong influences memory for social events. Pers Soc Psychol Bull. 2000;26: 486–496.
  16. Pickett CL, Gardner WL, Knowles M. Getting a cue: The need to belong and enhanced sensitivity to social cues. Pers Soc Psychol Bull. 2004;30: 1095–1107. pmid:15359014
  17. Christensen PN, Kashy DA. Perceptions of and by lonely people in initial social interaction. Pers Soc Psychol Bull. 1998;24: 322–329.
  18. Jones WH, Sansone C, Helm B. Loneliness and interpersonal judgments. Pers Soc Psychol Bull. 1983;9: 437–441.
  19. Hanley-Dunn P, Maxwell SE, Santos JF. Interpretation of interpersonal interactions: The influence of loneliness. Pers Soc Psychol Bull. 1985;11: 445–456.
  20. Wittenberg MT, Reis HT. Loneliness, social skills and social perception. Pers Soc Psychol Bull. 1986;12: 121–130.
  21. Dolan RJ, Vuilleumier P. Amygdala automaticity in emotional processing. In: Shinnick-Gallagher P, Pitkanen A, Shekhar A, Cahill L, editors. Amygdala in Brain Function: Basic and Clinical Approaches; 2003. pp. 348–355.
  22. Friesen CK, Kingstone A. The eyes have it! Reflexive orienting is triggered by nonpredictive gaze. Psychon Bull Rev. 1998;5: 490–495.
  23. Horowitz TS, Fine EM, Fencsik DE, Yurgenson S, Wolfe JM. Fixational eye movements are not an index of covert attention. Psychol Sci. 2007;18: 356–363. pmid:17470262
  24. Steinmetz KRM, Kensinger EA. The emotion-induced memory trade-off: More than an effect of overt attention? Mem Cognit. 2013;41: 69–81.
  25. Von Ceumern-Lindenstjerna IA, Brunner R, Parzer P, Mundt C, Fiedler P, Resch F. Attentional bias in later stages of emotional information processing in female adolescents with borderline personality disorder. Psychopathology. 2010;43: 25–32. pmid:19893341
  26. Horsley TA, Castro BO, Schoot M. In the eye of the beholder: Eye-tracking assessment of social information processing in aggressive behavior. J Abnorm Child Psychol. 2009;38: 587–599.
  27. Ekman P. Facial expression and emotion. Am Psychol. 1993;48: 384–392.
  28. Keltner D, Kring AM. Emotion, social function, and psychopathology. Rev Gen Psychol. 1998;2: 320–342.
  29. Klein JT, Shepherd SV, Platt ML. Social attention and the brain. Curr Biol. 2009;19: R958–R962. pmid:19889376
  30. Itier RJ, Batty M. Neural bases of eye and gaze processing: The core of social cognition. Neurosci Biobehav Rev. 2009;33: 843–863. pmid:19428496
  31. Walker-Smith GJ, Gale AG, Findlay JM. Eye-movement strategies involved in face perception. Perception. 1977;6: 313–326. pmid:866088
  32. Ekman P, Friesen WV. Unmasking the face. A guide to recognizing emotions from facial clues. Oxford: Prentice-Hall; 1975. pmid:14757335
  33. Staugaard SR. Threatening faces and social anxiety: A literature review. Clin Psychol Rev. 2010;30: 669–690. pmid:20554362
  34. Craig AA, Harvey RJ. Brief report: Discriminating between problems in living: An examination of measures of depression, loneliness, shyness, and social anxiety. J Soc Clin Psychol. 1988;6: 482–491.
  35. Heinrich L, Gullone E. The clinical significance of loneliness: A literature review. Clin Psychol Rev. 2006;26: 695–718. pmid:16952717
  36. Langner O, Dotsch R, Bijlstra G, Wigboldus DHJ, Hawk ST, Van Knippenberg A. Presentation and validation of the Radboud Faces Database. Cogn Emot. 2010;24: 1377–1388.
  37. Beevers CG, Lee HJ, Wells TT, Ellis AJ, Telch MJ. Association of predeployment gaze bias for emotion stimuli with later symptoms of PTSD and depression in soldiers deployed in Iraq. Am J Psychiatry. 2011;168: 735–741. pmid:21454916
  38. Lang PJ, Bradley MM, Cuthbert BN. International affective picture system (IAPS): Affective ratings of pictures and instruction manual. Technical Report A-8. University of Florida, Gainesville, FL; 2008.
  39. Smeets JBJ, Hooge ITC. Nature of variability in saccades. J Neurophysiol. 2003;90: 12–20. pmid:12611965
  40. Theeuwes J, Godijn R. Irrelevant singletons capture attention: Evidence from inhibition of return. Percept Psychophys. 2002;64: 764–770. pmid:12201335
  41. Russell DW. UCLA Loneliness Scale (Version 3): Reliability, validity, and factor structure. J Pers Assess. 1996;66: 20–40.
  42. Radloff LS. The CES-D scale: A self-report depression scale for research in the general population. Appl Psychol Meas. 1977;1: 385–401.
  43. Connor KM, Davidson JRT, Erik Churchill L, Sherwood A, Foa E, Weisler RH. Psychometric properties of the Social Phobia Inventory (SPIN). New self-rating scale. Br J Psychiatry. 2000;176: 379–386. pmid:10827888
  44. Hall J, Hutton S, Morgan M. Sex differences in scanning faces: Does attention to the eyes explain female superiority in facial expression recognition? Cogn Emot. 2010;24: 629–637.
  45. Machado-de-Sousa JP, Arrais KC, Alves NT, Chagas MHN, de Meneses-Gaya C, Crippa JAD, et al. Facial affect processing in social anxiety: Tasks and stimuli. J Neurosci Methods. 2010;193: 1–6. pmid:20800619
  46. Buckner JD, DeWall CN, Schmidt NB, Maner JK. A tale of two threats: Social anxiety and attention to social threat as a function of social exclusion and non-exclusion threats. Cognit Ther Res. 2009;34: 449–455.
  47. Anderson AK, Christoff K, Panitz D, De Rosa E, Gabrieli JDE. Neural correlates of the automatic processing of threat facial signals. J Neurosci. 2003;23: 5627–5633. pmid:12843265
  48. Cacioppo JT, Hawkley LC. Perceived social isolation and cognition. Trends Cogn Sci. 2009;13: 447–454. pmid:19726219
  49. Watson J, Nesdale D. Rejection sensitivity, social withdrawal, and loneliness in young adults. J Appl Soc Psychol. 2012;42: 1984–2005.
  50. Gotlib IH, Kasch KL, Traill S, Joormann J, Arnow BA, Johnson SL. Coherence and specificity of information-processing biases in depression and social phobia. J Abnorm Psychol. 2004;113: 386–398.
  51. Pfeiffer UJ, Vogeley K, Schilbach L. From gaze cueing to dual eye-tracking: Novel approaches to investigate the neural correlates of gaze in social interaction. Neurosci Biobehav Rev. 2013;37: 2516–2528. pmid:23928088
  52. Pickett CL, Gardner WL. The social monitoring system: Enhanced sensitivity to social cues and information as an adaptive response to social exclusion and belonging need. In: Williams KD, Forgas JP, Von Hippel W, editors. The social outcast: Ostracism, social exclusion, rejection, and bullying. New York: Psychology Press; 2005. pp. 213–226.
  53. Murphy NA, Isaacowitz DM. Preferences for emotional information in older and younger adults: A meta-analysis of memory and attention tasks. Psychol Aging. 2008;23: 263–286. pmid:18573002
  54. Klin A, Jones W, Schultz R, Volkmar F, Cohen D. Visual fixation patterns during viewing of naturalistic social situations as predictors of social competence in individuals with autism. Arch Gen Psychiatry. 2002;59: 809–816. pmid:12215080