PLOS ONE: search results for author "Astrid M L Kappers", sorted newest first. All PLOS articles are Open Access. Feed retrieved 2024-03-28.

Individual differences in cognitive processing for roughness rating of fine and coarse textures
Makiko Natsume, Yoshihiro Tanaka, Astrid M. L. Kappers
DOI: 10.1371/journal.pone.0211407 | Published: 2019-01-30
Previous studies have demonstrated that skin vibration is an important factor in the roughness perception of fine textures. For coarse textures, the determining physical factor is much less clear, and there are indications that it might be participant-dependent. In this paper, we focused on roughness perception of both coarse and fine textures of different materials (glass particle surfaces and sandpapers). We investigated the relationship between subjective roughness ratings and three physical parameters (skin vibration, friction coefficient, and particle size) in a group of 30 participants. For the glass particle surfaces, both spatial information (particle size) and temporal information (skin vibration) correlated highly with subjective roughness ratings; the former correlation was slightly but significantly higher than the latter. The results also indicated that participants weighted temporal and spatial information differently in their roughness ratings. Roughness ratings of one material (sandpaper) could be larger than, similar to, or smaller than those of the other (glass particles), again indicating differences among individuals. Our results are best described by assuming that, in their perceptual evaluation of roughness, different individuals weight temporal information, spatial information, and other mechanical properties differently.

Correcting for Visuo-Haptic Biases in 3D Haptic Guidance
Femke E. van Beek, Irene A. Kuling, Eli Brenner, Wouter M. Bergmann Tiest, Astrid M. L. Kappers
DOI: 10.1371/journal.pone.0158709 | Published: 2016-07-20
Visuo-haptic biases are observed when bringing your unseen hand to a visual target. These biases differ between participants but are consistent within participants. We investigated the usefulness of adjusting haptic guidance to such user-specific biases in order to align haptic and visual perception and thereby reduce the conflict between the modalities. We first measured the biases using an adaptive procedure. Next, we measured performance in a pointing task using three conditions: 1) visual images adjusted to user-specific biases, without haptic guidance, 2) veridical visual images combined with haptic guidance, and 3) shifted visual images combined with haptic guidance. Adding haptic guidance increased precision. Combining haptic guidance with user-specific visual information yielded the highest accuracy and the lowest level of conflict with the guidance at the end point. These results show the potential of correcting for user-specific perceptual biases when designing haptic guidance.

Haptic Exploratory Behavior During Object Discrimination: A Novel Automatic Annotation Method
Sander E. M. Jansen, Wouter M. Bergmann Tiest, Astrid M. L. Kappers
DOI: 10.1371/journal.pone.0117017 | Published: 2015-02-06
To acquire information about the geometry and material of handheld objects, people tend to execute stereotypical hand movement patterns called haptic Exploratory Procedures (EPs). Manual annotation of haptic exploration trials with these EPs is a laborious task that is affected by subjectivity, attentional lapses, and viewing angle limitations. In this paper we propose an automatic EP annotation method based on position and orientation data from motion tracking sensors placed on both hands and inside a stimulus. A set of kinematic variables is computed from these data and compared to sets of predefined criteria for each of four EPs. Whenever all criteria for a specific EP are met, that particular hand movement pattern is assumed to have been performed. The method is applied to data from an experiment in which blindfolded participants haptically discriminated between objects differing in hardness, roughness, volume, and weight. To validate the method, its output is compared to manual annotation based on video recordings of the same trials. Although mean pairwise agreement is lower between human-automatic pairs than between human-human pairs (55.7% vs. 74.5%), the proposed method performs much better than random annotation (2.4%). Furthermore, each EP is linked to a specific object property for which it is optimal (e.g., Lateral Motion for roughness). We found that the percentage of trials in which the expected EP was found did not differ between manual and automatic annotation. For now, this method cannot yet completely replace a manual annotation procedure. However, it could be used as a starting point that can be supplemented by manual annotation.

Haptic Discrimination of Distance
Femke E. van Beek, Wouter M. Bergmann Tiest, Astrid M. L. Kappers
DOI: 10.1371/journal.pone.0104769 | Published: 2014-08-12
While a fair amount of research has focused on the accuracy of haptic perception of distance, information on its precision is still scarce, particularly for distances perceived by making arm movements. In this study, eight conditions were measured to answer four main questions: what is the influence of reference distance, movement axis, perceptual mode (active or passive), and stimulus type on the precision of this kind of distance perception? A discrimination experiment was performed with twelve participants. The participants were presented with two distances, using either a haptic device or a real stimulus, and compared the distances by moving their hand from a start to an end position. They were then asked to judge which of the distances was the longer, from which the discrimination threshold was determined for each participant and condition. Precision was influenced by reference distance. No effect of movement axis was found. Precision was higher for active than for passive movements and slightly lower for real stimuli than for rendered stimuli, but it was not affected by adding cutaneous information. Overall, the Weber fraction for the active perception of a distance of 25 or 35 cm was about 11% for all cardinal axes. The recorded position data suggest that participants, in order to judge which distance was the longer, tried to produce similar speed profiles in both movements. This knowledge could be useful in the design of haptic devices.

Contact Force and Scanning Velocity during Active Roughness Perception
Yoshihiro Tanaka, Wouter M. Bergmann Tiest, Astrid M. L. Kappers, Akihito Sano
DOI: 10.1371/journal.pone.0093363 | Published: 2014-03-27
Haptic perception is bidirectionally related to exploratory movements, which means that exploration influences perception, but perception also influences exploration. We can optimize or change exploratory movements according to the perception and/or the task, consciously or unconsciously. This paper presents a psychophysical experiment on active roughness perception to investigate movement changes as the haptic task changes. Exerted normal force and scanning velocity are measured in different perceptual tasks (discrimination or identification) using rough and smooth stimuli. The results show that humans use a greater variation in contact force for the smooth stimuli than for the rough stimuli. Moreover, they use higher scanning velocities and shorter break times between stimuli in the discrimination task than in the identification task. Thus, in roughness perception humans spontaneously use different strategies that seem effective for the perceptual task and the stimuli. A control task, in which the participants just explore the stimuli without any perceptual objective, shows that humans use a smaller contact force and a lower scanning velocity for the rough stimuli than for the smooth stimuli. Possibly, these strategies are related to aversiveness while exploring stimuli.

Influence of Shape on the Haptic Size Aftereffect
Astrid M. L. Kappers, Wouter M. Bergmann Tiest
DOI: 10.1371/journal.pone.0088729 | Published: 2014-02-19
Recently, we showed a strong haptic size aftereffect by means of a size bisection task: after adaptation to a large sphere, subsequently grasped smaller test spheres felt even smaller, and vice versa. In the current study, we questioned whether the strength of this aftereffect depends on shape. In four experimental conditions, we determined the aftereffect after adaptation to spheres and tetrahedra and subsequent testing also with spheres and tetrahedra. The results showed a clear influence of shape: the haptic aftereffect was much stronger if adaptation and test stimuli were identical in shape than if their shapes were different. Therefore, it would be more appropriate to term such aftereffects haptic shape-size aftereffects, as size alone could not be the determining factor. This influence of shape suggests that higher cortical areas are involved in this aftereffect and that it cannot be due to adaptation of peripheral receptors. An additional finding is that the geometric property or combination of properties participants use in the haptic size bisection task varies widely over participants, although participants themselves are quite consistent.

Integration and Disruption Effects of Shape and Texture in Haptic Search
Vonne van Polanen, Wouter M. Bergmann Tiest, Astrid M. L. Kappers
DOI: 10.1371/journal.pone.0070255 | Published: 2013-07-22
In a search task, where one has to search for the presence of a target among distractors, the target is sometimes easily found, whereas in other searches it is much harder to find. Performance in a search task is influenced by the identity of the target, the identity of the distractors, and the differences between the two. In this study, these factors were manipulated by varying the target and distractors in shape (cube or sphere) and roughness (rough or smooth) in a haptic search task. Participants had to grasp a bundle of items and determine as fast as possible whether a predefined target was present or not. It was found that roughness and edges were relatively salient features, and the search for the presence of these features was faster than for their absence. If the task was easy, the addition of these features could also disrupt performance, even if they were irrelevant for the search task. Another important finding was that the search for a target that differed in two properties from the distractors was faster than a search with only a single property difference, although this held only if the two target properties were non-salient. This means that shape and texture can be effectively integrated. Finally, it was found that edges facilitated search more than they disrupted it, whereas for roughness the reverse was true.

Aging and Curvature Discrimination from Static and Dynamic Touch
J. Farley Norman, Astrid M. L. Kappers, Jacob R. Cheeseman, Cecilia Ronning, Kelsey E. Thomason, Michael W. Baxter, Autum B. Calloway, Davora N. Lamirande
DOI: 10.1371/journal.pone.0068577 | Published: 2013-07-02
Two experiments evaluated the ability of 30 older and younger adults to discriminate the curvature of simple object surfaces from static and dynamic touch. The ages of the older adults ranged from 66 to 85 years, while those of the younger adults ranged from 20 to 29 years. For each participant in both experiments, the minimum curvature magnitude needed to reliably discriminate between convex and concave surfaces was determined. In Experiment 1, participants used static touch to make their judgments of curvature, while dynamic touch was used in Experiment 2. When static touch was used to discriminate curvature, a large effect of age occurred (thresholds were 0.67 and 1.11/m for the younger and older participants, respectively). However, when participants used dynamic touch, there was no significant difference between the ability of younger and older participants to discriminate curvature (thresholds were 0.58 and 0.59/m for the younger and older participants, respectively). The results of the current study demonstrate that while older adults can accurately discriminate surface curvature from dynamic touch, they possess significant impairments for static touch.

Assessment of Night Vision Problems in Patients with Congenital Stationary Night Blindness
Mieke M. C. Bijveld, Maria M. van Genderen, Frank P. Hoeben, Amir A. Katzin, Ruth M. A. van Nispen, Frans C. C. Riemslag, Astrid M. L. Kappers
DOI: 10.1371/journal.pone.0062927 | Published: 2013-05-03
Congenital Stationary Night Blindness (CSNB) is a retinal disorder caused by a signal transmission defect between photoreceptors and bipolar cells. CSNB can be subdivided into CSNB2 (rod signal transmission reduced) and CSNB1 (rod signal transmission absent). The present study is the first in which night vision problems are assessed in CSNB patients in a systematic way, with the purpose of improving rehabilitation for these patients. We assessed the night vision problems of 13 CSNB2 patients and 9 CSNB1 patients by means of a questionnaire on low luminance situations. We furthermore investigated their dark adapted visual functions using the Goldmann Weekers dark adaptation curve, a dark adapted static visual field, and a two-dimensional version of the “Light Lab”. In the latter test, a digital image of a living room with objects was projected on a screen. While increasing the luminance of the image, we asked the patients to report on detection and recognition of objects. The questionnaire showed that the CSNB2 patients hardly experienced any night vision problems, while all CSNB1 patients experienced some problems, although they generally did not describe them as severe. The three scotopic tests showed minimally to moderately decreased dark adapted visual functions in the CSNB2 patients, with differences between patients. In contrast, the dark adapted visual functions of the CSNB1 patients were more severely affected, but showed almost no differences between patients. The results from the “2D Light Lab” showed that all CSNB1 patients were blind at low intensities (equal to starlight), but quickly regained vision at higher intensities (full moonlight). Just above their dark adapted thresholds, both CSNB1 and CSNB2 patients had normal visual fields. From these results we conclude that night vision problems in CSNB, in contrast to what the name suggests, are not conspicuous and generally not disabling.

Haptic Spatial Configuration Learning in Deaf and Hearing Individuals
Rick van Dijk, Astrid M. L. Kappers, Albert Postma
DOI: 10.1371/journal.pone.0061336 | Published: 2013-04-11
The present study investigated haptic spatial configuration learning in deaf individuals, hearing sign language interpreters, and hearing controls. In three trials, participants had to match ten shapes haptically to the cut-outs in a board as fast as possible. Deaf and hearing sign language users outperformed the hearing controls. A similar difference was observed for a rotated version of the board. The groups did not differ, however, on a free relocation trial. Although a significant sign language experience advantage was observed, comparison with results from a previous study that tested the same task in a group of blind individuals showed this advantage to be smaller than that observed for the blind group. These results are discussed in terms of how sign language experience and sensory deprivation benefit haptic spatial configuration processing.

Binding in Haptics: Integration of “What” and “Where” Information in Working Memory for Active Touch
Franco Delogu, Wouter M. Bergmann Tiest, Tanja C. W. Nijboer, Astrid M. L. Kappers, Albert Postma
DOI: 10.1371/journal.pone.0055606 | Published: 2013-02-06
Information about the identity and the location of perceptual objects can be automatically integrated in perception and working memory (WM). Contrasting results in visual and auditory WM studies indicate that the characteristics of feature-to-location binding can vary with the sensory modality of the input. The present study provides the first evidence of binding between “what” and “where” information in WM for haptic stimuli. In an old-new recognition task, blindfolded participants were presented in their peripersonal space with sequences of three haptic stimuli varying in texture and location. They were then required to judge whether a single probe stimulus had been included in the sequence. Recall was measured both in a condition in which both texture and location were relevant for the task (Experiment 1) and in two conditions where only one feature had to be recalled (Experiment 2). Results showed that when both features were task-relevant, even though the association of location and texture was neither necessary nor required to perform the task, participants exhibited a recall advantage in conditions in which the location and the texture of the target probe were kept unaltered between encoding and recall. By contrast, when only one feature was task-relevant, the concurrent feature did not influence the recall of the target feature. We conclude that attention to feature binding is not necessary for the emergence of feature integration in haptic WM. For binding to take place, however, it is necessary to encode and maintain in memory both the identity and the location of items.

Haptic Search for Hard and Soft Spheres
Vonne van Polanen, Wouter M. Bergmann Tiest, Astrid M. L. Kappers
DOI: 10.1371/journal.pone.0045298 | Published: 2012-10-08
In this study, the saliency of hardness and softness was investigated in an active haptic search task. Two experiments were performed to explore these properties in different contexts. In Experiment 1, blindfolded participants had to grasp a bundle of spheres and determine the presence of a hard target among soft distractors, or vice versa. If the difference in compliance between target and distractors was small, reaction times increased with the number of items for both features, indicating that a serial strategy was used. When the difference in compliance was large, reaction times were independent of the number of items, indicating a parallel strategy. In Experiment 2, blindfolded participants pressed their hand on a display filled with hard and soft items. In the search for a soft target, reaction times increased with the number of items, but the location of target and distractors appeared to have a large influence on the search difficulty. In the search for a hard target, reaction times did not depend on the number of items. In sum, this showed that both hardness and softness are salient features.
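The serial/parallel distinction in the abstract above rests on how reaction time (RT) scales with the number of items: a serial strategy produces an RT that grows roughly linearly with set size, while a parallel ("pop-out") strategy produces a near-flat RT. As an illustrative sketch only (not the authors' analysis; all data values below are made up for demonstration), the slope of a least-squares fit of mean RT against set size can serve as the diagnostic:

```python
def rt_slope(set_sizes, reaction_times):
    """Least-squares slope of mean reaction time (ms) versus number of items."""
    n = len(set_sizes)
    mean_x = sum(set_sizes) / n
    mean_y = sum(reaction_times) / n
    cov = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(set_sizes, reaction_times))
    var = sum((x - mean_x) ** 2 for x in set_sizes)
    return cov / var

set_sizes = [2, 4, 6, 8]

# Hypothetical means: RT grows with set size, as with a small
# compliance difference between target and distractors (serial search).
serial_rts = [900, 1400, 1900, 2400]
# Hypothetical means: RT is flat, as with a large compliance
# difference (parallel, "pop-out" search).
parallel_rts = [820, 830, 815, 825]

print(rt_slope(set_sizes, serial_rts))    # → 250.0 ms per item
print(rt_slope(set_sizes, parallel_rts))  # → 0.0 ms per item
```

A slope well above zero (here 250 ms per added item) signals item-by-item inspection, whereas a slope near zero signals that all items are evaluated at once; the cut-off between "near zero" and "serial" is a judgment call that depends on the measurement noise.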