DESCRIPTION (provided by applicant): Children with autism spectrum disorder (ASD) have often been observed to express affect weakly, in only one modality at a time (e.g., choice of words), or in multiple modalities but not in a coordinated fashion. These difficulties in crossmodal integration of affect expression may have roots in certain global characteristics of brain structure in autism, specifically atypical interconnectivity between brain areas. Poor crossmodal integration of affect expression may also play a critical role in the communication difficulties that are well documented in ASD. Not understanding how, for example, facial expression can be used to modify the interpretation of words undermines social reciprocity. Impairment in crossmodal integration of affect is thus a potentially powerful explanatory concept in ASD. The study will provide much-needed data on expressive crossmodal integration impairment in ASD and its association with receptive crossmodal integration impairment, using innovative technologies to create stimuli for a judgment procedure that makes possible independent assessment of the individual modalities; these technologies are critical because human observers are not able to selectively filter out modalities. In addition, the vocal measures and the audiovisual database lay the essential groundwork for the next step: creation of audiovisual analysis methods for automated assessment of expressive crossmodal integration. These methods will be applied to audiovisual recordings of a structured play situation; the child will participate in this play situation twice, once with a caregiver and once with an examiner.
This procedure for measuring expressive crossmodal integration will be complemented by a procedure for measuring crossmodal integration of affect processing using dynamic talking-face stimuli in which the audio and video streams are recombined (preserving perfect synchrony of the facial and vocal channels) to create stimuli with congruent vs. incongruent affect expression. Both procedures will be applied to three groups: children with ASD, children with Developmental Language Disorder (DLD), and typically developing children; ages will be six to ten. Our study would be the first to perform a comprehensive analysis of crossmodal integration of affect expression in ASD. If the study confirms the existence of these impairments in ASD and provides a detailed picture of them, this could (i) guide brain studies to specifically target areas responsible for affect expression; (ii) provide a deeper understanding of impairments in social reciprocity; and (iii) help design remedial programs for intensive training of under-used or uncoordinated expressive modalities. The study thus contributes to etiology, diagnosis, and treatment.
Effective start/end date: 7/1/09 → 6/30/12
- National Institutes of Health: $191,367.00