Communicative Sciences and Disorders, New York University
Background: Prior to beginning my Ph.D. program, I received my undergraduate degree in Biology and Psychology from Washington University in St. Louis and my master’s degree in Communication Sciences and Disorders from Fontbonne University. After receiving my master’s degree, I worked for six years as a school speech-language pathologist serving autistic students in grades PK-5. This experience sparked my interest in evidence-based AAC practice for autistic children and inspired me to pursue a Ph.D.
Current Interests: My current research investigates word learning in minimally verbal children and adolescents with a diagnosis of ASD. Using eye-tracking, we are examining the match (or mismatch) between pointing responses and eye gaze, as well as which supports can improve word learning for this population. I am also interested in the intersection of apraxia and autism and in which therapies can improve communication and quality of life for these individuals.
Dissertation Chair: Dr. Christina Reuterskiöld
Sample Presentation/Publication: Word Learning with Orthographic Support in Minimally Verbal Children with Autism (SRCLD presentation). Orthographic Support for Word Learning in Clinical Populations: A Systematic Review (LSHSS publication). In my clinical practice, I noticed that autistic children were often highly interested in letters and words (i.e., hyperlexia) and would sometimes say words when the written word was present. These experiences led me to ask whether minimally verbal children with a diagnosis of autism demonstrate more efficient word learning when text support is provided. I hope this research translates into a greater literacy focus in therapy for school-aged autistic children.
Presentation Topic: Orthographic support for word learning in minimally verbal autistic children & adolescents
Discussion Topic(s):
- Recruitment, even from a national sample, has been extremely slow and difficult for this population. We have partnered with SFARI to reach a broader audience but still have not reached our target participation numbers.
- I would also love to hear suggestions about how to statistically compare eye-gaze data extracted manually with data extracted automatically, to determine whether the two methods yield the same results (one possible approach is sketched below).
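
As a starting point for that discussion, here is a minimal sketch of one common way to compare two measurement pipelines: correlation plus Bland-Altman limits of agreement and a paired test. It assumes the gaze data can be reduced to one summary value per trial from each pipeline (e.g., proportion of looks to target); the variable names and values below are illustrative placeholders, not project data.

```python
# Hypothetical sketch: agreement between manually and automatically extracted
# eye-gaze measures. Placeholder values; not actual study data.
import numpy as np
from scipy import stats

# One summary value per trial from each extraction pipeline (placeholders).
manual = np.array([0.62, 0.55, 0.71, 0.48, 0.66, 0.59, 0.73, 0.51])
auto = np.array([0.60, 0.57, 0.69, 0.50, 0.64, 0.61, 0.70, 0.49])

# 1) Association: do the two pipelines rank trials similarly?
r, p = stats.pearsonr(manual, auto)
print(f"Pearson r = {r:.3f} (p = {p:.3f})")

# 2) Bland-Altman agreement: systematic bias and the range within which
#    the two pipelines typically differ.
diff = manual - auto
bias = diff.mean()             # mean difference (systematic bias)
loa = 1.96 * diff.std(ddof=1)  # half-width of the 95% limits of agreement
print(f"Bias = {bias:.3f}, limits of agreement = [{bias - loa:.3f}, {bias + loa:.3f}]")

# 3) Paired test: is the bias reliably different from zero?
t, p_t = stats.ttest_rel(manual, auto)
print(f"Paired t = {t:.3f} (p = {p_t:.3f})")
```

High correlation alone does not establish agreement (two pipelines can correlate strongly while one is systematically offset), which is why the Bland-Altman bias and limits of agreement are worth reporting alongside it; an intraclass correlation coefficient would be another reasonable option here.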