Publications | Journals

Comparison of teaching methods for the emergence and maintenance of untaught relations in foreign language vocabulary acquisition: A systematic replication

Published in Journal of Applied Behavior Analysis, 2024

This study compared how different foreign language teaching methods shape the emergence and maintenance of untaught verbal relations, offering insights for designing technology-assisted learning systems that leverage behavioral principles.

Recommended citation: Yamaguchi, M., & Matsuda, S. (2024). Comparison of teaching methods for the emergence and maintenance of untaught relations in foreign language vocabulary acquisition: A systematic replication. Journal of Applied Behavior Analysis, 57(3), 763-775. https://doi.org/10.1002/jaba.1075
Download Paper

Assessment of Speech in Children and Adults With Selective Mutism: A Systematic Review

Published in F1000Research, 2023

This systematic review synthesized methods for evaluating speech in children and adults with selective mutism (article in Japanese), offering insights for developing evidence-based tools and HCI systems that support assessment and intervention.

Recommended citation: Toma, Y., & Matsuda, S. (2023). Assessment of speech in children and adults with selective mutism: A systematic review. F1000Research, 11, 847. https://doi.org/10.12688/f1000research.113302.4
Download Paper

Can Facial Expressions Induce Haptic Perception?

Published in IEEE Transactions on Haptics, 2023

This study investigated whether viewing facial expressions can evoke haptic sensations, highlighting cross-modal links between vision and touch and offering implications for multimodal HCI and affective interface design.

Recommended citation: Matsuyama, N., Matsuda, S., & Hachisu, T. (2023). Can facial expressions induce haptic perception? IEEE Transactions on Haptics. https://doi.org/10.1109/TOH.2023.3275657
Download Paper

Developmental trajectories of challenging behaviors reported retrospectively by Japanese parents of adult children with intellectual disabilities

Published in International Journal of Developmental Disabilities, 2022

This study examined life stage differences in challenging behaviors among individuals with intellectual disabilities, highlighting developmental patterns that can inform the timing and design of technology-assisted interventions.

Recommended citation: Inoue, M., Gomi, Y., & Matsuda, S. (2022). Developmental trajectories of challenging behaviors reported retrospectively by Japanese parents of adult children with intellectual disabilities. International Journal of Developmental Disabilities, 70(2), 287-295. https://doi.org/10.1080/20473869.2022.2087450
Download Paper

Smiles as a Signal of Prosocial Behaviors Toward the Robot in the Therapeutic Setting for Children with Autism Spectrum Disorder

Published in Frontiers in Robotics and AI, 2021

This study examined how children with ASD expressed prosocial behaviors toward a therapeutic robot, showing that smiles served as a key signal—offering insights for HCI and HRI systems that leverage affective cues to foster social engagement.

Recommended citation: Kim, S., Hirokawa, M., Matsuda, S., Funahashi, A., & Suzuki, K. (2021). Smiles as a signal of prosocial behaviors toward the robot in the therapeutic setting for children with autism spectrum disorder. Frontiers in Robotics and AI, 8, 599755. https://doi.org/10.3389/frobt.2021.599755
Download Paper

Comparing Reinforcement Values of Facial Expressions: An Eye-Tracking Study

Published in The Psychological Record, 2019

This study applied gaze-contingent reinforcement to evaluate how positive and negative facial expressions guide visual fixation, demonstrating eye tracking as a novel tool for quantifying the reward value of social stimuli.

Recommended citation: Matsuda, S., Omori, T., McCleery, J. P., & Yamamoto, J. (2019). Comparing reinforcement values of facial expressions: An eye-tracking study. The Psychological Record, 69(3), 393-400. https://doi.org/10.1007/s40732-019-00330-z
Download Paper

Effect of Sensory Feedback on Turn-Taking Using Paired Devices for Children with ASD

Published in Multimodal Technologies and Interaction, 2018

This study evaluated paired interactive devices that provided sensory feedback to support turn-taking in children with ASD, illustrating how multimodal technologies can foster social interaction within HCI-driven intervention designs.

Recommended citation: Nuñez, E., Matsuda, S., Hirokawa, M., Yamamoto, J., & Suzuki, K. (2018). Effect of sensory feedback on turn-taking using paired devices for children with ASD. Multimodal Technologies and Interaction, 2, 61. https://doi.org/10.3390/mti2040061
Download Paper

FaceLooks: A Smart Headband for Signaling Face-to-Face Behavior

Published in Sensors, 2018

This study introduced FaceLooks, a wearable smart headband that detects and signals face-to-face interactions, highlighting opportunities for HCI systems to monitor and enhance social engagement in real-world contexts.

Recommended citation: Hachisu, T., Pan, Y., Matsuda, S., Bourreau, B., & Suzuki, K. (2018). FaceLooks: A smart headband for signaling face-to-face behavior. Sensors, 18, 2066. https://doi.org/10.3390/s18072066
Download Paper

Facilitating Social Play for Children with PDDs: Effects of Paired Robotic Devices

Published in Frontiers in Psychology, 2017

This study explored robotic feedback (light and vibration) as a method to support social play in children with pervasive developmental disorders (PDDs), highlighting the potential of the paired COLOLO devices for technology-assisted intervention.

Recommended citation: Matsuda, S., Nuñez, E., Hirokawa, M., Yamamoto, J., & Suzuki, K. (2017). Facilitating social play for children with PDDs: Effects of paired robotic devices. Frontiers in Psychology, 8, 1029. https://doi.org/10.3389/fpsyg.2017.01029
Download Paper

Prefrontal Function Engaging in External-Focused Attention in 5–6-Month-Old Infants: A Comparison With Self-Focused Attention

Published in Frontiers in Human Neuroscience, 2017

Using fNIRS with 5–6-month-old infants, this study compared prefrontal activation during external- versus self-focused attention, offering insights for HCI into how early attentional mechanisms can inform interaction design.

Recommended citation: Xu, M., Hoshino, E., Yatabe, K., Matsuda, S., Sato, H., Maki, A., Yoshimura, M., & Minagawa, Y. (2017). Prefrontal function engaging in external-focused attention in 5–6-month-old infants: A comparison with self-focused attention. Frontiers in Human Neuroscience, 10, 676. https://doi.org/10.3389/fnhum.2016.00676
Download Paper

Emotion Comprehension in Intramodal and Cross-Modal Matching: A Preliminary Comparison Between Children with Autism Spectrum Disorders and Those with Williams Syndrome

Published in Journal of Special Education Research, 2015

This study directly compared children with ASD and Williams syndrome on intramodal and cross-modal emotion matching, showing that ASD-specific deficits arise in auditory-visual integration, with implications for multimodal HCI design.

Recommended citation: Matsuda, S., & Yamamoto, J. (2015). Emotion comprehension in intramodal and cross-modal matching: A preliminary comparison between children with autism spectrum disorders and those with Williams syndrome. Journal of Special Education Research, 4(1), 1-8. https://doi.org/10.6033/specialeducation.4.1
Download Paper

Gaze Behavior of Children with ASD toward Pictures of Facial Expressions

Published in Autism Research and Treatment, 2015

This study examined gaze responses to facial expressions in children with ASD, showing that autism severity modulated visual attention, with implications for adaptive affective computing and HCI design.

Recommended citation: Matsuda, S., Minagawa, Y., & Yamamoto, J. (2015). Gaze behavior of children with ASD toward pictures of facial expressions. Autism Research and Treatment, Article ID 617190, 1-8. https://doi.org/10.1155/2015/617190
Download Paper

Intramodal and Cross-Modal Matching of Emotional Expression in Young Children with Autism Spectrum Disorders

Published in Research in Autism Spectrum Disorders, 2015

This study compared intramodal and cross-modal matching of emotional expressions in young children with ASD, showing intact visual-visual performance but reduced accuracy in auditory-visual matching, highlighting modality-specific challenges for emotion processing.

Recommended citation: Matsuda, S., & Yamamoto, J. (2015). Intramodal and cross-modal matching of emotional expression in young children with autism spectrum disorders. Research in Autism Spectrum Disorders, 10, 109-115.
Download Paper

Computer-based intervention for inferring facial expressions from the socio-emotional context in two children with autism spectrum disorders

Published in Research in Autism Spectrum Disorders, 2014

This study used computer-based cross-modal matching-to-sample (MTS) training with socio-emotional movies and facial expressions, showing that children with ASD can acquire and generalize emotion recognition skills through interactive technologies.

Recommended citation: Matsuda, S., & Yamamoto, J. (2014). Computer-based intervention for inferring facial expressions from the socio-emotional context in two children with autism spectrum disorders. Research in Autism Spectrum Disorders, 8(8), 944-950. https://doi.org/10.1016/j.rasd.2014.04.010
Download Paper

Intervention for increasing the comprehension of affective prosody in children with autism spectrum disorders

Published in Research in Autism Spectrum Disorders, 2013

This study trained children with ASD to match affective prosody with facial expressions using cross-modal MTS, showing potential for designing multimodal interfaces that enhance emotion comprehension.

Recommended citation: Matsuda, S., & Yamamoto, J. (2013). Intervention for increasing the comprehension of affective prosody in children with autism spectrum disorders. Research in Autism Spectrum Disorders, 7(8), 938-946. https://doi.org/10.1016/j.rasd.2013.04.001
Download Paper