TY - GEN
T1 - Compositional Language Modeling for Icon-Based Augmentative and Alternative Communication
AU - Dudy, Shiran
AU - Bedrick, Steven
N1 - Funding Information:
We thank the anonymous DeepLo workshop reviewers for their valuable feedback and comments. We also thank our clinical collaborators in OHSU's Institute on Development & Disability: Betts Peters, Brandon Eddy, and Dr. Melanie Fried-Oken, as well as our collaborators at Northeastern University, in the laboratories of Drs. David Smith and Deniz Erdogmus. Research reported in this paper was supported by the National Institute on Deafness and Other Communication Disorders of the NIH under awards R01DC009834 and R56DC015999.
Publisher Copyright:
© 2018 Association for Computational Linguistics
PY - 2018
Y1 - 2018
AB - Icon-based communication systems are widely used in the field of Augmentative and Alternative Communication. Typically, icon-based systems have lagged behind word- and character-based systems in terms of predictive typing functionality, due to the challenges inherent to training icon-based language models. We propose a method for synthesizing training data for use in icon-based language models, and explore two different modeling strategies.
UR - http://www.scopus.com/inward/record.url?scp=85122012868&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85122012868&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85122012868
T3 - Proceedings of the Annual Meeting of the Association for Computational Linguistics
SP - 25
EP - 32
BT - ACL 2018 - Deep Learning Approaches for Low-Resource Natural Language Processing, DeepLo 2018 - Proceedings of the Workshop
PB - Association for Computational Linguistics (ACL)
T2 - ACL 2018 Workshop on Deep Learning Approaches for Low-Resource Natural Language Processing, DeepLo 2018
Y2 - 19 July 2018
ER -