A4: Language Comprehension in a Noisy Channel: Adapting to Changing Situations and Individual Users
The central goal of project A4 is to examine how noise (or the effect of reduced hearing ability) influences language comprehension, and how natural language generation systems can adapt their output to minimize the risk of misunderstanding. The experimental part of the project investigates the neurophysiological correlates of bottom-up perceptual processing and top-down predictive language processing, and how these functions interact when noise is added to the signal. In the modelling part, we propose a noisy channel model consisting of two components: a comprehension component that models comprehension at different levels of hearing ability (based on insights from the experimental part of the project), and a generation component that optimizes the system-generated output to minimize the risk of misunderstanding while adapting it to a target channel capacity.
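The comprehension component described above follows the standard noisy-channel formulation: the listener infers the intended word w from a perceived signal s via Bayes' rule, P(w | s) ∝ P(s | w) · P(w), so that top-down predictions (the prior) can override a degraded bottom-up percept (the likelihood). A minimal sketch, with an invented toy likelihood standing in for a real acoustic confusion model and not representing the project's actual implementation:

```python
def infer_intended(perceived, lexicon, prior, likelihood):
    """Rank candidate words by the posterior P(w | perceived)."""
    scores = {w: likelihood(perceived, w) * prior[w] for w in lexicon}
    total = sum(scores.values())
    return {w: s / total for w, s in scores.items()} if total else scores

def toy_likelihood(perceived, word):
    """Illustrative confusability: character-set overlap (Jaccard-style),
    a crude stand-in for an acoustic confusion matrix."""
    shared = len(set(perceived) & set(word))
    return (shared + 1) / (len(set(perceived) | set(word)) + 1)

# Contextual predictability acts as the prior (top-down prediction).
lexicon = ["pin", "bin", "pan"]
prior = {"pin": 0.2, "bin": 0.5, "pan": 0.3}

posterior = infer_intended("pin", lexicon, prior, toy_likelihood)
```

With these toy numbers the strongly predicted "bin" outranks the literal percept "pin" in the posterior, illustrating how top-down prediction can dominate bottom-up evidence when the channel is noisy.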
How speakers adapt object descriptions to listeners under load Journal Article
Language, Cognition and Neuroscience, pp. 1-15, 2019.
Neuropsychologia, 131, pp. 53-61, 2019.
Frontiers in Psychology - Cognition, 10, Article 709, 2019.
OCR Post-Correction of the Royal Society Corpus Based on the Noisy Channel Model Inproceedings
41. Jahrestagung der Deutschen Gesellschaft für Sprachwissenschaft (DGfS) Bremen, 2019.
Toward Bayesian Synchronous Tree Substitution Grammars for Sentence Planning Inproceedings
11th International Conference on Natural Language Generation (INLG2018), Tilburg, The Netherlands, 2018.
Improving Variational Encoder-Decoders in Dialogue Generation Inproceedings
32nd AAAI Conference on Artificial Intelligence (AAAI-18), New Orleans, USA, 2018.
Surprisal modulates dual-task performance in older adults: Pupillometry shows age-related trade-offs in task performance and time-course of language processing Journal Article
Psychology and Aging, 33 (8), pp. 1168-1180, 2018.
Frontiers in Psychology, 9, Article 2276, 2018.
Proceedings of EACL 2017, Valencia, 2017.
Proc. of the 10th International Natural Language Generation Conference (INLG), pp. 149-153, Association for Computational Linguistics, Santiago de Compostela, Spain, 2017.
Proc. Interspeech 2017, pp. 3757-3761, 2017.
Age-differences in recovery from prediction error: Evidence from a simulated driving and combined sentence verification task Inproceedings
39th Annual Meeting of the Cognitive Science Society, 2017.
Referential overspecification in response to the listener's cognitive load Inproceedings
International Cognitive Linguistics Conference, Tartu, Estonia, 2017.
Calzolari, Nicoletta; Matsumoto, Yuji; Prasad, Rashmi (Ed.): COLING 2016, 26th International Conference on Computational Linguistics, Proceedings of the Conference: Technical Papers, pp. 1524-1534, ACL, Osaka, 2016, ISBN: 978-4-87974-702-0.
Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pp. 546-551, Association for Computational Linguistics, San Diego, California, 2016.
Frontiers in Psychology, 7 (844), 2016, ISSN: 1664-1078.
Proceedings of the 15th European Workshop on Natural Language Generation (ENLG), pp. 105-108, Association for Computational Linguistics, Brighton, 2015.
Talk presented at Formal and Experimental Pragmatics: Methodological Issues of a Nascent Liaison (MXPRAG), Zentrum für Allgemeine Sprachwissenschaft (ZAS), Berlin, June 2015, Berlin, 2015.
Discourse Expectations: Theoretical, Experimental, and Computational Perspectives (DETEC) (poster), University of Alberta, Edmonton, June 2015, 2015.
Workshop on Individual differences in language processing across the adult life span, 2015.
Proc. of the 1st International Workshop on Data-to-Text Generation, Edinburgh, Scotland, UK, 2015.