LING 575 - Spoken Dialog Systems
Spring 2015
Specialized Topics - Modeling user affect in dialog systems


Primary: Forbes-Riley and Litman, 2004. Kate Forbes-Riley and Diane Litman (2004). Predicting Emotion in Spoken Dialogue from Multiple Knowledge Sources. In Proceedings of the Human Language Technology Conference of the North American Chapter of the Association for Computational Linguistics (HLT-NAACL 2004).
Secondary: Lee and Narayanan, 2005. Chul Min Lee and Shrikanth S. Narayanan (2005). Toward Detecting Emotions in Spoken Dialogs. IEEE Transactions on Speech and Audio Processing, 13(2), March 2005.
Secondary: Forbes-Riley and Litman, 2012. Kate Forbes-Riley and Diane Litman (2012). Adapting to Multiple Affective States in Spoken Dialogue. In Proceedings of the 13th Annual Meeting of the Special Interest Group on Discourse and Dialogue (SIGDIAL), pp. 217-226.
Supplementary: Pon-Barry and Shieber, 2011. Heather Pon-Barry and Stuart M. Shieber (2011). Recognizing Uncertainty in Speech. EURASIP Journal on Advances in Signal Processing, 2011(251753). Special Issue on Emotion and Mental State Recognition from Speech.
Supplementary: Zeng et al., 2007. Zhihong Zeng, Maja Pantic, Glenn I. Roisman, and Thomas S. Huang (2007). A Survey of Affect Recognition Methods: Audio, Visual and Spontaneous Expressions. In Proceedings of ICMI 2007.
Supplementary: Traum et al., 2010. David Traum, Stacy Marsella, and Jonathan Gratch (2010). Emotion and Dialogue in the MRE Virtual Humans. In Proceedings of the Tutorial and Research Workshop on Affective Dialogue Systems.