TY - GEN
T1 - Dynamic Bayesian socio-situational setting classification
AU - Shi, Yangyang
AU - Wiggers, Pascal
AU - Jonker, Catholijn M.
PY - 2012
Y1 - 2012
N2 - We propose a dynamic Bayesian classifier for the socio-situational setting of a conversation. Knowledge of the socio-situational setting can be used to search for content recorded in a particular setting or to select context-dependent models in speech recognition. The dynamic Bayesian classifier has the advantage, compared to static classifiers such as naive Bayes and support vector machines, that it can continuously update the classification during a conversation. We experimented with several models that use lexical and part-of-speech information. Our results show that the prediction accuracy of the dynamic Bayesian classifier using the first 25% of a conversation is almost 98% of the final prediction accuracy, which is calculated on the entire conversation. The best final prediction accuracy, 88.85%, is obtained by bigram dynamic Bayesian classification using words and part-of-speech tags. © 2012 IEEE.
AB - We propose a dynamic Bayesian classifier for the socio-situational setting of a conversation. Knowledge of the socio-situational setting can be used to search for content recorded in a particular setting or to select context-dependent models in speech recognition. The dynamic Bayesian classifier has the advantage, compared to static classifiers such as naive Bayes and support vector machines, that it can continuously update the classification during a conversation. We experimented with several models that use lexical and part-of-speech information. Our results show that the prediction accuracy of the dynamic Bayesian classifier using the first 25% of a conversation is almost 98% of the final prediction accuracy, which is calculated on the entire conversation. The best final prediction accuracy, 88.85%, is obtained by bigram dynamic Bayesian classification using words and part-of-speech tags. © 2012 IEEE.
UR - https://www.scopus.com/pages/publications/84867596361
U2 - 10.1109/ICASSP.2012.6289063
DO - 10.1109/ICASSP.2012.6289063
M3 - Conference contribution
T3 - ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
SP - 5081
EP - 5084
BT - 2012 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2012 - Proceedings
T2 - 2012 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2012
Y2 - 25 March 2012 through 30 March 2012
ER -