Introducing Prior Knowledge into Machine Translation
Abstract
To develop better machine translation (MT) systems, there are in general two paradigms: 1) exploring more powerful mathematical tools for better modeling; and 2) incorporating more prior knowledge (e.g., syntax) into MT systems so that they translate more like a person. These approaches can complement each other but conflict in some cases. In this talk, we focus on how to incorporate prior knowledge into MT. In particular, we present methods for using different types of knowledge in modeling and decoding. We also share our experiences in developing these systems and report experimental results on Chinese-English translation tasks.
Speaker: Prof. Tong Xiao
Date & Time: 29 Aug 2016 (Monday) 16:00 - 17:00
Venue: E11-1012 (University of Macau)
Organized by: Department of Computer and Information Science
Biography
Dr. Tong Xiao is an associate professor at the College of Computer Science and Engineering, Northeastern University (China). He obtained his Ph.D. from Northeastern University in 2012, and was selected as a candidate for the Excellent Ph.D. Dissertation Award established by CIPS (Chinese Information Processing Society of China). He has also visited several research institutions for machine translation research, including Fuji Xerox, Microsoft Research Asia, and the University of Cambridge. He has been, and remains, a co-PI of the NiuTrans open-source MT project (http://www.niutrans.com/niutrans/NiuTrans.html). To date, Dr. Xiao has published over twenty papers in AI, JAIR, AAAI, ACL, EMNLP and COLING. His current research interests include statistical machine translation and language modeling.