Colloquium: Yohei Oseki
April 29 @ 3:00 pm
Building machines that process natural language like humans
Despite their close alliance in the 1980s, theoretical linguistics (a branch of cognitive science) and natural language processing (a branch of artificial intelligence) have traditionally been divorced, especially since the recent advent of deep learning. Theoretical linguistics proposed computational theories to represent linguistic competence through symbolic formal grammars, whereas natural language processing developed algorithmic models to approximate linguistic performance through artificial neural networks without symbolic structures. However, given that these computational and algorithmic perspectives are not mutually exclusive, one promising approach to modeling complex information-processing systems like natural language would be to reverse-engineer human language processing. In this talk, we review computational models of language processing, with special focus on syntactic and morphological processing, in which symbolic formal grammars and artificial neural networks are constructed and evaluated against human language processing via information-theoretic complexity metrics. The results converge on the conclusion that symbolic structures and neural networks must be integrated towards “human-like” language processing, suggesting that theoretical linguistics and natural language processing should be married again in order to build machines that process natural language like humans.
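To make the evaluation idea concrete: the abstract does not name a specific metric, but the standard information-theoretic complexity metric in this literature is surprisal, the negative log-probability of a word given its context, which is then correlated with human measures such as reading times. The sketch below is a minimal illustration under that assumption, using a toy bigram model with add-one smoothing; in the work described, the probabilities would instead come from a symbolic grammar's parser or a neural language model.

```python
import math
from collections import Counter

# Toy corpus; in practice probabilities come from a formal grammar
# or a neural language model, not raw bigram counts.
corpus = "the dog chased the cat the cat saw the dog".split()

unigrams = Counter(corpus)
bigrams = Counter(zip(corpus, corpus[1:]))

def surprisal(prev, word):
    """Surprisal in bits: -log2 P(word | prev), add-one smoothed."""
    vocab = len(unigrams)
    p = (bigrams[(prev, word)] + 1) / (unigrams[prev] + vocab)
    return -math.log2(p)

# Per-word surprisal over a test sentence; higher values mark words
# the model finds less predictable, which (on the surprisal theory)
# should take humans longer to process.
sentence = "the dog saw the cat".split()
for prev, word in zip(sentence, sentence[1:]):
    print(f"{word}: {surprisal(prev, word):.2f} bits")
```

An unattested transition like "dog saw" receives higher surprisal than a frequent one like "the dog", which is the basic signal these complexity metrics exploit when fit against human reading-time data.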