Yoon Kim (Harvard) - Deep Learning and Language Structure
Natural language has inherent structure: words compose with one another into hierarchical structures that convey meaning, and these compositional structures are ubiquitous at all levels of language. Despite the recent, enormous success of deep neural networks in NLP, capturing such discrete, combinatorial structure remains challenging. In this talk, I will present two directions for integrating deep learning and language structure. First, we will see how language structure can serve as a rich source of prior knowledge to improve language modeling and representation learning. Second, we will explore how advances in model parameterization and inference, in particular from deep learning, can be used as computational tools to discover linguistic structure from raw text.
If you are affiliated with UChicago CS and would like to attend this talk remotely, contact rmitchum@uchicago.edu for links.
Host: Michael Maire
Yoon Kim
Yoon Kim is a fifth-year PhD student at Harvard University, advised by Alexander Rush. His research is at the intersection of natural language processing and machine learning. He is the recipient of a Google AI PhD Fellowship.