Title: A Grammatical Approach to Three Problems in NLP

Abstract: Grammar formalisms are powerful tools for expressing hierarchical structure in language and other domains. This approach to modeling is one of the most important workhorses in natural language processing, and one might say that the tight relationship between grammars and natural language processing parallels that between graphical models and machine learning. Latent variable modeling, on the other hand, is a powerful technique in statistics for improving the expressivity of statistical models. When combined with grammar formalisms, latent variables can capture hierarchical information that is hidden in the data. In this talk, I will describe new algorithms and applications for latent-variable grammars. I will show how latent-variable grammars can be estimated without the EM algorithm, which is sensitive to initialisation. I will also demonstrate how these grammars can be used for applications such as syntactic parsing, machine translation, language modeling and discourse modeling in social media.

Bio: Shay Cohen is a Chancellor's Fellow and Lecturer at the University of Edinburgh (School of Informatics). Before that, he was a postdoctoral research scientist in the Department of Computer Science at Columbia University, and held an NSF/CRA Computing Innovation Fellowship. He received his B.Sc. and M.Sc. from Tel Aviv University in 2000 and 2004, respectively, and his Ph.D. from Carnegie Mellon University in 2011. His research interests span a range of topics in natural language processing and machine learning, with a focus on structured prediction. He is especially interested in developing efficient and scalable parsing algorithms as well as learning algorithms for probabilistic grammars.