Title: Reducing Grounded Learning Tasks to Grammatical Inference

Benjamin Boerschinger (with Bevan K. Jones and Mark Johnson)

Abstract: It is often assumed that grounded learning tasks are beyond the scope of grammatical inference techniques. We show that the grounded task of learning a semantic parser from ambiguous training data can be reduced to a Probabilistic Context-Free Grammar (PCFG) learning task in a way that gives state-of-the-art results. We further show that additionally letting our model learn the language's canonical word order improves its performance. While our model is surprisingly simple, it has certain obvious shortcomings, which we point out, and we sketch how we plan to address these problems in future research.
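
The central claim of the abstract is the reduction itself: ambiguous sentence/meaning training data can be encoded as productions of a single PCFG, after which any off-the-shelf PCFG estimator (e.g. inside-outside EM or a Bayesian sampler) applies unchanged. The sketch below is a minimal illustration of what such an encoding might look like; the toy meaning-representation format, the rule schema, and all identifiers are assumptions made for the example and are not the grammar construction used in the paper.

```python
# Illustrative sketch: encode a corpus of (sentence, candidate meanings) pairs
# as PCFG rules with uniform initial probabilities, so a standard PCFG
# estimator can later resolve which meaning each sentence expresses.
# The meaning format and rule schema are toy assumptions for this example.

from collections import defaultdict


def reduce_to_pcfg(corpus):
    """corpus: list of (tokens, candidate_meanings) pairs, where tokens is a
    list of words and each candidate meaning is a tuple of symbols,
    e.g. ("pass", "pink7").  Returns {lhs: {rhs_tuple: probability}}."""
    rules = defaultdict(set)
    vocabulary = set()

    for tokens, candidates in corpus:
        vocabulary.update(tokens)
        for meaning in candidates:
            meaning_sym = "M_" + "_".join(meaning)
            # The start symbol may rewrite to any candidate meaning ...
            rules["S"].add((meaning_sym,))
            # ... and the meaning nonterminal rewrites to one word-emitting
            # nonterminal per meaning component (a toy linearisation).
            rules[meaning_sym].add(tuple("W_" + part for part in meaning))

    # Each word-emitting nonterminal may emit any observed word; estimation
    # should concentrate probability mass on the correct word/meaning pairs.
    word_syms = {sym for rhs_set in rules.values()
                 for rhs in rhs_set
                 for sym in rhs if sym.startswith("W_")}
    for sym in word_syms:
        for word in vocabulary:
            rules[sym].add((word,))

    # Uniform initial probabilities over each left-hand side's expansions.
    return {lhs: {rhs: 1.0 / len(rhs_set) for rhs in rhs_set}
            for lhs, rhs_set in rules.items()}


if __name__ == "__main__":
    toy_corpus = [
        (["pink7", "passes"], {("pass", "pink7"), ("kick", "pink7")}),
        (["pink7", "kicks"], {("kick", "pink7")}),
    ]
    for lhs, expansions in reduce_to_pcfg(toy_corpus).items():
        for rhs, p in expansions.items():
            print(f"{p:.3f}  {lhs} -> {' '.join(rhs)}")
```

The resulting rule set would then be handed to a generic PCFG estimator; nothing in the estimation step needs to know that the grammar encodes a grounded learning problem, which is the sense in which the grounded task reduces to grammatical inference.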