Title: Better Natural Language Analysis and Amortized Integer Linear Programming

Abstract: Computational approaches to problems in Natural Language Understanding and Information Extraction are often modeled as structured prediction problems that involve assigning values to sets of interdependent variables. Over the last few years, one of the most successful approaches to these problems involves Constrained Conditional Models (CCMs), an Integer Linear Programming formulation that augments probabilistic models with declarative constraints as a way to support such decisions. I will present research within this framework, focusing on learning and inference issues. In particular, I will present recent results on extending Semantic Role Labeling to additional predicates and on Amortized Inference, learning to accelerate inference over the lifetime of the learning system. Time permitting, I will also mention some work on Wikification and coreference resolution done within this computational framework.
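To give a flavor of the constrained-inference idea behind CCMs, the sketch below (not from the talk; the scores, labels, and constraint are hypothetical) picks the highest-scoring joint assignment of interdependent variables subject to a declarative constraint. Real CCMs pose this as an Integer Linear Program and call an ILP solver; here we simply enumerate the feasible assignments, which gives the same answer on tiny problems.

```python
# Minimal sketch of CCM-style constrained inference: each variable has
# per-label model scores, and a declarative constraint restricts which
# joint assignments are feasible. A real CCM would solve this as an ILP;
# brute-force enumeration below is equivalent for this toy instance.
from itertools import product

def constrained_argmax(scores, constraint):
    """Return the highest-scoring joint assignment satisfying the constraint.

    scores: list of dicts mapping label -> score, one dict per variable.
    constraint: predicate over a tuple of labels (the declarative constraint).
    """
    best, best_score = None, float("-inf")
    for assignment in product(*(s.keys() for s in scores)):
        if not constraint(assignment):
            continue  # skip assignments the declarative constraint forbids
        total = sum(s[label] for s, label in zip(scores, assignment))
        if total > best_score:
            best, best_score = assignment, total
    return best

# Toy SRL-flavored example: two argument spans, candidate roles A0/A1,
# with the constraint that no role is assigned to more than one span.
scores = [{"A0": 2.0, "A1": 1.5}, {"A0": 1.9, "A1": 0.5}]
no_duplicates = lambda labels: len(set(labels)) == len(labels)
print(constrained_argmax(scores, no_duplicates))  # -> ('A1', 'A0')
```

Note how the constraint changes the answer: without it, both spans would greedily take the role A0; with it, the model trades off the two spans' scores jointly.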