Title: Explainable and Generalized Event Reasoning
Speaker: Xiyan Fu (ICL)
Being able to reason beyond what is explicitly stated in a text forms the basis of deep natural language understanding (NLU).
Many state-of-the-art models achieve high accuracy on a range of reasoning problems (e.g., question answering);
in particular, some large pre-trained language models have even surpassed human performance. However, these widely
adopted strategies come with two major weaknesses: 1) low interpretability and 2) poor generalization. We aim to
overcome these limitations and therefore propose solving reasoning problems with a decomposition strategy inspired
by the way children learn. Specifically, we decompose the meaning representations of events as provided by VerbNet,
and generate a series of questions that model transitive information, which we provide as a scaffold to guide the model.
In doing so, the model can explain its inference rules through concrete guiding questions, and can be shown to
generalize better to out-of-distribution scenarios by recombining decomposed knowledge.
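As a rough illustration of the decomposition idea described above (a minimal sketch with invented predicate and function names, not the speaker's actual system), a VerbNet-style predicate decomposition of a transfer event could be turned into an ordered chain of guiding sub-questions:

```python
# Hypothetical sketch: decompose a VerbNet-style event representation for a
# transfer verb (e.g., "give") into guiding sub-questions. The semantic
# predicates and function names below are illustrative placeholders only.

def decompose_event(agent: str, theme: str) -> list[str]:
    """Map a transfer event's pre-state, core, and post-state predicates
    into ordered sub-questions that scaffold the model's inference."""
    predicates = [
        ("has_possession", agent, theme),      # pre-state of the event
        ("transfer", agent, theme),            # event core
        ("not_has_possession", agent, theme),  # post-state after transfer
    ]
    # Each predicate becomes one concrete guiding question.
    return [f"Does '{pred}({a}, {b})' hold?" for pred, a, b in predicates]

for question in decompose_event("Alice", "the book"):
    print(question)
```

Recombining such decomposed predicates across events is what would let the model handle out-of-distribution scenarios it has never seen as a whole.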