Predicting the present using the future and other such crazy ideas
In this talk we explore two problems: generating a sequence that takes both past and future tokens into account, and predicting numerical attributes of entities in a knowledge graph. For the first problem, we use bidirectional self-attention together with special placeholder tokens to infer the present token from future tokens that are yet to be generated. We verify the effectiveness of this approach experimentally on two conversational tasks, where the proposed model outperforms competitive baselines by a large margin.

For the second problem, we investigate whether the numerical information associated with entities in a knowledge graph can be predicted from the graph's relational structure. A knowledge graph consists of entities such as people, places and other named objects. People, for instance, carry attributes such as birth date, age and height, while places carry latitude, longitude, population and so on. We predict this numerical data by formulating a regression problem over knowledge graph embeddings.
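The first idea can be sketched as a toy single-head self-attention in which the present position and the not-yet-generated future positions are filled with a special placeholder embedding, and no causal mask is applied, so every position attends to the placeholder slots as well as the past. This is a minimal illustration only: the weight matrices and the `[PLH]` embedding are random stand-ins for parameters that would be learned, and the dimensions are toy-sized.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8                      # toy embedding dimension
n_past, n_future = 3, 2    # tokens already generated / yet to come

# Random stand-ins for learned projection matrices and the [PLH] embedding.
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))
placeholder = rng.standard_normal(d)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def contextualize(past):
    """Bidirectional self-attention over past tokens plus [PLH]
    placeholders standing in for the present and future slots."""
    # Sequence = past tokens, then one slot for the present token,
    # then n_future placeholder slots for tokens yet to be generated.
    seq = np.vstack([past] + [placeholder] * (n_future + 1))
    Q, K, V = seq @ Wq, seq @ Wk, seq @ Wv
    attn = softmax(Q @ K.T / np.sqrt(d))   # no causal mask: the present
    return (attn @ V)[len(past)]           # slot also sees future slots

past_tokens = rng.standard_normal((n_past, d))
h_present = contextualize(past_tokens)     # representation used to predict
                                           # the present token
```

In a full model this contextualized vector would feed a vocabulary softmax; here it only demonstrates how placeholders let the present position attend "into the future".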
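The second idea, regression over knowledge graph embeddings, can be sketched as follows. Pretrained entity embeddings (e.g. from TransE or ComplEx) would serve as features for a regressor onto a numeric attribute such as birth year; in this self-contained sketch the embeddings and the attribute are synthetic stand-ins, and the regressor is a simple ridge regression rather than whatever model the talk actually uses.

```python
import numpy as np

rng = np.random.default_rng(1)
n_entities, k = 100, 16

# Stand-in for pretrained KG entity embeddings; random here for illustration.
E = rng.standard_normal((n_entities, k))

# Synthetic numeric attribute (e.g. birth year), made linearly related to
# the embeddings plus noise, purely to exercise the regression setup.
w_true = rng.standard_normal(k)
y = E @ w_true + 0.1 * rng.standard_normal(n_entities)

# Ridge regression: w = (E^T E + lam * I)^{-1} E^T y
lam = 1e-2
w = np.linalg.solve(E.T @ E + lam * np.eye(k), E.T @ y)
pred = E @ w
rmse = np.sqrt(np.mean((pred - y) ** 2))
```

The point of the formulation is that the embeddings encode the graph's relational structure, so a regressor on top of them can recover numeric attributes without any hand-built features.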