
Annotation-Inspired Implicit Discourse Relation Classification with Auxiliary Discourse Connective Generation
Abstract
Implicit discourse relation classification is challenging because the discourse connectives that would signal a relation are absent from the text. To overcome this, we design an end-to-end neural model that explicitly generates discourse connectives for the task, inspired by the annotation process of the Penn Discourse Treebank (PDTB). Specifically, our model jointly learns to generate a discourse connective between two arguments and to predict the discourse relation from the arguments and the generated connective. To ease training while reducing the discrepancy between training and inference, we introduce Scheduled Sampling into the joint learning. We evaluate our method on two benchmarks, PDTB 2.0 (English) and PCC (German), and find that our joint model outperforms various strong baselines.
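The joint training with Scheduled Sampling can be sketched roughly as follows. This is a minimal illustrative sketch, not the paper's implementation: the function names, the inverse-sigmoid decay schedule, and the dummy classifier interface are all assumptions introduced here for exposition.

```python
import math
import random

def gold_use_prob(step: int, k: float = 1000.0) -> float:
    """Inverse-sigmoid decay schedule (an assumed choice): the probability
    of feeding the gold-annotated connective to the relation classifier,
    starting near 1 and decaying toward 0 as training progresses."""
    return k / (k + math.exp(step / k))

def training_step(step, gold_connective, generate_connective, classify):
    """One joint-learning step under Scheduled Sampling.

    With probability p, condition the relation classifier on the annotated
    connective (teacher forcing); otherwise, condition it on the model's own
    generated connective, so that training gradually matches inference,
    where no gold connective is available.
    """
    p = gold_use_prob(step)
    if random.random() < p:
        connective = gold_connective          # teacher forcing
    else:
        connective = generate_connective()    # model's own prediction
    return classify(connective)
```

Early in training `gold_use_prob` is close to 1, so the classifier mostly sees annotated connectives; late in training it mostly sees generated ones, which mirrors the inference-time setting.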