Zero-Shot Cross-Lingual Discrete Reasoning

Discrete reasoning, including addition, subtraction, counting, sorting, etc., remains a challenging task in machine reading comprehension (MRC). In addition, the lack of parallel MRC data in languages other than English has led to growing research interest in cross-lingual transfer learning. Building on studies from both sides, we tackle the task of zero-shot cross-lingual discrete reasoning using the DROP dataset and its manual translations into German and Chinese, and show that 1) a multilingual BERT model can be configured to solve discrete reasoning tasks, and 2) the knowledge of discrete reasoning transfers cross-lingually to German and Chinese to a certain extent, even without any parallel training data.
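To give a concrete sense of the arithmetic side of discrete reasoning: DROP-style models (e.g. NAQANet) typically handle addition and subtraction by predicting a sign in {+1, -1, 0} for each number mentioned in the passage, so the answer is a signed sum over passage numbers. The sketch below (function name and structure are ours, not from this repo) enumerates such sign assignments in pure Python:

```python
from itertools import product

def arithmetic_expressions(numbers):
    """Map each reachable value to one sign assignment over the
    passage numbers, mimicking a DROP-style arithmetic answer head
    where every number gets a sign in {+1, -1, 0}."""
    exprs = {}
    for signs in product((1, -1, 0), repeat=len(numbers)):
        value = sum(s * n for s, n in zip(signs, numbers))
        # Keep the first assignment found for each value.
        exprs.setdefault(value, signs)
    return exprs

# Example: numbers extracted from a passage, target answer 7.
exprs = arithmetic_expressions([12, 5, 3])
print(exprs[7])  # a sign assignment whose signed sum equals 7
```

In a trained model the sign assignment is predicted by a classification head over number representations rather than enumerated, but the supervision is derived from exactly this kind of search over expressions matching the gold answer.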