Causal-Based Debiased Reasoning Method for Grounded Textual Entailment
Graphical Abstract
Abstract
Grounded textual entailment (GTE) requires an agent to determine the inference relation between a premise and a hypothesis sentence based on a given context. Although significant progress has been made in enhancing representation learning with contextual information, current methods overlook spurious correlations between the context and the input sentences, leading to poor model generalization and robustness. Moreover, existing debiasing techniques fail to fully consider the impact of contextual information on the inference process, resulting in inaccurate identification of spurious correlations. To address these issues, we propose a novel causal-based debiased reasoning method (CBDRM) that integrates causal inference while fully accounting for contextual information. Specifically, we first construct a causal graph through statistical analysis to accurately describe the relationships among the variables in the input data. We then compute the total causal effect of the input data on the prediction results using a biased pre-trained model. In addition, the direct causal effect caused by spurious correlations is estimated with counterfactual methods. By removing the direct causal effect from the total causal effect, CBDRM achieves unbiased prediction of inference relations. Furthermore, we take the impact of context into account and design a novel contrastive learning module to further improve the unbiased inference performance of CBDRM. Finally, extensive experiments on publicly available datasets demonstrate the superiority and effectiveness of the proposed CBDRM. In addition, we construct and release an unbiased GTE challenge set to promote related research.
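As a rough illustration of the debiasing step summarized above, the minimal sketch below subtracts a counterfactually estimated direct effect from the total effect of the full input. It is not the authors' implementation; all names (full_model, counterfactual_context, etc.) are hypothetical placeholders assumed for the example.

```python
# Minimal sketch of counterfactual debiasing: remove the direct (spurious)
# causal effect from the total causal effect before prediction.
# Assumes a model callable that maps (premise, hypothesis, context) to logits.
import torch

def debiased_logits(full_model, premise, hypothesis, context, counterfactual_context):
    """Return debiased logits = total effect - direct effect (hypothetical interface)."""
    # Total causal effect: prediction of the (possibly biased) model on the full input.
    total_effect = full_model(premise, hypothesis, context)            # [batch, num_labels]
    # Direct causal effect: prediction on a counterfactual input (e.g., a masked
    # or neutral context), so that only spurious correlations drive the output.
    direct_effect = full_model(premise, hypothesis, counterfactual_context)
    # Unbiased prediction: subtract the direct effect from the total effect.
    return total_effect - direct_effect
```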