Rethinking Loss Functions for Fact Verification
Yuta Mukobara, Yutaro Shigeto, Masashi Shimbo
Main Track: Factual Content in NLP (Oral Paper)
Session 2: Factual Content in NLP (Oral)
Conference Room: Marie Louise 1
Conference Time: March 18, 11:00-12:30 CET (Europe/Malta)
Abstract:
We explore loss functions for fact verification in the FEVER shared task. While cross-entropy loss is the standard objective for training verdict predictors, it fails to capture the heterogeneity among the FEVER verdict classes. In this paper, we develop two task-specific objectives tailored to FEVER. Experimental results confirm that the proposed objective functions outperform standard cross-entropy. Performance improves further when these objectives are combined with simple class weighting, which effectively counters the imbalance in the training data. The source code is available at https://github.com/yuta-mukobara/RLF-KGAT.
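The abstract attributes part of the improvement to simple class weighting applied to the verdict-classification loss. The sketch below is only a minimal illustration of that general idea in PyTorch, not the paper's proposed task-specific objectives or its actual implementation; the class order, the made-up counts, and the inverse-frequency weighting scheme are all assumptions for illustration.

```python
import torch
import torch.nn as nn

# FEVER verdict classes; the index order here is an assumption for illustration.
CLASSES = ["SUPPORTED", "REFUTED", "NOT ENOUGH INFO"]

# Illustrative (made-up) per-class counts standing in for the actual
# training-set frequencies; substitute the real FEVER statistics.
class_counts = torch.tensor([8.0, 3.0, 4.0])

# Inverse-frequency weights, normalized so they average to 1:
# rarer classes receive a proportionally larger weight.
weights = class_counts.sum() / (len(CLASSES) * class_counts)

# Class-weighted cross-entropy: PyTorch's built-in way to apply per-class weights.
criterion = nn.CrossEntropyLoss(weight=weights)

# Dummy batch: 4 claims with one logit per verdict class, plus gold verdict indices.
logits = torch.randn(4, len(CLASSES))
labels = torch.tensor([0, 2, 1, 0])

loss = criterion(logits, labels)
print(loss.item())
```

The weighting simply rescales each training example's contribution according to its gold class, which is what "simple class weighting" usually denotes; the paper's two task-specific objectives are separate modifications and are not shown here.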