Scientific Table Fact-Checking

Abstract

Automated scientific fact-checking is essential for validating the accuracy of scientific claims, ensuring the integrity of information, and fostering public trust in research findings. However, existing benchmarks for scientific fact-checking suffer from limitations such as reliance on crowd-sourced claims and insufficient representation of quantitative experimental data. To address these gaps, we introduce SCITAB, a novel dataset of 1.2K challenging scientific claims, each accompanied by its original scientific table. Verifying these claims demands compositional reasoning over the tables, making the benchmark more representative of real-world scientific fact-checking.
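
To make the task concrete, the sketch below shows one way a SCITAB-style example (a table, a claim, and a verification label) might be represented and loaded. The field names, label set, and file name are illustrative assumptions for this sketch, not the dataset's published schema.

```python
import json
from dataclasses import dataclass
from typing import List


# Hypothetical record layout for a SCITAB-style example: the field names,
# label strings, and JSON Lines format below are assumptions made for
# illustration, not the dataset's official release format.
@dataclass
class SciTabExample:
    table_caption: str
    table: List[List[str]]   # table cells as rows of strings
    claim: str               # natural-language claim about the table
    label: str               # e.g. "supports" / "refutes" / "not enough info"


def load_examples(path: str) -> List[SciTabExample]:
    """Read SCITAB-style examples from a JSON Lines file (one object per line)."""
    examples = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            record = json.loads(line)
            examples.append(
                SciTabExample(
                    table_caption=record.get("table_caption", ""),
                    table=record["table"],
                    claim=record["claim"],
                    label=record["label"],
                )
            )
    return examples


if __name__ == "__main__":
    # "scitab_dev.jsonl" is a placeholder file name for this sketch.
    for ex in load_examples("scitab_dev.jsonl")[:3]:
        print(f"Claim: {ex.claim}\nLabel: {ex.label}\n")
```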

Project Members and Collaborators

Publications

SCITAB: A Challenging Benchmark for Compositional Reasoning and Claim Verification on Scientific Tables (Preprint’23)