Climate Fallacy Detector (DeBERTa-v2-xlarge)

A fine-tuned DeBERTa-v2-xlarge model for detecting logical fallacies in climate misinformation.


This model is a fine-tuned version of microsoft/deberta-v2-xlarge on the FLICC taxonomy dataset. It detects logical fallacies in climate misinformation claims (e.g., Ad Hominem, False Equivalence, Fake Experts).

It reproduces the results of the paper "Detecting Fallacies in Climate Misinformation: A Technocognitive Approach" (Zanartu et al., 2024), achieving comparable performance on consumer hardware (Apple M-series).

Performance

  • Test F1 Score (weighted / macro): 0.69 / 0.68
  • Validation Accuracy: 0.72
  • Test Precision: 0.73
  • Test Recall: 0.69
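The weighted and macro F1 scores differ because macro averaging treats every fallacy class equally, while weighted averaging scales each class's F1 by its frequency in the test set. A minimal sketch of the difference, using toy labels (illustrative stand-ins, not the actual FLICC test data):

```python
from collections import Counter

def per_class_f1(y_true, y_pred, label):
    """F1 for a single class, computed from true/false positives and false negatives."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == p == label)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != label and p == label)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == label and p != label)
    prec = tp / (tp + fp) if tp + fp else 0.0
    rec = tp / (tp + fn) if tp + fn else 0.0
    return 2 * prec * rec / (prec + rec) if prec + rec else 0.0

# Toy labels standing in for fallacy classes (hypothetical example data)
y_true = ["ad_hominem", "ad_hominem", "fake_experts", "ad_hominem", "fake_experts", "slothful"]
y_pred = ["ad_hominem", "fake_experts", "fake_experts", "ad_hominem", "fake_experts", "ad_hominem"]

labels = sorted(set(y_true))
f1s = {lbl: per_class_f1(y_true, y_pred, lbl) for lbl in labels}
support = Counter(y_true)

macro = sum(f1s.values()) / len(labels)                             # each class counts equally
weighted = sum(f1s[l] * support[l] for l in labels) / len(y_true)   # scaled by class frequency
print(f"macro F1: {macro:.2f}, weighted F1: {weighted:.2f}")
# → macro F1: 0.49, weighted F1: 0.60
```

With imbalanced classes (common in fallacy taxonomies), a rare class the model handles poorly drags macro F1 down more than weighted F1, which is why the card reports both.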

How to Use

from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

model_name = "Gaanaman/deberta-v2-xlarge-climate-fallacy"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)

text = "The climate has changed before naturally, so humans aren't causing it now."

inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

predicted_class_id = logits.argmax().item()
print(model.config.id2label[predicted_class_id])
# Output: Slothful Induction (or similar)
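The snippet above returns only the top label. To inspect the model's confidence across all fallacy classes, apply a softmax to the logits. A minimal sketch using placeholder logits (real values would come from `model(**inputs).logits` above; the class count here is illustrative):

```python
import torch

# Placeholder logits for a 5-class head (not real model output)
logits = torch.tensor([[2.1, 0.3, -1.2, 0.8, -0.5]])

# Softmax converts raw logits into a probability distribution over classes
probs = torch.softmax(logits, dim=-1)

# Report the top-3 classes by confidence; with the real model, map indices
# through model.config.id2label to get fallacy names
topk = torch.topk(probs, k=3, dim=-1)
for p, idx in zip(topk.values[0].tolist(), topk.indices[0].tolist()):
    print(f"class {idx}: {p:.3f}")
```

Inspecting the full distribution is useful here because several FLICC fallacies overlap conceptually, so a near-tie between the top two classes is worth surfacing.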
Model size: 0.9B parameters (F32, Safetensors)