ModernBERT-base-critical-question-binary-classifier

This model is a fine-tuned version of [answerdotai/ModernBERT-base](https://huggingface.co/answerdotai/ModernBERT-base) for binary classification of critical questions about an argument (intervention).

It takes an intervention (the argument) and a critical question about that intervention, and predicts whether the question is Useful or Non-Useful.

This binary version collapses the original 3 labels

  • Useful
  • Unhelpful
  • Invalid

into

  • Useful (1)
  • Non-Useful (0) ← covers both Unhelpful and Invalid (see the mapping sketch below this list)
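
If you want to reproduce this collapsing step on the original 3-label data, it is a simple mapping. A minimal sketch, assuming the original labels appear as the strings "Useful", "Unhelpful", and "Invalid" (the helper name is illustrative):

# Map the original 3-way labels onto the binary scheme used by this model.
LABEL_TO_BINARY = {
    "Useful": 1,
    "Unhelpful": 0,  # collapsed into Non-Useful
    "Invalid": 0,    # collapsed into Non-Useful
}

def to_binary(label: str) -> int:
    """Return 1 for Useful, 0 for Non-Useful (Unhelpful or Invalid)."""
    return LABEL_TO_BINARY[label]

print(to_binary("Invalid"))  # 0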

The model was trained on long inputs, so the input format and the truncation settings (truncation=True, max_length=2048 in the examples below) matter.
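
Before classifying, you can check how many tokens a formatted input produces and whether it will be truncated. A minimal sketch, assuming the model's own tokenizer and the same 2048-token limit used in the usage example below; the sample text is illustrative:

from transformers import AutoTokenizer

model_id = "MidhunKanadan/ModernBERT-base-critical-question-binary-classifier"
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Expected format: "Intervention: <argument> [SEP] Critical Question: <question>"
text = "Intervention: We need new jobs with rising incomes. [SEP] Critical Question: Is this practically possible?"

n_tokens = len(tokenizer(text)["input_ids"])
if n_tokens > 2048:
    print(f"Input is {n_tokens} tokens and will be truncated to 2048.")
else:
    print(f"Input fits: {n_tokens} tokens.")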


Usage

from transformers import pipeline

model_id = "MidhunKanadan/ModernBERT-base-critical-question-binary-classifier"

classifier = pipeline(
    "text-classification",
    model=model_id,
    truncation=True,   # truncate inputs that exceed max_length
    max_length=2048,   # the model was trained on long inputs, so keep this limit high
    top_k=None         # return scores for both labels instead of only the top one
)

intervention = "CLINTON: \"The central question in this election is really what kind of country we want to be and what kind of future we 'll build together\nToday is my granddaughter 's second birthday\nI think about this a lot\nwe have to build an economy that works for everyone , not just those at the top\nwe need new jobs , good jobs , with rising incomes\nI want us to invest in you\nI want us to invest in your future\njobs in infrastructure , in advanced manufacturing , innovation and technology , clean , renewable energy , and small business\nmost of the new jobs will come from small business\nWe also have to make the economy fairer\nThat starts with raising the national minimum wage and also guarantee , finally , equal pay for women 's work\nI also want to see more companies do profit-sharing\"",
        
critical_question = "Could Clinton investing in you have consequences that we should take into account? Is it practically possible?"

text = f"Intervention: {intervention} [SEP] Critical Question: {critical_question}"

results = classifier(text)[0]  # with top_k=None, each input yields a list of {label, score} dicts
for r in results:
    print(f"{r['label']}: {r['score']:.4f}")