Model Card for mofaus/mofaus-lingua-mt5-small-v1

This model is a fine-tuned version of google/mt5-small for translation between Hausa and English. It was trained on the mofaus/hausa-english-v1 dataset.

(Model card updated on Oct 24, 2025 to trigger API refresh)

Model Details

Model Description

This is a transformers model fine-tuned using the google/mt5-small checkpoint on a Hausa-English parallel corpus. It is intended for translation tasks between these two languages.

  • Developed by: mofaus
  • Funded by [optional]: [More Information Needed]
  • Shared by [optional]: [More Information Needed]
  • Model type: mt5 (Encoder-Decoder)
  • Language(s) (NLP): Hausa (ha), English (en)
  • License: apache-2.0 (Inherited from mt5-small)
  • Finetuned from model: google/mt5-small

Uses

Direct Use

This model is intended for direct use in translating text between Hausa and English. Prefix the input text with "translate Hausa to English: " or "translate English to Hausa: " as appropriate.

# Example Usage (requires transformers library)
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("mofaus/mofaus-lingua-mt5-small-v1")
model = AutoModelForSeq2SeqLM.from_pretrained("mofaus/mofaus-lingua-mt5-small-v1")

# Hausa to English
input_text_ha = "translate Hausa to English: Yaya kake?"
inputs_ha = tokenizer(input_text_ha, return_tensors="pt")
# Cap generation length so longer inputs are not silently truncated by the default limit
outputs_ha = model.generate(**inputs_ha, max_new_tokens=64)
print(tokenizer.decode(outputs_ha[0], skip_special_tokens=True))

# English to Hausa
input_text_en = "translate English to Hausa: Good morning"
inputs_en = tokenizer(input_text_en, return_tensors="pt")
outputs_en = model.generate(**inputs_en, max_new_tokens=64)
print(tokenizer.decode(outputs_en[0], skip_special_tokens=True))
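Because the model only responds correctly when the input carries the exact task prefix it was fine-tuned with, a small helper can guard against typos in the prefix. This is a hypothetical convenience function (`build_translation_input` is not part of the released code), shown here as a sketch:

```python
# Hypothetical helper for building the task prefix this model expects.
SUPPORTED_DIRECTIONS = {("Hausa", "English"), ("English", "Hausa")}

def build_translation_input(text: str, src: str, tgt: str) -> str:
    """Prepend the 'translate <src> to <tgt>:' prefix used at fine-tuning time."""
    if (src, tgt) not in SUPPORTED_DIRECTIONS:
        raise ValueError(f"Unsupported translation direction: {src} -> {tgt}")
    return f"translate {src} to {tgt}: {text}"

# Example: build_translation_input("Yaya kake?", "Hausa", "English")
# returns "translate Hausa to English: Yaya kake?"
```

The resulting string can then be passed to the tokenizer exactly as in the example above.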
Model size: 0.3B parameters (Safetensors, F32 tensors)