How to use ai2lumos/lumos_unified_ground_iterative with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="ai2lumos/lumos_unified_ground_iterative")
```
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("ai2lumos/lumos_unified_ground_iterative")
model = AutoModelForCausalLM.from_pretrained("ai2lumos/lumos_unified_ground_iterative")
```
The model's generation config:

```json
{
  "_from_model_config": true,
  "bos_token_id": 1,
  "eos_token_id": 2,
  "pad_token_id": 0,
  "transformers_version": "4.37.2"
}
```
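As a minimal sketch of how these defaults are consumed, the same values can be loaded into a Transformers `GenerationConfig` object, which is what `model.generate()` reads its defaults from. The dictionary literal below simply copies the config above; it does not require downloading the model.

```python
from transformers import GenerationConfig

# Build a GenerationConfig from the same values as the config file above.
config = GenerationConfig.from_dict({
    "_from_model_config": True,
    "bos_token_id": 1,
    "eos_token_id": 2,
    "pad_token_id": 0,
    "transformers_version": "4.37.2",
})

# The special-token ids used during generation.
print(config.bos_token_id, config.eos_token_id, config.pad_token_id)
# prints: 1 2 0
```

In practice you rarely construct this by hand: `model.generate()` picks these values up automatically from the checkpoint's `generation_config.json`.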