How to use hongyin/informer-0.3b-80k with Transformers:
```python
# Load the tokenizer and model directly from the Hugging Face Hub
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("hongyin/informer-0.3b-80k")
model = AutoModelForCausalLM.from_pretrained("hongyin/informer-0.3b-80k")
```
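Once loaded, the model can be used for standard causal-LM text generation. A minimal sketch follows; the prompt and generation parameters are illustrative, not prescribed by the model card:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("hongyin/informer-0.3b-80k")
model = AutoModelForCausalLM.from_pretrained("hongyin/informer-0.3b-80k")

# Tokenize an example prompt and generate a short continuation
inputs = tokenizer("Hello, my name is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Sampling strategies, temperature, and other decoding options can be passed to `generate` as usual for Transformers causal language models.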