# GLiNER2 LoRA Adapter

This is a LoRA (Low-Rank Adaptation) adapter trained on top of `fastino/gliner2-large-v1`.
## Usage

```python
from gliner2 import GLiNER2

# Load the base model
model = GLiNER2.from_pretrained("fastino/gliner2-large-v1")

# Load the LoRA adapter
model.load_adapter("CHFLTM/gliner2-lora-custom")

# Extract entities
text = "Your text here"
labels = ["person", "organization", "location"]
entities = model.predict_entities(text, labels, threshold=0.5)
print(entities)
```
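The `threshold` argument keeps only predictions whose confidence score is at least the given value. A minimal sketch of that filtering step, using hypothetical prediction dicts (the field names `text`, `label`, and `score` are assumptions, not the library's documented output format):

```python
# Hypothetical predictions with confidence scores (illustrative only)
predictions = [
    {"text": "Alice", "label": "person", "score": 0.91},
    {"text": "Acme Corp", "label": "organization", "score": 0.42},
]

# Keep only predictions at or above the threshold, as threshold=0.5 would
threshold = 0.5
kept = [p for p in predictions if p["score"] >= threshold]
print(kept)  # only the "Alice" prediction survives
```

Lowering the threshold trades precision for recall: more candidate spans are returned, including lower-confidence ones.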
## Training

This adapter was trained using LoRA (Low-Rank Adaptation) with the following configuration:

- Base model: `fastino/gliner2-large-v1`
- LoRA rank: 16
- LoRA alpha: 32
- LoRA dropout: 0.1
- Target modules: encoder
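The hyperparameters above can be collected into a config, shown here as a plain dict (the key names follow common LoRA conventions, e.g. those of the `peft` library, and are an assumption, not the exact training script):

```python
# LoRA configuration from the list above (key names are assumed conventions)
lora_config = {
    "base_model": "fastino/gliner2-large-v1",
    "r": 16,               # LoRA rank
    "lora_alpha": 32,      # scaling numerator
    "lora_dropout": 0.1,
    "target_modules": ["encoder"],
}

# In standard LoRA, the low-rank update is scaled by alpha / r
scaling = lora_config["lora_alpha"] / lora_config["r"]
print(scaling)  # 2.0
```

With alpha = 32 and rank = 16, the adapter update is applied with an effective scaling factor of 2.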
## Model Details

- **Developed by:** [Your Name/Organization]
- **Model type:** Named Entity Recognition (NER)
- **Language:** [Your language]
- **License:** [Your license]