Nova•E¹

Nova•E¹ is a 22B-parameter Large Language Model (LLM) specialized in coding, logical reasoning ("thinking"), and mathematics. It is designed for local execution, with a focus on English, Persian, and German.

Model Card: DreamhubAI/Nova-E-1


✨ Key Features & Capabilities

  • Core Specializations: Excels at code generation, complex problem-solving, and mathematical reasoning.
  • Advanced Reasoning: Incorporates "thinking" capabilities for step-by-step logic and well-considered answers.
  • Multilingual Support: Optimized for English, Persian (فارسی), and German (Deutsch).
  • Accessible Deployment: Available in two formats for different hardware setups:
    • Full Precision (BF16): The full 22B parameter model for maximum accuracy.
    • 8-bit Quantized: A memory-optimized version for easier local execution.
  • Efficient Fine-Tuning: Compatible with the Unsloth library for fast, memory-efficient training (see the fine-tuning sketch after this list).
  • Permissive License: Released under the Apache 2.0 license.
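
The sketch below illustrates the kind of parameter-efficient fine-tuning the Unsloth compatibility enables. It is a minimal outline, not an official recipe: the 4-bit loading option, sequence length, LoRA rank, and target module names are assumptions that may need adjusting for this architecture.

from unsloth import FastLanguageModel

# Load the model with Unsloth; 4-bit loading keeps the memory footprint small
# during training (assumes Unsloth supports this checkpoint).
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="DreamhubAI/Nova-E-1",
    max_seq_length=2048,
    load_in_4bit=True,
)

# Attach LoRA adapters so only a small fraction of the weights are updated.
# The target module names below are typical attention projections and are an
# assumption; verify them against the model's actual layer names.
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    lora_alpha=16,
    lora_dropout=0.0,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
)

The resulting PEFT model can then be passed to a standard trainer such as TRL's SFTTrainer.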

🚀 Download and Usage

The primary model and its smaller variant are hosted on Hugging Face.

Main Model (22B)

  • Repository (Full/8-bit): DreamhubAI/Nova-E-1
  • Hugging Face Transformers is the main framework for loading and using these models (an 8-bit loading example is sketched below).
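
If the repository's 8-bit weights do not load directly in your environment, one alternative is on-the-fly 8-bit quantization with bitsandbytes. This is a sketch under that assumption, not the repository's documented loading path.

from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_name = "DreamhubAI/Nova-E-1"

# Quantize the BF16 weights to 8-bit at load time; this roughly halves the
# memory needed compared to loading in BF16.
quant_config = BitsAndBytesConfig(load_in_8bit=True)

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    quantization_config=quant_config,
    device_map="auto",  # requires the accelerate package; places layers on available GPUs
)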

Smaller Variant

System Requirements

For GPU Execution

  • A Tesla T4 GPU or a similar card with sufficient VRAM is recommended for good performance.

For CPU Execution

  • Running on CPU is possible but requires substantial system memory (RAM); see the loading sketch after this list.
  • Minimum Recommended RAM: 32 GB.
  • Recommended for Stable Operation: 64 GB.
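
These figures are consistent with simple weight-size arithmetic: roughly 22B parameters × 2 bytes ≈ 44 GB in BF16, or about 22 GB at 8 bits, before activations and runtime overhead. The snippet below is a minimal CPU-loading sketch; the dtype choice and memory flag are reasonable defaults, not documented requirements.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "DreamhubAI/Nova-E-1"

tokenizer = AutoTokenizer.from_pretrained(model_name)

# Keep the weights in bfloat16 (~44 GB) rather than upcasting to float32 (~88 GB),
# and avoid holding a second full copy in RAM while the checkpoint is loaded.
# With no device_map given, the model stays on the CPU.
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.bfloat16,
    low_cpu_mem_usage=True,
)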

Quick Start Example

You can load and use the model with the transformers library. The following is a basic example:

from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "DreamhubAI/Nova-E-1"  # or "DreamhubAI/Nova-E-1-8bit"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype="auto",  # keep the checkpoint's BF16 weights instead of upcasting to FP32
    device_map="auto",   # place the model on the available GPU(s), falling back to CPU
)

# Generate text
inputs = tokenizer("A programming function to calculate factorial:", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
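
Continuing from the snippet above, chat-style prompts (including the multilingual and step-by-step reasoning use cases described earlier) can be formatted with the tokenizer's chat template. Whether this checkpoint ships a chat template, and how its "thinking" output is exposed, is not documented here, so treat this as a sketch.

# German prompt: "Explain step by step why the sum of two even numbers is even."
messages = [
    {"role": "user", "content": "Erkläre Schritt für Schritt, warum die Summe zweier gerader Zahlen gerade ist."}
]

# Build the model's chat-formatted input (assumes the tokenizer defines a chat template).
input_ids = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=256)
# Strip the prompt tokens and print only the newly generated answer.
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))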

🛠️ Projects and Ecosystem

Nova•E User Interface (Web UI)

  • A dedicated interface for interacting with the model is available on GitHub.
  • Repository: DreamhubAI/WALL-E

👥 Community and Contribution

We welcome contributions and feedback from the community.

  • Primary GitHub Organization: DreamhubAI - hosts the main UI and related tools.
  • Contributor GitHub: unknownsv - for collaborative development.
  • Contact Email: For project-related inquiries, you can reach out at [email protected].

📜 License

This model is released under the Apache License 2.0, which allows broad use in both commercial and research applications with minimal restrictions. Please see the LICENSE file in the model repository for the full terms.


Thank you for your interest in Nova•E¹! We hope this model serves as a powerful tool for your development and research projects in coding, reasoning, and mathematics.

Model Tree for DreamhubAI/Nova-E-1

  • Base model: openai/gpt-oss-20b
  • Quantizations: 2 models