gmongaras/Softmax_Attention_GPT_300M

Feature Extraction • PyTorch • gptj
This repository contains the GPT-Soft-300m model described in Cottention: Linear Transformers With Cosine Attention.
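
A minimal loading sketch with Hugging Face Transformers, assuming the checkpoint resolves through the standard gptj architecture the tags indicate (whether any custom code is needed is not stated on this card):

```python
import torch
from transformers import AutoModel, AutoTokenizer

model_id = "gmongaras/Softmax_Attention_GPT_300M"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id, torch_dtype=torch.float16)
model.eval()

# Feature extraction: run a prompt through the model and take the hidden states.
inputs = tokenizer("Cosine attention makes transformers linear.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)
features = outputs.last_hidden_state  # (batch, seq_len, hidden_dim)
print(features.shape)
```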

Collection including gmongaras/Softmax_Attention_GPT_300M

Cosine Attention (Cottention)
Models for the paper Cottention: Linear Transformers With Cosine Attention (https://arxiv.org/abs/2409.18747) • 6 items • Updated Oct 7, 2024

Paper for gmongaras/Softmax_Attention_GPT_300M

Cottention: Linear Transformers With Cosine Attention
arXiv:2409.18747 • Published Sep 27, 2024
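
The card does not describe the mechanisms themselves, so the following is only a generic conceptual sketch (not the authors' code) of the two variants the paper contrasts: standard softmax attention, which this 300M baseline model uses, and cosine attention, where queries and keys are L2-normalized and the softmax is dropped so the matmuls can be reordered into linear time:

```python
import torch
import torch.nn.functional as F

def softmax_attention(q, k, v):
    # Standard scaled dot-product attention: quadratic in sequence length.
    scores = q @ k.transpose(-2, -1) / q.shape[-1] ** 0.5
    return torch.softmax(scores, dim=-1) @ v

def cosine_attention(q, k, v):
    # Normalize so q·k is a cosine similarity, then reorder the products:
    # (Q K^T) V == Q (K^T V), which is linear in sequence length.
    # (Causal masking needs a cumulative formulation, omitted here.)
    q = F.normalize(q, dim=-1)
    k = F.normalize(k, dim=-1)
    return q @ (k.transpose(-2, -1) @ v)

q = k = v = torch.randn(1, 8, 16)
print(softmax_attention(q, k, v).shape, cosine_attention(q, k, v).shape)
```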