ByteDance / Ouro-2.6B-Thinking

Text Generation · Transformers · Safetensors · ouro · looped-language-model · reasoning · recurrent-depth · thinking · chain-of-thought · conversational · custom_code
Community (8)

  • Fix UniversalTransformerCache.get_mask_sizes for batched generation (#8, 1 comment, opened 2 days ago by KristianS7)

  • Fix bos/eos token IDs + add enable_thinking to chat template (#7, 2 comments, opened 3 days ago by KristianS7)

  • rope_type='default' excluded from ROPE_INIT_FUNCTIONS in transformers >=5.0 (#6, opened 3 days ago by sirorezka)

  • Added 'pad_token_id' (#5, opened 3 days ago by sirorezka)

  • Updated ids for bos_id, eos_id (#4, 1 comment, opened 3 days ago by sirorezka)

  • Update tokenizer bos_token, eos_token, pad_token (#3, opened 3 days ago by sirorezka)

  • OuroRotaryEmbedding: ROPE_INIT_FUNCTIONS does not contain a 'default' key (#2, opened about 2 months ago by bitsydarel)
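
Discussions #2 and #6 describe the same failure mode: the model's custom rotary-embedding code looks up rope_type='default' in transformers' ROPE_INIT_FUNCTIONS table, and in the reported transformers versions that key is absent, so model loading fails with a bare KeyError. A minimal sketch of a more defensive lookup follows; the `pick_rope_init` helper and the stand-in table are hypothetical illustrations, not the model's actual code or the fix adopted in these discussions.

```python
# Hedged sketch of the failure mode in #2 and #6: a dict lookup of
# rope_type='default' in a table that may not contain that key.
# `table` stands in for transformers' ROPE_INIT_FUNCTIONS
# (rope_type name -> initializer function); the helper is hypothetical.

def pick_rope_init(table, rope_type="default"):
    """Look up a RoPE init function, failing with a readable error.

    Instead of letting a bare KeyError surface during model load,
    report which rope_type was requested and which are available.
    """
    try:
        return table[rope_type]
    except KeyError:
        raise KeyError(
            f"rope_type {rope_type!r} not in ROPE_INIT_FUNCTIONS; "
            f"available: {sorted(table)}"
        ) from None
```

Custom modeling code that wraps the lookup this way turns a cryptic load-time crash into an error message that points directly at the version mismatch.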