Fix UniversalTransformerCache.get_mask_sizes for batched generation · #8 opened 2 days ago by KristianS7 · 1 comment
Fix bos/eos token IDs + add enable_thinking to chat template · #7 opened 3 days ago by KristianS7 · 2 comments
rope_type='default' excluded from ROPE_INIT_FUNCTIONS in transformers >= 5.0 · #6 opened 3 days ago by sirorezka
Added 'pad_token_id' · #5 opened 3 days ago by sirorezka
Updated IDs for bos_id, eos_id · #4 opened 3 days ago by sirorezka · 1 comment
Update tokenizer bos_token, eos_token, pad_token · #3 opened 3 days ago by sirorezka
OuroRotaryEmbedding: ROPE_INIT_FUNCTIONS does not contain a 'default' key · #2 opened about 2 months ago by bitsydarel
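PRs #6 and #2 both report the same failure mode: the model code looks up `rope_type='default'` in transformers' ROPE_INIT_FUNCTIONS table, and newer transformers versions no longer register that key. A sketch of a guarded lookup with a local fallback; the dict below is a stand-in, not the real transformers table, and the fallback computes the standard RoPE inverse frequencies:

```python
def default_rope_init(base: float, dim: int):
    """Standard RoPE init: inv_freq[i] = base^(-2i/dim), i in 0..dim/2-1."""
    return [base ** (-2 * i / dim) for i in range(dim // 2)]

# Stand-in for transformers' table; imagine >=5.0 dropped the 'default' key.
ROPE_INIT_FUNCTIONS = {}

def get_rope_init(rope_type: str):
    """Resolve a rope_type, falling back for 'default' instead of raising."""
    fn = ROPE_INIT_FUNCTIONS.get(rope_type)
    if fn is None and rope_type == "default":
        fn = default_rope_init  # local fallback instead of a KeyError
    if fn is None:
        raise KeyError(f"Unknown rope_type: {rope_type}")
    return fn

inv_freq = get_rope_init("default")(base=10000.0, dim=8)
```

The design choice is to keep the table lookup as the primary path, so registered rope types still win, and only substitute the local implementation when the 'default' key itself has been removed.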