fix: avoid repeated T5 EOS tokens in Anima prompt weights#1501

Merged
leejet merged 1 commit into master from fix-anima-t5-weight-tokenization on May 16, 2026
Conversation

@leejet (Owner) commented on May 16, 2026

Summary

Fixes Anima prompt weighting when prompts contain parenthesized weight syntax.

Anima tokenization split weighted prompts into multiple attention segments and then called T5 tokenization, with padding enabled, once per segment. That inserted a T5 EOS token after every segment, so the resulting t5_ids sequence contained EOS tokens mid-sequence. This sequence was then passed into the Anima LLM adapter, causing poor or broken generations for weighted prompts.

This changes Anima T5 token handling to:

  • encode each weighted segment without adding special tokens
  • add special tokens once after all segments are concatenated
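The two bullet points above amount to moving the special-token step out of the per-segment loop. The sketch below illustrates the before/after token layouts; it is a minimal illustration, not the actual anima.cpp code. `encode()`, `EOS_ID`, and the `(text, weight)` segment representation are hypothetical stand-ins for the real T5 tokenizer and weighted-segment structures.

```cpp
#include <algorithm>
#include <cassert>
#include <string>
#include <utility>
#include <vector>

// Hypothetical stand-ins for the real T5 tokenizer: encode() maps text to ids,
// EOS_ID is the end-of-sequence token appended when special tokens are added.
static const int EOS_ID = 1;

std::vector<int> encode(const std::string& text) {
    // Toy encoding for illustration: one id per character.
    std::vector<int> ids;
    for (char c : text) ids.push_back(100 + static_cast<int>(c));
    return ids;
}

using Segment = std::pair<std::string, float>;  // (text, attention weight)

// Buggy pattern: special tokens are added per segment, so an EOS token
// lands after every weighted segment, i.e. mid-sequence.
std::vector<int> tokenize_per_segment(const std::vector<Segment>& segments) {
    std::vector<int> ids;
    for (const auto& seg : segments) {
        std::vector<int> part = encode(seg.first);
        part.push_back(EOS_ID);  // EOS inserted after this segment
        ids.insert(ids.end(), part.begin(), part.end());
    }
    return ids;
}

// Fixed pattern: encode each segment without special tokens, concatenate,
// then append the special tokens exactly once at the end.
std::vector<int> tokenize_concat_then_eos(const std::vector<Segment>& segments) {
    std::vector<int> ids;
    for (const auto& seg : segments) {
        std::vector<int> part = encode(seg.first);  // no special tokens here
        ids.insert(ids.end(), part.begin(), part.end());
    }
    ids.push_back(EOS_ID);  // single EOS at the very end of the sequence
    return ids;
}
```

With a prompt such as `a (lovely) cat`, which splits into three weighted segments, the buggy path produces three EOS tokens while the fixed path produces exactly one, at the end.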

Related Issue / Discussion

This addresses the broken weighted prompt behavior reported in #1492.

Additional Information

```shell
.\bin\Release\sd-cli.exe --diffusion-model ..\..\ComfyUI\models\diffusion_models\anima-preview.safetensors --vae ..\..\ComfyUI\models\vae\qwen_image_vae.safetensors --llm ..\..\ComfyUI\models\text_encoders\qwen_3_06b_base.safetensors -p "a (lovely) cat holding a sign says 'anima.cpp'" --cfg-scale 6.0 --sampling-method euler -v --offload-to-cpu
```

before: (output image)

after: (output image)

Checklist

@leejet leejet merged commit d7ecbe1 into master May 16, 2026
14 checks passed
@leejet leejet deleted the fix-anima-t5-weight-tokenization branch May 16, 2026 13:47