What’s broken?
One of our power users gave us legitimate feedback:
"Avoid simulating streaming; it is too time-consuming for large texts."
When the AI reply is very long (generation or selection), the current UI, where you watch the AI cursor typing text into the editor, is not a good fit because it takes too long to get the result.
Sure, you can adjust the delay, but that may still not be comfortable for users.
What did you expect to happen?
When the AI has to update or generate a large chunk of text, it would be nice to output it all at once or in large chunks.
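To illustrate the expectation, here is a minimal sketch of the "large chunks" idea: instead of typing each streamed character into the editor, buffer incoming chunks and flush them in large batches. This is purely hypothetical — `applyToEditor` and `minFlushSize` are stand-ins, not BlockNote API; the actual integration point would be wherever BlockNote applies streamed AI output.

```typescript
// Hypothetical sketch: buffer streamed AI text and apply it to the
// editor in large batches rather than character by character.
function createBatchedWriter(
  applyToEditor: (text: string) => void, // stand-in for the editor update call
  minFlushSize = 512, // assumed threshold: flush once this many chars accumulate
) {
  let buffer = "";
  return {
    // Called for each small chunk arriving from the AI stream.
    write(chunk: string) {
      buffer += chunk;
      if (buffer.length >= minFlushSize) {
        applyToEditor(buffer);
        buffer = "";
      }
    },
    // Called when the stream ends; flushes whatever remains.
    end() {
      if (buffer.length > 0) {
        applyToEditor(buffer);
        buffer = "";
      }
    },
  };
}
```

With this, the editor would see a handful of large updates instead of hundreds of tiny ones, so a long reply lands almost immediately while short replies still stream normally.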
Steps to reproduce
I've done a bit of benchmarking between Notion and BlockNote AI to show the difference between the approaches on several use cases:
- translate a large English text into French
- make a large text shorter
- continue writing

Always using the same text sample:
https://fichiers.numerique.gouv.fr/explorer/items/2dadfbd6-170b-45da-ab42-d15832e9f978
BlockNote version
No response
Environment
No response
Additional context
No response
Contribution
Sponsor