Describe the bug
When implementing a custom AI proxy for BlockNote using a Kotlin/Ktor backend that connects to the OpenAI API, the editor's AI functionality fails with the error: `Error calling LLM Error: No operations seen`.
This occurs when BlockNote communicates with the custom AI proxy endpoint. The proxy server correctly implements an OpenAI-compatible streaming response, returning Server-Sent Events (SSE) with a `text/event-stream` content type.
The error suggests that while the connection is successful, BlockNote's client-side module does not recognize the response format. The core issue seems to be a mismatch between the standard OpenAI stream format and the specific "operation" format that BlockNote expects to receive.
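For reference, the proxy route is roughly equivalent to the sketch below (the `/ai` path, the port, and the `OPENAI_API_KEY` environment variable are illustrative placeholders, not the exact code from my project). It forwards the editor's request to the OpenAI chat-completions endpoint and relays the SSE lines back unchanged, i.e. in the standard OpenAI chunk format rather than any BlockNote-specific "operation" format:

```kotlin
// Minimal sketch of the proxy, assuming a "/ai" route and an
// OPENAI_API_KEY environment variable (both placeholder names).
import io.ktor.client.*
import io.ktor.client.engine.cio.*
import io.ktor.client.request.*
import io.ktor.client.statement.*
import io.ktor.http.*
import io.ktor.server.application.*
import io.ktor.server.engine.*
import io.ktor.server.netty.*
import io.ktor.server.request.*
import io.ktor.server.response.*
import io.ktor.server.routing.*
import io.ktor.utils.io.*

fun main() {
    val client = HttpClient(CIO)
    val apiKey = System.getenv("OPENAI_API_KEY")

    embeddedServer(Netty, port = 8080) {
        routing {
            post("/ai") {
                // Forward the editor's request body to OpenAI as-is.
                val requestBody = call.receiveText()
                client.preparePost("https://api.openai.com/v1/chat/completions") {
                    header(HttpHeaders.Authorization, "Bearer $apiKey")
                    contentType(ContentType.Application.Json)
                    setBody(requestBody)
                }.execute { upstream ->
                    // Relay each SSE line ("data: {...}") back to the browser unchanged,
                    // so the editor receives standard OpenAI chat-completion chunks.
                    call.respondTextWriter(contentType = ContentType.Text.EventStream) {
                        val channel = upstream.bodyAsChannel()
                        while (true) {
                            val line = channel.readUTF8Line() ?: break
                            write(line + "\n")
                            flush()
                        }
                    }
                }
            }
        }
    }.start(wait = true)
}
```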
To Reproduce
- Set up a backend server (e.g., using Kotlin/Ktor) to act as an AI proxy.
- Configure the server to forward requests to the OpenAI API and stream the response back.
- Implement the BlockNote editor in a frontend application.
- Configure the BlockNote AI client to use the custom proxy URL.
- Attempt to use any AI feature in the BlockNote editor (e.g., text completion).
- Observe the error `Error calling LLM Error: No operations seen` in the browser's developer console.
Misc
- Node version: v23.6.1
- Package manager: npm
- Browser: Version 137.0.7151.69 (Official Build) (arm64)