Improving the chat user experience by streaming responses from LLMs and AI Agents. Now available for developers to implement with the Ouro API.
Hey everyone, I'm excited to share that we've added support for streamed chat responses. This is particularly relevant for LLM and AI Agent responses.
Previously, when you had a conversation with an AI on the platform, you had to wait for the LLM to generate the entire message before you saw any of it. That's not great UX, and users now expect better: chat apps like ChatGPT and Claude stream their responses, so not having streaming put us at a real disadvantage.
Now, responses are streamed in real time, word by word, as the LLM generates them. You get an almost immediate response and can read the message while it's still being written.
My favorite part is that we can still render structured text like lists, headings, and code blocks, even while the message is streaming in. As we improve our Agents and others add their own, this will only get better.
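To illustrate the idea (this isn't Ouro's actual rendering code), here's a minimal sketch of re-rendering the accumulated text as Markdown each time a new chunk arrives. The `renderMarkdown` helper and the simulated chunk source are placeholders.

```typescript
// Minimal sketch: re-render the accumulated text as Markdown on every chunk.
// `renderMarkdown` is a placeholder for whatever Markdown renderer you use;
// the generator below just simulates tokens arriving from a stream.

function renderMarkdown(markdown: string): void {
  // Placeholder: a real app would parse the Markdown and update the DOM.
  console.log("\n--- render ---\n" + markdown);
}

async function renderStreamingMessage(chunks: AsyncIterable<string>): Promise<void> {
  let accumulated = "";
  for await (const chunk of chunks) {
    accumulated += chunk;
    // Re-parsing the full accumulated text keeps lists, headings, and
    // code blocks well-formed even while the message is incomplete.
    renderMarkdown(accumulated);
  }
}

// Simulated stream of tokens for demonstration.
async function* fakeChunks(): AsyncGenerator<string> {
  for (const piece of ["## Str", "eaming", " demo\n", "- item one\n", "- item two\n"]) {
    yield piece;
  }
}

renderStreamingMessage(fakeChunks());
```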
Remember, every new feature we add to the platform is also available to developers. Devs building on Ouro can now create chat experiences on par with the best players in the game.
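If you're building on the API, consuming a streamed response can look something like the sketch below. The endpoint URL, headers, and request body are illustrative placeholders rather than the documented Ouro API, so check the API docs for the real shapes.

```typescript
// Hypothetical sketch of reading a streamed chat response over HTTP.
// The URL, headers, and request body are placeholders, not the documented Ouro API.

async function streamChatReply(message: string, apiKey: string): Promise<string> {
  const response = await fetch("https://api.example.com/chat", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({ message }),
  });

  if (!response.ok || !response.body) {
    throw new Error(`Request failed: ${response.status}`);
  }

  const reader = response.body.getReader();
  const decoder = new TextDecoder();
  let fullText = "";

  // Read the body incrementally and handle each decoded chunk as it arrives,
  // updating the UI (here, stdout) instead of waiting for the whole message.
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    const chunk = decoder.decode(value, { stream: true });
    fullText += chunk;
    process.stdout.write(chunk);
  }

  return fullText;
}
```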
You can try it now by messaging . More of Ouro's Agents will be upgraded to streaming soon too.