For an app that I'm working on[1], I was using LangChain's Ollama integration and it was a headache. Things wouldn't work as documented; the docs were either wrong or outdated. When I'd had enough, I replaced it with the official Ollama JS library, and so far it has been a great experience. It just works, without any ceremony.
Your app looks great! Like a more user-friendly LM Studio. LM Studio focuses more on the models and their technical details, whereas yours looks more like a workspace-driven UX for actually using different models, retaining chats, and so on. Nice work!
Question if you don't mind: How do you make the Ollama server opaque to the end user? Is it just installed during your app's installation, and your app manages the Ollama process as needed?
There's an official Ollama JavaScript library for the non-Python AI folks! Node people: now is your chance to experiment with local open models from the comfort of a client library :)
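For anyone curious what that looks like, here's a minimal sketch of a chat call. It talks to a local Ollama server over its REST endpoint using plain `fetch` (no dependencies); the official `ollama` npm package's `ollama.chat()` wraps this same endpoint and takes the same model/messages shape. The model name (`llama3`) and default port (11434) are assumptions: you need Ollama running locally with that model already pulled.

```javascript
// Build the request body in the shape the /api/chat endpoint
// (and the official library's ollama.chat()) expects.
function buildChatRequest(model, prompt) {
  return {
    model,
    messages: [{ role: 'user', content: prompt }],
    stream: false, // ask for a single JSON response instead of a stream
  }
}

// Send one chat turn to a local Ollama server and return the reply text.
// Assumes Ollama is listening on its default port, 11434.
async function chat(model, prompt) {
  const res = await fetch('http://localhost:11434/api/chat', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(buildChatRequest(model, prompt)),
  })
  const data = await res.json()
  return data.message.content // the assistant's reply
}

// Usage (with a server running): chat('llama3', 'Why is the sky blue?').then(console.log)
```

No SDK install, no chain abstractions; with the official library it's the same two-line call, just `import ollama from 'ollama'` first.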
[1]: https://msty.app