I’ve been building MinimalChat for a while now, and based on the feedback I’ve received, it’s in a pretty decent place for general use. I figured I’d share it here for anyone who might be interested!
Quick Features Overview:
- Mobile PWA Support: Install the site like a normal app on any device.
- Any OpenAI formatted API support: Works with LM Studio, OpenRouter, etc.
- Local Storage: All data is stored locally in the browser with minimal setup. In Docker, just enter a port and go.
- Experimental Conversational Mode (GPT Models for now)
- Basic File Upload and Storage Support: Files are stored locally in the browser.
- Vision Support with Maintained Context
- Regen/Edit Previous User Messages
- Swap Models Anytime: Maintain conversational context while switching models.
- Set/Save System Prompts: Set the system prompt, and saved prompts are kept in a list so you can switch between them easily.
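For anyone curious what "any OpenAI-formatted API" means in practice: compatible backends (LM Studio, OpenRouter, etc.) all accept the same chat-completions payload, so swapping backends is just swapping a base URL. A minimal sketch of that payload shape, where the base URL and model id are placeholders, not values from MinimalChat itself (LM Studio's local server defaults to port 1234):

```python
import json
import urllib.request

# Hypothetical endpoint -- LM Studio's default local server port is 1234;
# OpenRouter and other compatible providers expose the same
# /v1/chat/completions route under their own base URL.
BASE_URL = "http://localhost:1234/v1"

def build_chat_request(model, messages):
    """Build an OpenAI-format chat completion request body."""
    return {"model": model, "messages": messages}

payload = build_chat_request(
    "local-model",  # placeholder model id
    [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
)

# Sending it is identical for any OpenAI-compatible server (not executed here):
# req = urllib.request.Request(
#     f"{BASE_URL}/chat/completions",
#     data=json.dumps(payload).encode(),
#     headers={"Content-Type": "application/json"},
# )
```

Because the request body is the same everywhere, maintaining conversational context across a model swap is just resending the accumulated `messages` list with a different `model` value.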
The idea is to make it essentially foolproof to deploy and set up while being generally full-featured and aesthetically pleasing. No additional databases or servers are needed; everything is contained and managed locally inside the web app itself.
It’s another chat client in a sea of clients, but in my opinion it’s unique in its own ways. Enjoy! Feedback is always appreciated!
Self-Hosting Wiki section: https://github.com/fingerthief/minimal-chat/wiki/Self-Hosting-With-Docker
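Since everything lives in the browser, a self-hosted deployment generally reduces to mapping a single port. A rough sketch of what that looks like; the image name and ports here are illustrative assumptions, so check the wiki above for the authoritative command:

```shell
# Image name and port mapping are placeholders -- see the self-hosting wiki
# for the actual values used by the project.
docker run -d \
  --name minimal-chat \
  -p 8080:8080 \
  minimal-chat:latest

# Then open http://localhost:8080 in a browser and, optionally,
# install it as a PWA from there.
```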
I wrote this: https://github.com/muntedcrocodile/Sydney
I've been having decent success with it, but I still need to implement proper document embedding.