A lightweight, minimalistic frontend built with React.js for interacting with the Ollama API. Designed with minimal dependencies for simplicity, speed, and easy customization.
- Minimal dependencies: built with React only
- Streamed conversations with Ollama models
- Persistent conversation history
- Markdown rendering with syntax highlighting
- Two modes: Chat and Completion
- User-defined system prompts: define or tweak the system prompt for better control
- Copy code blocks or entire messages easily
- Automatic title generation for conversations
- Exclusive reasoning component
- Clean, responsive UI
- React.js
- Tailwind CSS
- shadcn/ui
Clone the repository, install dependencies, and start the dev server:

```sh
git clone https://github.com/cushydigit/lumina.git
cd lumina
npm install
npm run dev
```
Alternatively, use Lumina with Docker. Build the Docker image:

```sh
docker build -t lumina .
```

Then run the container:

```sh
docker run -p 4173:4173 lumina
```
Make sure your Ollama server is running locally at http://localhost:11434. If your Ollama instance is running elsewhere, edit `API_BASE_URL` in `api.ts`:

```ts
const API_BASE_URL = "http://localhost:11434";
```
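For reference, Ollama's `/api/chat` endpoint streams newline-delimited JSON chunks, each carrying a `message.content` fragment. The sketch below is a hypothetical helper (not part of this repo) showing how such chunks might be accumulated into the assistant's reply text:

```typescript
// Hypothetical helper: accumulate NDJSON chunks from Ollama's
// /api/chat streaming response into the full assistant message.
// Each non-empty line is a JSON object like:
//   {"message":{"role":"assistant","content":"Hel"},"done":false}
interface ChatChunk {
  message?: { role: string; content: string };
  done: boolean;
}

function accumulateChunks(ndjson: string): string {
  return ndjson
    .split("\n")                                  // one JSON object per line
    .filter((line) => line.trim().length > 0)     // skip blank lines
    .map((line) => JSON.parse(line) as ChatChunk) // parse each chunk
    .map((chunk) => chunk.message?.content ?? "") // extract the text fragment
    .join("");                                    // concatenate fragments
}
```

In the actual UI you would call this incrementally from a `ReadableStream` reader on the `fetch` response body, appending each fragment to the rendered message as it arrives.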
This project is licensed under the MIT License.
Pull requests, suggestions, and feedback are welcome!
- Delete and Retry Messages: allow users to delete messages or retry sending failed messages.
- Model Pulling Support: UI for pulling, updating, and managing Ollama models directly.
- Conversation Pinning: pin important conversations to the top for quick access.
- Search Conversations: quickly search across conversations by keywords.
- Export / Import Conversations: allow users to back up and restore chats in JSON or Markdown format.
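The export idea above could be sketched like this. The `Message` and `Conversation` shapes here are illustrative assumptions, not the actual types in the codebase:

```typescript
// Hypothetical sketch of conversation export to Markdown.
// Shapes are illustrative only, not taken from the repo.
interface Message {
  role: "user" | "assistant" | "system";
  content: string;
}

interface Conversation {
  title: string;
  messages: Message[];
}

function conversationToMarkdown(conv: Conversation): string {
  const header = `# ${conv.title}\n`;
  // Render each message under a bold role label, separated by rules.
  const body = conv.messages
    .map((m) => `**${m.role}:**\n\n${m.content}`)
    .join("\n\n---\n\n");
  return `${header}\n${body}\n`;
}
```

A JSON export would be even simpler (`JSON.stringify(conv, null, 2)`), and import would be the reverse: parse the file and validate it against the expected shape before loading it into history.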