Python chatbot project with:
- FastAPI/LangServe backend
- Gradio frontend
- External LLM API integration (bring your own API key)
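The backend follows the LangServe convention of exposing each chain under a route with an `/invoke` endpoint that wraps the chain input in a JSON body of the form `{"input": ...}`. As an illustration of talking to such a backend directly, here is a minimal sketch; the base URL and the route name `chat` are placeholders, not values taken from this repo:

```python
import json
import urllib.request


def build_invoke_request(base_url: str, route: str, text: str) -> urllib.request.Request:
    """Build a POST request for a LangServe-style {route}/invoke endpoint.

    LangServe wraps the chain input in a JSON body: {"input": ...}.
    The route name is hypothetical; use whatever route the backend registers.
    """
    payload = json.dumps({"input": text}).encode("utf-8")
    return urllib.request.Request(
        f"{base_url.rstrip('/')}/{route}/invoke",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


# Example with placeholder values (send with urllib.request.urlopen):
req = build_invoke_request("http://localhost:9012", "chat", "Hello!")
```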
Build the image and run the container:

    docker build -f composer/Dockerfile -t chatbot-app .
    docker run --rm -p 7860:7860 -e NVIDIA_NIM_API_KEY=your_key chatbot-app

Then open http://localhost:7860.
`NVIDIA_NIM_API_KEY` is supported and automatically mapped to `NVIDIA_API_KEY` inside the container.
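For local (non-Docker) runs you can reproduce the same fallback yourself; a minimal sketch, assuming the application reads the key from `NVIDIA_API_KEY`:

```python
import os


def resolve_api_key(env=None):
    """Return the LLM API key, preferring NVIDIA_API_KEY and falling back
    to NVIDIA_NIM_API_KEY (mirroring the mapping the container performs)."""
    env = os.environ if env is None else env
    return env.get("NVIDIA_API_KEY") or env.get("NVIDIA_NIM_API_KEY")
```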
Alternatively, pass the key inline with Docker Compose:

    NVIDIA_NIM_API_KEY=your_key docker compose -f composer/docker-compose.yml up --build

Or create `.env` in the project root:

    NVIDIA_NIM_API_KEY=your_key

Then run:

    docker compose -f composer/docker-compose.yml up --build

Then open http://localhost:7860.
- Frontend runs on port `7860` (container and host mapping).
- Backend runs internally on port `9012`.
- Logs stream to your terminal (`docker logs` / `docker compose logs -f`).
- Stop cleanly with `Ctrl+C` (Compose) or `docker stop <container_id>`.
Further documentation:

- `config_LangGraph/README.md`
- `PyKGML/README.md`
- `chatbot/README_docker.md` (prebuilt image packaging and end-user run guide)
Development of this project was supported by Prof. Licheng Liu (University of Wisconsin, Madison; ECAI Lab), Prof. David Mulla (University of Minnesota, Twin Cities; AI-LEAF Institute), and Prof. Ce Yang (University of Minnesota, Twin Cities; Agricultural Robotics Lab).