Welcome to the GenerativeAI repository! Here you'll find a collection of practical Generative AI applications built using the Ollama framework. Ollama enables local experimentation with large language models, making it easy to prototype and deploy AI-powered solutions.
Each application is maintained in a dedicated branch, allowing you to explore various Generative AI use cases independently. The main categories include:
- Chatbots: Conversational agents utilizing large language models.
- Retrieval-Augmented Generation (RAG): Applications that combine external data sources (such as web content, PDFs, or images) with language models for context-aware outputs.
- CNN (Convolutional Neural Network) image models: Examples of training and deploying a CNN for image-processing tasks.
Currently, the repository includes the following main applications:
- Web Search-Based RAG Pipeline with Chat Application: Handles user queries by incorporating web search results into a Retrieval-Augmented Generation pipeline.
- RAG Application Using PDF as Knowledge Source: Enables chat-based interactions using information extracted from PDF documents via a RAG pipeline.
- RAG Application with Redis and Image Parsing: Utilizes Redis for fast data retrieval and supports image-to-text parsing to enhance chatbot responses in the RAG pipeline.
- End-to-End MCP Client-Server Chat Application: Demonstrates a complete client-server chat architecture, showcasing AI model integration in distributed systems.
- End-to-End A2A (Agent2Agent) Client-Server Application: Presents a full client-server setup highlighting AI model integration over the A2A protocol.
- Build and Deploy a CNN (Convolutional Neural Network) with FastAPI: Provides a complete example of training a CNN model and deploying it with a FastAPI application.
- Copilot Studio Direct Line API Based React Native Application: Demonstrates how to integrate a React Native application with a Copilot Studio agent using the Direct Line API.
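The RAG applications above share a common pattern: retrieve the most relevant context for a query, then assemble it into a prompt for the language model. Below is a minimal, dependency-free sketch of that retrieval step; the bag-of-words cosine similarity stands in for a real embedding model, and names like `retrieve` and `build_prompt` are illustrative, not taken from this repository's code.

```python
import math
from collections import Counter

def vectorize(text):
    # Toy stand-in for an embedding model: bag-of-words term counts.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query, documents, k=2):
    # Rank documents by similarity to the query; a real pipeline would use
    # vector embeddings and a store such as Redis instead.
    qv = vectorize(query)
    ranked = sorted(documents, key=lambda d: cosine(qv, vectorize(d)), reverse=True)
    return ranked[:k]

def build_prompt(query, context):
    # Assemble the augmented prompt that would be sent to an Ollama model.
    joined = "\n".join(f"- {c}" for c in context)
    return f"Answer using only this context:\n{joined}\n\nQuestion: {query}"

docs = [
    "Ollama runs large language models locally.",
    "Redis is an in-memory data store used for fast retrieval.",
    "FastAPI serves machine learning models over HTTP.",
]
context = retrieve("How do I run a language model locally?", docs, k=1)
print(build_prompt("How do I run a language model locally?", context))
```

In the repository's applications, the retrieval source differs (web search results, PDF text, or image-to-text output), but the retrieve-then-prompt shape stays the same.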
To run these applications, you will generally need:
- Docker (for running Ollama models locally)
- Ollama (refer to the official documentation for installation and supported models)
- Python 3.10+ (required for most applications)
- Additional dependencies listed in each branch's README
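For a quick start with the Docker prerequisite, the commands below follow Ollama's official Docker image instructions; the model name `llama3` is just an example and can be swapped for any model Ollama supports.

```shell
# Start the Ollama server in a container, persisting models in a named volume.
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

# Download a model inside the running container (llama3 is an example).
docker exec -it ollama ollama pull llama3
```

Once the container is running, the applications in this repository can reach the Ollama API on port 11434.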
The repository also includes:
- LICENSE: Repository distribution terms.
- SECURITY: Instructions for reporting vulnerabilities and security concerns.
For detailed documentation, setup guides, and advanced usage, see the README in each application branch.
To learn more about Ollama and its capabilities, visit the official Ollama website.
Contributions and feedback are encouraged! Please open issues or submit pull requests to help improve this repository.