What I Built
I built an AI-powered FAQ assistant that delivers instant, context-aware answers to user questions using Redis 8 and OpenAI. It supports semantic search via Redis vector similarity and caches responses as RedisJSON documents to speed up repeated queries. Users interact with a friendly chat interface that retrieves and returns answers in real time.
The goal of this project is to demonstrate how Redis can be used as a real-time AI data layer to power conversational interfaces, specifically for use cases like FAQs, support bots, and knowledge assistants.
- Note: the cache duration is set to 15 minutes due to limited storage.
Demo
Live App: https://master.d15ripqxac0b7u.amplifyapp.com
Github: https://github.com/dilumdarshana/ai-faq-memory-assistant
How I Used Redis 8
Redis 8 was the core engine that powered both performance and intelligence in this project. Here’s how I used its features:
Redis Stack (RedisJSON + RediSearch + Vector Similarity Search):
I stored FAQ documents in Redis using the JSON data type. Each FAQ includes fields like question, answer, source, and a createdAt timestamp. The documents are indexed for full-text and semantic search using FT.CREATE.
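For illustration, here is a minimal node-redis sketch of that setup; the index name idx:faq, the faq: key prefix, and the exact field names are my choices for this post, not necessarily what the project uses:

```ts
import { createClient, SchemaFieldTypes, VectorAlgorithms } from 'redis';

const client = createClient({ url: process.env.REDIS_URL });
await client.connect();

// Combined full-text + vector index over JSON documents (FT.CREATE ... ON JSON).
// Throws if the index already exists, so run it once at startup or guard it.
await client.ft.create(
  'idx:faq',
  {
    '$.question': { type: SchemaFieldTypes.TEXT, AS: 'question' },
    '$.answer': { type: SchemaFieldTypes.TEXT, AS: 'answer' },
    '$.source': { type: SchemaFieldTypes.TAG, AS: 'source' },
    '$.createdAt': { type: SchemaFieldTypes.NUMERIC, AS: 'createdAt', SORTABLE: true },
    '$.embedding': {
      type: SchemaFieldTypes.VECTOR,
      AS: 'embedding',
      ALGORITHM: VectorAlgorithms.HNSW,
      TYPE: 'FLOAT32',
      DIM: 1536,
      DISTANCE_METRIC: 'COSINE',
    },
  },
  { ON: 'JSON', PREFIX: 'faq:' },
);

// Each FAQ is a JSON document; the embedding lives in the document as a plain number[]
await client.json.set('faq:1', '$', {
  question: 'How do I reset my password?',
  answer: 'Use the "Forgot password" link on the sign-in page.',
  source: 'user-guide',
  createdAt: Date.now(),
  embedding: [/* 1536 floats from the OpenAI embedding model */],
});
```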
Vector Search for Semantic Understanding:
Each question is embedded into a 1536-dimensional vector using OpenAI's embedding model. These vectors are stored in Redis and indexed with the HNSW algorithm. When a user asks a question, the assistant runs a vector similarity search in Redis to retrieve the most relevant context.
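A rough sketch of that retrieval step (the model name text-embedding-3-small and the three-result KNN are assumptions; the app itself may go through LangChain's wrappers instead):

```ts
import OpenAI from 'openai';
import { createClient } from 'redis';

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment
const client = createClient({ url: process.env.REDIS_URL });
await client.connect();

// 1. Embed the user's question into a 1536-dimensional vector
const { data } = await openai.embeddings.create({
  model: 'text-embedding-3-small',
  input: 'How do I reset my password?',
});
const queryVector = Buffer.from(new Float32Array(data[0].embedding).buffer);

// 2. KNN query against the HNSW-indexed embedding field: the 3 closest FAQ entries
const results = await client.ft.search(
  'idx:faq',
  '*=>[KNN 3 @embedding $vec AS score]',
  {
    PARAMS: { vec: queryVector },
    SORTBY: 'score',
    RETURN: ['question', 'answer', 'score'],
    DIALECT: 2,
  },
);

// results.documents[i].value holds the returned fields; score is the cosine distance
for (const doc of results.documents) {
  console.log(doc.value.question, doc.value.score);
}
```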
Intelligent Caching with RedisJSON:
Before sending a user query to the LLM, I check whether Redis already holds a cached answer for a similar query. This reduces token usage and cost while improving response time.
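Sketched out, the cache-then-LLM check could look like this; the separate idx:cache index over cache:* keys, the 0.1 distance threshold, and the helper names are illustrative assumptions rather than the project's actual code:

```ts
import { randomUUID } from 'node:crypto';
import { createClient } from 'redis';

const client = createClient({ url: process.env.REDIS_URL });
await client.connect();

// Look up a semantically similar, previously answered query in the cache index.
// Assumes a vector index 'idx:cache' over 'cache:*' keys, built like idx:faq above.
async function getCachedAnswer(queryVector: Buffer): Promise<string | null> {
  const res = await client.ft.search(
    'idx:cache',
    '*=>[KNN 1 @embedding $vec AS score]',
    { PARAMS: { vec: queryVector }, RETURN: ['answer', 'score'], DIALECT: 2 },
  );
  const hit = res.documents[0];
  // COSINE distance: smaller means closer. 0.1 is an illustrative threshold.
  if (hit && Number(hit.value.score) < 0.1) return String(hit.value.answer);
  return null;
}

// On a cache miss, store the LLM's answer with a 15-minute TTL (the window mentioned above).
async function cacheAnswer(question: string, answer: string, embedding: number[]) {
  const key = `cache:${randomUUID()}`;
  await client.json.set(key, '$', { question, answer, embedding, createdAt: Date.now() });
  await client.expire(key, 15 * 60);
}
```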
FT.SEARCH Sorting by Date:
For the "Recently Asked Questions" section, I used FT.SEARCH with sorting on the createdAt field to show the latest activity. (15 mins cache duration)
Redis not only made the solution real-time and scalable, but also gave me a flexible and powerful way to combine structured, vector, and full-text search in one place.
Design:
Final Thoughts
This project shows how Redis 8 can be a game-changer for real-time AI applications. Whether you're building FAQ bots, search interfaces, or RAG-based assistants, Redis offers a powerful and developer-friendly solution.
Tech Stack:
- Redis 8 (Vector Store, RediSearch, RedisJSON)
- OpenAI (gpt-4o, embeddings)
- Next.js (LangChain, TypeScript)
- pnpm
- AWS Amplify