This video might look simple — just printing a message — but guess what?
That message is coming straight from my custom tool, bound into my very own AI assistant backend.
It’s a basic tool (just echoing input) — but that’s the point.
We’re testing the foundation.
And just like that, a new journey begins. 🚀
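For anyone curious what that echo tool actually looks like, here's a minimal sketch of how a tool can be defined and bound to a local Ollama model with LangChain JS. The tool name, model, and prompt are placeholders, not the exact code from my backend:

```ts
import { ChatOllama } from "@langchain/ollama";
import { tool } from "@langchain/core/tools";
import { z } from "zod";

// A deliberately trivial tool: it just echoes whatever text it receives.
const echoTool = tool(
  async ({ text }) => `Echo: ${text}`,
  {
    name: "echo",
    description: "Echo the input text back to the caller.",
    schema: z.object({ text: z.string().describe("Text to echo back") }),
  }
);

// Local LLM served by Ollama; the model name is just an example.
const llm = new ChatOllama({ model: "llama3.1" });
const llmWithTools = llm.bindTools([echoTool]);

const response = await llmWithTools.invoke("Use the echo tool to repeat: hello world");
console.log(response.tool_calls); // the tool call(s) the model decided to make
```

Simple, yes — but once the model reliably calls this echo tool, swapping in real tools is just more of the same pattern.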
Right now, we’ve set up:
- ✅ Local LLMs via Ollama
- ✅ Flow + tool binding with LangChain
- ✅ Embeddings + search using PGVector (sketched in code right after this list)
- ✅ Session memory plan
- ✅ Streaming UI — fully working and shown in the video
- ✅ Tools bound and functional (even this echo)
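To give a taste of the embeddings + search piece, here's roughly what the PGVector side looks like with LangChain JS. The connection string, table name, and embedding model are placeholder values for my local setup, not a prescription:

```ts
import { OllamaEmbeddings } from "@langchain/ollama";
import { PGVectorStore } from "@langchain/community/vectorstores/pgvector";
import { Document } from "@langchain/core/documents";

// Embeddings generated locally via Ollama; the model name is an example.
const embeddings = new OllamaEmbeddings({ model: "nomic-embed-text" });

// Placeholder Postgres connection + table; adjust to your own database.
const vectorStore = await PGVectorStore.initialize(embeddings, {
  postgresConnectionOptions: {
    connectionString: "postgresql://user:pass@localhost:5432/assistant",
  },
  tableName: "documents",
});

await vectorStore.addDocuments([
  new Document({
    pageContent: "Tools are bound to the local LLM through LangChain.",
    metadata: { source: "notes" },
  }),
]);

// Semantic search: returns the k closest documents to the query.
const results = await vectorStore.similaritySearch("How are tools bound?", 2);
console.log(results.map((doc) => doc.pageContent));
```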
We’re now diving into the MCP (Model Context Protocol) server, exploring advanced tool orchestration and how to scale across multiple servers.
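To show where that's headed, here's a small sketch of an MCP server exposing the same echo tool over stdio, using the official TypeScript SDK (@modelcontextprotocol/sdk). The server name and version are illustrative, not the final setup:

```ts
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// A standalone MCP server exposing one tool; any MCP client (or an agent)
// can discover and call it over the Model Context Protocol.
const server = new McpServer({ name: "echo-server", version: "0.1.0" });

server.tool(
  "echo",
  { text: z.string().describe("Text to echo back") },
  async ({ text }) => ({
    content: [{ type: "text", text: `Echo: ${text}` }],
  })
);

// stdio is the simplest transport for local runs; several servers like this
// can later be wired into one assistant for cross-server orchestration.
const transport = new StdioServerTransport();
await server.connect(transport);
```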
But let’s be clear:
👉 What we’ve done so far is just the beginning.
We're still in a small corner of a much bigger vision, and there's a LOT left to build.
🎯 Blog series coming soon.
Maybe videos too — though I want to focus on building first.
And maybe… just maybe…
We can turn this into a bootcamp-style learning group or live workshop where we explore, test, and learn together.
I watched tons of tutorials, read docs, debugged endlessly… but never found a complete, JS-focused guide that connects everything together — or maybe I just didn’t find the one that worked for me. So, I’m making one.
But what’s more important is how we’re building it.
We’re doing everything from scratch — manually configuring each part.
Why? Because we want to understand the core.
There are definitely easier ways. We could’ve used pre-built SDKs, hosted platforms, or plug-and-play services.
But once we truly understand how everything connects — from embeddings to vector search to tool invocation — we’ll have the power to use any provider, or even build our own.
We’re not just learning tools.
We’re learning how to build our own AI brains — with control, understanding, and creativity.
Whether you're:
- A beginner: let's cook together.
- Already familiar with some of these tools: drop advice! I'm listening.
- Confused or stuck: comment your question; maybe someone here can help you, or I'll try!