HexShift

How to Connect Phoenix LiveView to Python Machine Learning Models for Real-Time AI Features

Modern users expect more than buttons and forms. They want intelligence:

  • 🔍 Smart search
  • ✍️ AI writing assistants
  • 📊 Predictive dashboards
  • 🎯 Personalized experiences

With Phoenix LiveView + Python, you can deliver exactly that, blending real-time UI with machine learning intelligence.


Why Python?

Python is the go-to language for AI/ML thanks to:

  • TensorFlow, PyTorch – deep learning
  • spaCy, NLTK – NLP
  • scikit-learn – classical models
  • FastAPI, Flask – fast API serving

You don't need to rewrite your models in Elixir.

Just wrap them in an API.


Step 1: Wrap Your Model in FastAPI

Here’s a simple Python ML service:

from fastapi import FastAPI
from pydantic import BaseModel
import my_model  # your ML model

app = FastAPI()

class Input(BaseModel):
    text: str

@app.post("/predict")
def predict(input: Input):
    result = my_model.predict(input.text)
    return {"sentiment": result.label, "confidence": result.score}

Run it with uvicorn (for example, uvicorn main:app --port 8000 if the file is named main.py) or inside a Docker container so it listens on localhost:8000.
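
Before wiring up LiveView, you can smoke-test the endpoint from IEx. A quick sketch using Req (assuming req is already in your mix deps):

Req.post!("http://localhost:8000/predict", json: %{text: "LiveView is great"}).body
# => a map shaped like %{"sentiment" => ..., "confidence" => ...}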


Step 2: Call from LiveView with Finch
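
The calls below go through a named Finch pool, so make sure one is running under your supervision tree (recent Phoenix generators usually add this already). A minimal sketch for lib/my_app/application.ex:

children = [
  # ...your existing children (Repo, PubSub, Endpoint)...
  {Finch, name: MyApp.Finch}
]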

In your LiveView:

def handle_event("analyze", %{"text" => input}, socket) do
  # Run the HTTP call in a task so the LiveView stays responsive.
  task =
    Task.async(fn ->
      body = Jason.encode!(%{text: input})
      headers = [{"content-type", "application/json"}]

      {:ok, response} =
        Finch.build(:post, "http://localhost:8000/predict", headers, body)
        |> Finch.request(MyApp.Finch)

      Jason.decode!(response.body)
    end)

  {:noreply, assign(socket, predicting: true, prediction_task: task)}
end

def handle_info({ref, %{"sentiment" => s, "confidence" => c}}, socket) do
  Process.demonitor(ref, [:flush])
  {:noreply, assign(socket, predicting: false, sentiment: s, confidence: c)}
end
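
One caveat: Task.async links the task to the LiveView, so if the HTTP call fails, the {:ok, response} match raises and takes the socket down with it. A hedged alternative is Task.Supervisor.async_nolink (assuming you start a MyApp.TaskSupervisor in your supervision tree), which lets you handle failures as a :DOWN message instead:

# Pairs with Task.Supervisor.async_nolink(MyApp.TaskSupervisor, fn -> ... end)
def handle_info({:DOWN, _ref, :process, _pid, _reason}, socket) do
  {:noreply, socket |> assign(predicting: false) |> put_flash(:error, "Prediction failed")}
end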

Show a spinner with:

<%= if @predicting do %>
  <div class="animate-spin text-gray-500">Analyzing...</div>
<% end %>

Step 3: Use phx-change for Live Input

<form phx-change="analyze">
  <textarea name="text" phx-debounce="500" class="w-full p-2 border"></textarea>
</form>

<%= if @sentiment do %>
  <p class="mt-2 text-sm">
    Sentiment: <strong><%= @sentiment %></strong>  
    (Confidence: <%= (@confidence * 100) |> round %>%)
  </p>
<% end %>

The result updates live as the user types: no reload, no JS.


Use Case: AI Writing Assistant

Let users start a sentence.

Your model suggests the next paragraph:

def handle_event("continue", %{"text" => text}, socket) do
  Task.async(fn -> MyApp.AI.generate(text) end)
  {:noreply, assign(socket, loading: true)}
end
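
The Task result arrives as a message, so pair this with a handle_info. A minimal sketch, assuming MyApp.AI.generate/1 returns the continuation as a plain string:

def handle_info({ref, continuation}, socket) when is_binary(continuation) do
  Process.demonitor(ref, [:flush])
  {:noreply, assign(socket, loading: false, ai_continuation: continuation)}
end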

And render the result:

<%= if @loading do %>
  <p class="text-gray-400">Generating...</p>
<% else %>
  <p class="mt-4 italic"><%= @ai_continuation %></p>
<% end %>

Want to get fancy? Stream characters back one by one.


Use Case: Predictive Pricing Dashboard

User selects a product.

Your Python model forecasts price:

def handle_event("select_product", %{"sku" => sku}, socket) do
  forecast = MyApp.Forecaster.predict(sku)
  {:noreply, assign(socket, forecast: forecast)}
end

Render it with a chart using LiveView.ChartComponent or plotly.js via a hook.
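
MyApp.Forecaster can be a thin client around the Python service. A hedged sketch using Req, assuming the FastAPI app also exposes a hypothetical /forecast route that returns the forecast as JSON:

defmodule MyApp.Forecaster do
  # Calls the (hypothetical) /forecast endpoint and returns the decoded JSON body.
  def predict(sku) do
    Req.post!("http://localhost:8000/forecast", json: %{sku: sku}).body
  end
end

If forecasts take more than a few hundred milliseconds, wrap the call in Task.async exactly as in Step 2 so the LiveView stays responsive.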


Production Tips

  • ✅ Use Finch or Req for HTTP calls
  • ✅ Add API keys or OAuth to your Python server
  • ✅ Rate-limit or cache frequent predictions (see the cache sketch below)
  • ✅ Use Task.async to keep LiveView responsive
  • ✅ Log inputs/outputs for traceability
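
For the caching tip, here is a deliberately naive sketch (a hypothetical MyApp.PredictionCache; in production you would likely reach for an ETS table with TTLs or a library like Cachex):

defmodule MyApp.PredictionCache do
  use GenServer

  # Naive in-memory cache keyed by input text; no expiry, no size limit.
  def start_link(_), do: GenServer.start_link(__MODULE__, %{}, name: __MODULE__)

  # Returns the cached value, or computes it via `fun` and stores it.
  def fetch(key, fun) do
    case GenServer.call(__MODULE__, {:get, key}) do
      nil ->
        value = fun.()
        GenServer.cast(__MODULE__, {:put, key, value})
        value

      value ->
        value
    end
  end

  @impl true
  def init(state), do: {:ok, state}

  @impl true
  def handle_call({:get, key}, _from, state), do: {:reply, Map.get(state, key), state}

  @impl true
  def handle_cast({:put, key, value}, state), do: {:noreply, Map.put(state, key, value)}
end

Inside the Task from Step 2, wrap the HTTP call as MyApp.PredictionCache.fetch(input, fn -> ... end) so repeated inputs skip the round-trip to Python.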

Need more speed?

Consider:

  • 🔁 Moving models into Elixir NIFs (Rustler, Zigler)
  • 🧠 Using Axon (Elixir-native ML) for simple inference
  • 🧪 Pre-batching predictions with GenServer queues (sketched below)
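
For the pre-batching idea, a rough sketch of a GenServer that queues incoming texts and flushes them to a hypothetical /predict_batch endpoint every 100 ms (callers block until their result comes back):

defmodule MyApp.PredictionBatcher do
  use GenServer

  @flush_every 100

  def start_link(_), do: GenServer.start_link(__MODULE__, [], name: __MODULE__)

  # Blocks the caller until the next batch is flushed and replied to.
  def predict(text), do: GenServer.call(__MODULE__, {:predict, text})

  @impl true
  def init(pending) do
    schedule_flush()
    {:ok, pending}
  end

  @impl true
  def handle_call({:predict, text}, from, pending) do
    # Defer the reply until the batch is flushed.
    {:noreply, [{from, text} | pending]}
  end

  @impl true
  def handle_info(:flush, []) do
    schedule_flush()
    {:noreply, []}
  end

  def handle_info(:flush, pending) do
    requests = Enum.reverse(pending)
    texts = Enum.map(requests, fn {_from, text} -> text end)

    # Assumes the (hypothetical) endpoint returns a JSON list of results in the same order.
    results = Req.post!("http://localhost:8000/predict_batch", json: %{texts: texts}).body

    requests
    |> Enum.zip(results)
    |> Enum.each(fn {{from, _text}, result} -> GenServer.reply(from, result) end)

    schedule_flush()
    {:noreply, []}
  end

  defp schedule_flush, do: Process.send_after(self(), :flush, @flush_every)
end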

This Pattern Is Powerful

LiveView handles:

  • UI
  • Events
  • Realtime rendering

Python handles:

  • ML logic
  • Heavy compute
  • Prediction engines

🧠 The result:

Real-time, intelligent UIs, built with a lean team.


Learn More in my PDF Guide

🔗 Phoenix LiveView: The Pro's Guide to Scalable Interfaces and UI Patterns

  • AI/ML integration with Python
  • Real-time dashboards and workflows
  • Background tasks, streaming, security
  • Examples for NLP, forecasting, assistants, and more

Download it now and bring the power of Python into your LiveView stack without the JS baggage.
