Local AI agents run on your machine. No cloud. No external APIs. Just you, your hardware, and the model. This post walks through the essentials: choosing a model, wiring it up with an agent framework, and running it locally. If you want privacy, speed, or control, this is how you get it.
What Can Local Agents Do?
Local agents can handle a wide range of tasks: summarizing documents, answering questions, automating workflows, scraping websites, or even acting as coding assistants.
In this post, we’ll focus on a simple task: scraping news headlines from a website and summarizing them. It’s fast, useful, and shows the core pieces in action.
Tools We’ll Use
- Ollama – run language models locally with one command; Gemma or Mistral run fine on a laptop
- LangChain – structure reasoning, tools, and memory
- Python – glue everything together
Basic Structure of a Local Agent
- Model – the LLM doing the “thinking”
- Tools – code the agent can use (like a scraper or file reader)
- Prompt – instructions for what the agent should do
- Loop – let the agent think and act step-by-step
That’s it. The rest is just wiring.
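To make the wiring concrete, here is a minimal sketch of that think-act loop with a stubbed "model", so the structure is visible without any LLM dependency. The names (`fake_model`, `run_agent`) are illustrative, not a real framework API:

```python
def fake_model(prompt: str) -> str:
    """Stand-in for the LLM: decides to call the scraper once, then answers."""
    if "Observation:" not in prompt:
        return "Action: scrape_headlines"
    return "Final Answer: summary of the headlines"

def run_agent(task: str, tools: dict, model=fake_model, max_steps=5) -> str:
    prompt = f"Task: {task}"
    for _ in range(max_steps):
        reply = model(prompt)
        if reply.startswith("Final Answer:"):
            return reply.removeprefix("Final Answer:").strip()
        tool_name = reply.removeprefix("Action:").strip()
        observation = tools[tool_name]()           # run the chosen tool
        prompt += f"\nObservation: {observation}"  # feed the result back in
    return "gave up"

tools = {"scrape_headlines": lambda: "headline 1\nheadline 2"}
print(run_agent("summarize the news", tools))
```

A real framework like LangChain does the same thing with more care (prompt templates, output parsing, error handling), but the loop is the whole idea.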
Getting Started
- Install Ollama: head to https://ollama.com, run `brew install ollama` on macOS, or grab the installer for your OS.
- Pull a model: `ollama run mistral`
- Set up a LangChain agent: load the model via LangChain, define a tool, and pass it to the agent. You’ll see how in the example below.
The Code
```shell
pip install langchain beautifulsoup4 requests
ollama run mistral
```
Now create a Python script, e.g. `run.py`, and load the model:

```python
from langchain.llms import Ollama

llm = Ollama(model="mistral")
```
The scraper:
```python
import requests
from bs4 import BeautifulSoup

def get_headlines(url="https://www.bbc.com"):
    res = requests.get(url)
    soup = BeautifulSoup(res.text, "html.parser")
    headlines = [h.get_text() for h in soup.find_all("h3")]
    return "\n".join(headlines[:10])  # just take the top 10
```
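If you want to see what the parsing step extracts without hitting the network, the same h3-extraction logic can be written against a static HTML string using only the standard library's `html.parser` (BeautifulSoup does this more robustly; this is just an offline illustration):

```python
from html.parser import HTMLParser

class H3Extractor(HTMLParser):
    """Collects the text inside <h3> tags, like the bs4 scraper above."""
    def __init__(self):
        super().__init__()
        self.in_h3 = False
        self.headlines = []

    def handle_starttag(self, tag, attrs):
        if tag == "h3":
            self.in_h3 = True

    def handle_endtag(self, tag):
        if tag == "h3":
            self.in_h3 = False

    def handle_data(self, data):
        if self.in_h3 and data.strip():
            self.headlines.append(data.strip())

sample = "<html><h3>Top story</h3><p>body</p><h3>Second story</h3></html>"
parser = H3Extractor()
parser.feed(sample)
print(parser.headlines)  # → ['Top story', 'Second story']
```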
Wrap it as a LangChain tool:
```python
from langchain.agents import tool

@tool
def scrape_headlines() -> str:
    """Scrapes top headlines from BBC."""
    return get_headlines()
```
Build the agent:
```python
from langchain.agents import initialize_agent, AgentType

tools = [scrape_headlines]
agent = initialize_agent(
    tools,
    llm,
    agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION,
    verbose=True,
)
```
Run the agent:
```python
agent.run("Get the top news headlines and summarize them in a few bullet points.")
```
That’s it: you now have a local agent scraping, thinking, and summarizing, all on your machine.