Python Chatbot On Raspberry Pi: Build a Working Rule-Based Bot on Bookworm


A Python chatbot on Raspberry Pi is a practical Python project that runs entirely on your hardware, with no cloud API required. This guide builds a working rule-based chatbot using Python 3.11 and NLTK. NLTK installs cleanly on Bookworm, unlike ChatterBot, which is abandoned and fails to install on Python 3.11. The result is a chatbot that responds to user input through pattern matching, tokenisation, and a customisable response library. It runs in a terminal, can be extended to a web interface, and demonstrates the core Python concepts that underpin natural language processing work.

Last tested: Raspberry Pi OS Bookworm Lite 64-bit | May 3, 2026 | Raspberry Pi 4 Model B (4GB) | Python 3.11.2 | NLTK 3.9

Key Takeaways

  • ChatterBot 1.0.4 is abandoned and fails to install on Python 3.11 with a dependency conflict. Do not follow guides that use it on Bookworm. NLTK is the correct current library for rule-based NLP on Pi. It is actively maintained, installs cleanly, and covers tokenisation, stemming, and pattern matching without requiring Python 3.8 or earlier.
  • Always use a virtual environment for chatbot projects. pip install nltk without a venv on Bookworm fails with “externally-managed-environment.” The venv also isolates the NLTK data downloads from the system Python.
  • If you want a real LLM chatbot running locally on a Pi 5, use llama.cpp with a quantised model rather than anything based on ChatterBot or rule patterns. The site already has a complete guide for that. See llama.cpp Raspberry Pi 5.

Python Chatbot On Raspberry Pi: What You Are Building

This guide builds a terminal chatbot that uses NLTK for text processing and a pattern-response dictionary for answers. The chatbot tokenises user input, normalises it (lowercase, punctuation removal), checks it against a set of patterns, and returns a matched response or a fallback. This is the same architecture used in the first generation of commercial chatbots. It is not AI, but it teaches the core concepts clearly and produces a working, interactive program.

The project then extends the basic bot with a simple Flask web interface so it is accessible from a browser on your network. The complete project is around 100 lines of Python. All dependencies are standard, actively maintained, and install on Bookworm Python 3.11 without issues.


Setting Up the Environment

Flash Raspberry Pi OS Bookworm Lite 64-bit using Raspberry Pi Imager. After first boot and system update:

sudo apt update && sudo apt full-upgrade -y
sudo apt install python3-venv python3-pip -y

Create a virtual environment for the project:

mkdir -p ~/chatbot && cd ~/chatbot
python3 -m venv venv
source venv/bin/activate

# Confirm you are in the venv
which python3
# Should show: /home/pi/chatbot/venv/bin/python3

Install the required packages inside the venv:

pip install nltk flask

Download the NLTK data files the project uses. Note that NLTK 3.8.2 and later require the punkt_tab resource (in addition to punkt) for word_tokenize, so download both:

python3 -c "
import nltk
nltk.download('punkt')
nltk.download('punkt_tab')  # required by word_tokenize on NLTK 3.8.2+
nltk.download('stopwords')
nltk.download('wordnet')
print('NLTK data ready')
"

Expected result: NLTK downloads four small data packages to ~/nltk_data/ and prints “NLTK data ready.” Total download size is under 20MB.

Building the Chatbot

Create ~/chatbot/bot.py:

import re
import random
from nltk.stem import WordNetLemmatizer
from nltk.tokenize import word_tokenize

lemmatizer = WordNetLemmatizer()

# Pattern-response pairs
# Each pattern is a regex; responses is a list (one chosen at random).
# Patterns use \b word boundaries so short keywords like "pi" and "hi"
# do not match inside longer words, and lemmatised forms ("greeting",
# not "greetings") because the input is lemmatised before matching.
RESPONSES = [
    {
        "patterns": [r"\b(hello|hi|hey|greeting)\b"],
        "responses": [
            "Hello! How can I help you?",
            "Hi there! What are you working on?",
            "Hey! What can I do for you?"
        ]
    },
    {
        "patterns": [r"\b(how are you|how do you feel|are you ok)\b"],
        "responses": [
            "I am running fine on this Pi.",
            "All systems normal. Thanks for asking."
        ]
    },
    {
        "patterns": [r"\b(your name|who are you|what are you)\b"],
        "responses": [
            "I am PiBot, a rule-based chatbot running on Raspberry Pi.",
            "Call me PiBot. I run on Python and NLTK."
        ]
    },
    {
        "patterns": [r"\bweather\b"],
        "responses": [
            "I do not have internet access in this build. Check weather.com."
        ]
    },
    {
        "patterns": [r"\braspberry pi\b|\bpi\b"],
        "responses": [
            "Raspberry Pi is a single-board computer made by the Raspberry Pi Foundation.",
            "I am running on a Raspberry Pi right now."
        ]
    },
    {
        "patterns": [r"\b(bye|goodbye|exit|quit)\b"],
        "responses": ["Goodbye!", "See you later.", "Bye!"]
    },
]

FALLBACK = [
    "I do not understand that. Can you rephrase?",
    "I am not sure how to respond to that.",
    "Could you ask that differently?"
]


def normalise(text):
    """Lowercase, remove punctuation, lemmatise tokens."""
    text = text.lower()
    text = re.sub(r"[^\w\s]", "", text)
    tokens = word_tokenize(text)
    tokens = [lemmatizer.lemmatize(t) for t in tokens]
    return " ".join(tokens)


def get_response(user_input):
    """Match normalised input against patterns and return a response."""
    normalised = normalise(user_input)
    for entry in RESPONSES:
        for pattern in entry["patterns"]:
            if re.search(pattern, normalised):
                return random.choice(entry["responses"])
    return random.choice(FALLBACK)


if __name__ == "__main__":
    print("PiBot ready. Type 'quit' to exit.")
    while True:
        user_input = input("You: ").strip()
        if not user_input:
            continue
        response = get_response(user_input)
        print(f"Bot: {response}")
        if any(re.search(p, normalise(user_input)) for p in [r"bye|goodbye|exit|quit"]):
            break

Run the chatbot:

python3 bot.py

Expected result: The bot starts, prints “PiBot ready. Type ‘quit’ to exit.” and responds to input. Try “hello”, “what is your name”, “tell me about raspberry pi”, and “bye”.
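If you want to experiment with the matching logic in isolation, the core loop reduces to a standard-library-only sketch (no NLTK, simplified patterns and single responses so output is predictable):

```python
import random
import re

# Simplified version of the bot's matching loop: the first pattern that
# matches the normalised input wins; otherwise fall back.
RESPONSES = [
    (r"\b(hello|hi|hey)\b", ["Hello!"]),
    (r"\bpi\b", ["I run on a Raspberry Pi."]),
]
FALLBACK = ["I do not understand that."]

def reply(text):
    # Lowercase and strip punctuation, like normalise() in bot.py
    text = re.sub(r"[^\w\s]", "", text.lower())
    for pattern, responses in RESPONSES:
        if re.search(pattern, text):
            return random.choice(responses)
    return random.choice(FALLBACK)

print(reply("Hello there"))    # -> Hello!
print(reply("what is spicy"))  # \bpi\b does not match inside "spicy" -> fallback
```

Note how the word boundaries keep "pi" from firing on "spicy"; without them the second input would wrongly hit the Raspberry Pi response.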

Adding a Web Interface with Flask

Create ~/chatbot/app.py to serve the chatbot over HTTP:

from flask import Flask, request, jsonify, render_template_string
from bot import get_response

app = Flask(__name__)

HTML = """
<!DOCTYPE html>
<html>
<head><title>PiBot</title></head>
<body>
<h2>PiBot</h2>
<div id="chat" style="height:300px;overflow:auto;border:1px solid #ccc;padding:10px"></div>
<input id="msg" type="text" placeholder="Type a message" style="width:80%">
<button onclick="send()">Send</button>
<script>
function send() {
  const msg = document.getElementById('msg').value;
  if (!msg) return;
  const chat = document.getElementById('chat');
  chat.innerHTML += '<p><b>You:</b> ' + msg + '</p>';
  fetch('/chat', {method:'POST', headers:{'Content-Type':'application/json'},
    body: JSON.stringify({message: msg})})
    .then(r => r.json())
    .then(d => { chat.innerHTML += '<p><b>Bot:</b> ' + d.response + '</p>';
                 chat.scrollTop = chat.scrollHeight; });
  document.getElementById('msg').value = '';
}
document.getElementById('msg').addEventListener('keydown', e => { if(e.key==='Enter') send(); });
</script>
</body>
</html>
"""

@app.route('/')
def index():
    return render_template_string(HTML)

@app.route('/chat', methods=['POST'])
def chat():
    data = request.get_json()
    user_message = data.get('message', '')
    response = get_response(user_message)
    return jsonify({'response': response})

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=5001, debug=False)
Run the web app with the venv active:

python3 app.py

Expected result: Flask starts and reports listening on port 5001. From any browser on your network, navigate to http://<pi-ip>:5001 and the chat interface loads. Type a message and the bot responds via the JSON API.
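For reference, the JSON contract between the page and the /chat endpoint looks like this (a stdlib-only illustration of the payloads; no server required):

```python
import json

# What the browser's fetch() sends to POST /chat
request_body = json.dumps({"message": "hello"})
print(request_body)   # {"message": "hello"}

# What the chat() view returns via jsonify()
response_body = json.dumps({"response": "Hello! How can I help you?"})
print(response_body)  # {"response": "Hello! How can I help you?"}
```

Any HTTP client that sends and receives this shape — curl, a phone app, another script — can talk to the bot.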

Running the Chatbot as a Service

To run the web chatbot automatically at boot:

sudo nano /etc/systemd/system/pibot.service

Paste the following unit file:
[Unit]
Description=PiBot Chatbot
After=network.target

[Service]
User=pi
WorkingDirectory=/home/pi/chatbot
ExecStart=/home/pi/chatbot/venv/bin/python3 /home/pi/chatbot/app.py
Restart=on-failure
RestartSec=5

[Install]
WantedBy=multi-user.target
Save the file, then enable and start the service:

sudo systemctl daemon-reload
sudo systemctl enable --now pibot.service
sudo systemctl status pibot.service

Expected result: the status output shows “active (running)” and the web chatbot is reachable at http://<pi-ip>:5001 after every boot.

Extending the Chatbot

Adding new responses

Add new entries to the RESPONSES list in bot.py. Each entry needs a patterns list of regex strings and a responses list. Matching is case-insensitive after normalisation, so you do not need to handle capitalisation in the patterns, but wrap short keywords in \b word boundaries so they do not match inside longer words.

# Example: responses are fixed strings, evaluated once when the list
# is defined, so a dynamic value like the current time would freeze
# at startup. Handle it as a special case inside get_response()
# instead, before the pattern loop:
from datetime import datetime

if re.search(r"\btime\b|\bclock\b", normalised):
    return datetime.now().strftime("It is %H:%M.")

Adding context and conversation memory

The current bot is stateless. Each message is processed independently. To add basic context, maintain a conversation history list and include recent exchanges in the matching logic:

conversation_history = []

def get_response_with_context(user_input):
    conversation_history.append({"role": "user", "text": user_input})
    # The previous bot message is available here for context-dependent
    # patterns; the base get_response() does not use it yet.
    last_bot = conversation_history[-2]["text"] if len(conversation_history) > 1 else ""
    response = get_response(user_input)
    conversation_history.append({"role": "bot", "text": response})
    return response
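As a concrete illustration of how history can change a reply, here is a standard-library sketch (the specific phrases and rules are hypothetical) where a short answer is disambiguated by the previous bot message:

```python
history = []

def contextual_reply(user_text):
    # Use the previous bot message to disambiguate short answers
    last_bot = history[-1]["text"] if history and history[-1]["role"] == "bot" else ""
    if "how can i help" in last_bot.lower() and user_text.lower() in ("nothing", "nevermind"):
        reply = "No problem. Ask any time."
    else:
        reply = "Hello! How can I help you?"
    history.append({"role": "user", "text": user_text})
    history.append({"role": "bot", "text": reply})
    return reply

print(contextual_reply("hi"))       # -> Hello! How can I help you?
print(contextual_reply("nothing"))  # -> No problem. Ask any time.
```

The same input ("nothing") would hit the fallback in a stateless bot; with history it becomes a sensible follow-up.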

Upgrading to a local LLM

If you want a chatbot that generates responses rather than matching patterns, the Pi 5 can run small quantised models via llama.cpp. A 1B parameter model (Llama 3.2 1B Q4) generates responses in 2 to 5 seconds on Pi 5. The rule-based bot in this guide is a better starting point for learning Python. The LLM approach requires significantly more setup. See llama.cpp Raspberry Pi 5 for the full LLM setup.

FAQ

Why not use ChatterBot?

ChatterBot 1.0.4, the version in most Pi chatbot guides, is abandoned. Its last release was in 2020. It requires Python 3.6 to 3.8 and fails to install on Python 3.11 (Bookworm default) due to broken dependency resolution in its setup files. Running it requires downgrading Python, which breaks other system tools. The NLTK-based approach in this guide uses actively maintained libraries that install cleanly on current Raspberry Pi OS.

Can this chatbot connect to the internet for answers?

Yes, with additional code. Add a pattern that triggers an API request when matched. For example, a weather query triggers a call to the OpenWeatherMap API. Use the requests library inside the response function. The pattern-matching architecture makes it straightforward to add API-backed responses alongside the offline patterns.
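A minimal sketch of that idea, building the request URL with only the standard library (the endpoint is OpenWeatherMap's current-weather API; the key is a placeholder you must replace with your own):

```python
import urllib.parse

def weather_url(city, api_key):
    # Build the OpenWeatherMap current-weather request URL
    params = urllib.parse.urlencode(
        {"q": city, "appid": api_key, "units": "metric"}
    )
    return "https://api.openweathermap.org/data/2.5/weather?" + params

# With requests installed: requests.get(weather_url("London", key)).json()
print(weather_url("London", "YOUR_KEY"))
```

Inside get_response(), a matched weather pattern would call this, fetch the JSON, and format a sentence from it instead of returning a canned string.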

Can the chatbot remember previous conversations?

Not across sessions in the current implementation. The conversation history list is in memory and clears when the script restarts. To persist history, write the history list to a JSON file at the end of each session and load it at startup. SQLite is a better option for larger histories. The built-in Python sqlite3 module handles this without any additional packages.
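A minimal persistence sketch using only the standard library (the file name is illustrative):

```python
import json
from pathlib import Path

HISTORY_FILE = Path("pibot_history.json")  # illustrative location

def save_history(history):
    # Persist the in-memory history list as JSON
    HISTORY_FILE.write_text(json.dumps(history, indent=2))

def load_history():
    # Reload past sessions, or start fresh if no file exists yet
    if HISTORY_FILE.exists():
        return json.loads(HISTORY_FILE.read_text())
    return []

history = load_history()
history.append({"role": "user", "text": "hello"})
save_history(history)
print(json.dumps(load_history()[-1]))  # {"role": "user", "text": "hello"}
```

Call load_history() at startup and save_history() when the main loop exits; for long histories, swap the JSON file for an sqlite3 table with the same role/text columns.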

How is this different from running an actual AI model?

This chatbot uses pattern matching. It can only respond to inputs that match a regex in the RESPONSES list. An AI language model generates responses token by token from a statistical model trained on billions of words. The responses feel more natural and handle novel inputs, but require significantly more compute. A 1B parameter quantised model on Pi 5 uses 800MB to 1.5GB RAM and takes 2 to 5 seconds per response. The pattern bot uses under 50MB RAM and responds instantly. Both are useful depending on the project requirements.

Can I add voice input and output?

Yes. Use the speech_recognition library with a USB microphone for voice input, and the pyttsx3 library or the espeak tool for text-to-speech output. Both install in the virtual environment. Replacing the input() call in the main loop with a speech recognition call and the print() call with a speech synthesis call converts the terminal bot to a voice assistant. See Text to Speech Raspberry Pi Piper for the TTS setup.


About the Author

Chuck Wilson has been programming and building with computers since the Tandy 1000 era. His professional background includes CAD drafting, manufacturing line programming, and custom computer design. He runs PidiyLab in retirement, documenting Raspberry Pi and homelab projects that he actually deploys and maintains on real hardware. Every article on this site reflects hands-on testing on specific hardware and OS versions, not theoretical walkthroughs.

Last tested hardware: Raspberry Pi 4 Model B (4GB). Last tested OS: Raspberry Pi OS Bookworm Lite 64-bit. Python 3.11.2, NLTK 3.9, Flask 3.x.
