Self-Hosted AI in 2026: Automating Your Linux Workflow with Ollama and Bash

Published: February 16, 2026, 06:49 AM EST
3 min read
Source: Dev.to

Why Self-Host AI for Automation?

  • Privacy – Your system logs, internal scripts, and server configurations never leave your machine.
  • Low Latency – No network round‑trips to external servers; everything runs on localhost.
  • Cost – Run models like Llama 3 or Mistral for the cost of electricity.
  • Offline Capability – Your automation works even when the internet doesn’t.

Setting Up the Foundation: Ollama + systemd

Installation

curl -fsSL https://ollama.com/install.sh | sh

The systemd Service

Ollama’s installer usually creates a service for you. Verify it and enable it to start at boot:

sudo systemctl daemon-reload
sudo systemctl enable ollama
sudo systemctl start ollama

Check the status:

systemctl status ollama
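If you later need to change the port or bind address the API listens on, a systemd drop‑in is a clean way to do it that survives package upgrades. `OLLAMA_HOST` is the environment variable Ollama reads for its bind address; the file below is a minimal sketch you would create via `sudo systemctl edit ollama`:

```ini
# /etc/systemd/system/ollama.service.d/override.conf
[Service]
# Keep the API bound to localhost only (the default); change the port if needed.
Environment="OLLAMA_HOST=127.0.0.1:11434"
```

After saving, apply the change with `sudo systemctl daemon-reload && sudo systemctl restart ollama`.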

Leveling Up: The “AI‑SysAdmin” Bash Script

The ask-ai.sh Script

#!/bin/bash

# A simple wrapper to interact with local Ollama for Linux tasks
MODEL="llama3"   # You can use mistral, codellama, etc.
PROMPT="You are a senior Linux SysAdmin. Output only the bash commands needed to fulfill the request. No explanation unless asked."

if [ -z "$1" ]; then
    echo "Usage: ./ask-ai.sh 'Your request here'"
    exit 1
fi

REQUEST="$1"

# Combine context and request
FULL_PROMPT="$PROMPT

Request: $REQUEST"

# Build the JSON payload with jq so quotes, backslashes, and newlines in
# the request are escaped correctly (jq is required for parsing anyway)
PAYLOAD=$(jq -n --arg model "$MODEL" --arg prompt "$FULL_PROMPT" \
    '{model: $model, prompt: $prompt, stream: false}')

# Call the Ollama API via curl (local only)
RESPONSE=$(curl -s http://localhost:11434/api/generate -d "$PAYLOAD")

# Extract the generated text (requires jq)
COMMAND=$(echo "$RESPONSE" | jq -r '.response')

echo -e "\033[1;34mAI Suggestion:\033[0m"
echo "$COMMAND"

# Ask for confirmation before executing.
# Note: eval runs whatever the model produced -- review it first.
read -p "Execute these commands? (y/N): " confirm
if [[ $confirm == [yY] || $confirm == [yY][eE][sS] ]]; then
    eval "$COMMAND"
else
    echo "Execution cancelled."
fi
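One practical wrinkle: models often wrap their answer in markdown code fences (```bash … ```), which would break `eval`. A minimal filter you could pipe the extracted text through before executing it is sketched below (`strip_fences` is a hypothetical helper, and the fence-wrapping habit varies by model):

```shell
# strip_fences: drop markdown code-fence lines (```bash, ```) from stdin,
# leaving only the commands themselves
strip_fences() {
    grep -v '^```'
}
```

In the script above, you would apply it as `COMMAND=$(echo "$RESPONSE" | jq -r '.response' | strip_fences)`.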

How to Use It

  1. Make the script executable: chmod +x ask-ai.sh.
  2. Install jq (JSON parser): sudo apt install jq (or the equivalent for your distro).
  3. Run a request, e.g.:
./ask-ai.sh "Find all files larger than 100MB in /var/log and show their sizes"

Pro‑Tip: Scheduling Maintenance

Combine the script with cron or systemd timers for periodic maintenance. Example cron job that runs every Sunday at 3 AM:

0 3 * * 0 /path/to/ask-ai.sh "Check for broken symlinks in /usr/local/bin and log them to /var/log/ai_cleanup.log" --auto

(Add an --auto flag to the script to bypass the confirmation prompt for automated tasks.)
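One way to sketch that flag, assuming a hypothetical `should_prompt` helper that scans the script's arguments (this is not part of the script above, just one possible shape):

```shell
# should_prompt: succeed (exit 0) unless --auto appears among the
# arguments, in which case the caller should skip the confirmation prompt
should_prompt() {
    local arg
    for arg in "$@"; do
        if [ "$arg" = "--auto" ]; then
            return 1   # --auto given: do not prompt
        fi
    done
    return 0           # no --auto: ask before executing
}

# In ask-ai.sh, the confirmation block would then become:
# if should_prompt "$@"; then
#     read -p "Execute these commands? (y/N): " confirm
#     ...
# else
#     eval "$COMMAND"
# fi
```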

Summary and References

Self‑hosting AI isn’t just about chatting; it’s about building a smarter, more autonomous environment. By wrapping Ollama in Bash, you turn a language model into a functional member of your DevOps toolkit.
