AI-Powered Dynamic Conversations With NPCs in Godot

Published: December 24, 2025 at 02:41 PM EST
6 min read
Source: Dev.to

Overview

This tutorial shows how to add dynamic, AI‑generated responses to your Godot game based on arbitrary player input, so your NPCs can answer in character when the player talks to them.

You’ll need an LLM server that the game can query. In this tutorial we use a local Ollama server running on the same machine as the game. The workflow is:

  1. Take the player’s text input.
  2. Send a specially‑crafted prompt to the LLM.
  3. Display the LLM’s response on screen.
  4. Keep track of the conversation so each request includes enough context.

What you’ll need

  • Godot 4.x (the example uses 4.5.0)
  • Ollama (any model you like)

Setting Up Your Local LLM with Ollama

Install Ollama

Download Ollama from its official site, ollama.com.

For Linux you can run:

curl -fsSL https://ollama.com/install.sh | sh

Verify the installation:

ollama --version

Install a model

Browse the model library at ollama.com/library.

For this example we use the small gemma3:1b model, but feel free to pick any model you prefer. Larger models give better quality at the cost of speed and resources.

ollama run gemma3:1b

Once the model finishes downloading, you can chat with it in the terminal. Press Ctrl + D to exit.

Start the Ollama server

ollama serve

The server now listens on http://localhost:11434. (If Ollama was installed as a system service, the server may already be running, in which case this command will report the address as in use — that's fine; the server is available either way.)

API request format

To generate a response, POST a JSON body to /api/generate:

{
  "model": "gemma3:1b",
  "prompt": "",
  "stream": false
}
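You can smoke‑test the endpoint from a terminal before wiring up Godot — a quick check, assuming the server is running and gemma3:1b has been pulled. The reply is a JSON object whose response field holds the generated text:

```shell
# Assumes `ollama serve` is running and gemma3:1b is installed.
curl http://localhost:11434/api/generate \
  -H "Content-Type: application/json" \
  -d '{"model": "gemma3:1b", "prompt": "Say hello in one short sentence.", "stream": false}'
```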

Using Ollama Responses From a Godot Project

Scene setup

  1. Create a Node2D node called Npc.
  2. Add an HTTPRequest node as its child.
  3. Enable “Access as Unique Name” on the Npc node.

Create a new script named npc.gd and attach it to the Npc node.

Exported NPC properties

@export var age: int = 67
@export var gender: String = "male"
@export var profession: String = "farmer"
@export var name_: String = "Henry"  # trailing underscore avoids shadowing Node.name

These can be edited in the inspector.

Conversation context

var context: String = ""
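This string grows with every exchange, and small models have tight context windows. One hedged way to keep it bounded — a sketch, where MAX_CONTEXT_CHARS is an invented budget you'd tune for your model — is to drop the oldest lines once the transcript gets too long:

```gdscript
const MAX_CONTEXT_CHARS := 4000  # hypothetical budget; tune for your model

func _append_to_context(line: String) -> void:
    context += line + "\n"
    # Drop whole lines from the front until we're back under the budget.
    while context.length() > MAX_CONTEXT_CHARS:
        var newline := context.find("\n")
        if newline == -1:
            break
        context = context.substr(newline + 1)
```

If you adopt this, replace the direct `context += ...` concatenations below with calls to `_append_to_context()`.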

Function to query Ollama

func respondTo(dialogText: String) -> void:
    # Append the player’s line to the context
    context += "Player said: " + dialogText + "\n"

    var prompt := """
        You are an NPC in a medieval fantasy world named {name}, age {age},
        gender {gender}, profession {profession}. The player, a seasoned mercenary,
        says to you: {dialogText}. Respond in character, output only dialogue
        (no action lines or descriptions), and keep the reply to one or two paragraphs.
        Conversation context:
        {context}
    """.format({
        "name": name_,
        "age": age,
        "gender": gender,
        "profession": profession,
        "dialogText": dialogText,
        "context": context
    })

    var body := {
        "model": "gemma3:1b",
        "prompt": prompt,
        "stream": false
    }

    var json_body := JSON.stringify(body)
    $HTTPRequest.request(
        "http://localhost:11434/api/generate",
        ["Content-Type: application/json"],
        HTTPClient.METHOD_POST,
        json_body
    )
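One caveat: an HTTPRequest node handles a single request at a time, and request() returns an Error code (e.g. ERR_BUSY if the previous request hasn't finished). A hedged refinement of the final call that surfaces failures instead of silently dropping them:

```gdscript
var err := $HTTPRequest.request(
    "http://localhost:11434/api/generate",
    ["Content-Type: application/json"],
    HTTPClient.METHOD_POST,
    json_body
)
if err != OK:
    # Likely ERR_BUSY: the player spoke again before the last reply arrived.
    push_warning("Ollama request not sent (error %d)" % err)
```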

Handling the response

func _on_request_completed(result: int, response_code: int, headers: PackedStringArray, body: PackedByteArray) -> void:
    # In Godot 4, JSON.parse_string() returns the parsed value, or null on failure
    var parsed = JSON.parse_string(body.get_string_from_utf8())
    if typeof(parsed) != TYPE_DICTIONARY or not parsed.has("response"):
        push_error("Failed to parse Ollama response")
        return

    var response_text: String = parsed["response"]
    %TextDisplay.text = response_text
    context += "You said: " + response_text + "\n"

Connect the signal

func _ready() -> void:
    $HTTPRequest.request_completed.connect(_on_request_completed)

UI for displaying the NPC’s reply

  1. Add a Control node to the scene.
  2. Inside it, add a RichTextLabel node named TextDisplay.
  3. Enable “Access as Unique Name” on TextDisplay.
  4. Position and size the label as you like.

Since the Npc node has “Access as Unique Name” enabled, you can reference it from anywhere in the scene and call:

%Npc.respondTo("Hello, traveler! What brings you to these woods?")

The NPC will answer using the LLM, and the conversation will be stored in context so future prompts retain memory of earlier exchanges.

Taking Text Input from the Player

We now need a way for the player to type a message for our NPC to respond to.

  1. Create a LineEdit node inside the Control node.
  2. Adjust its size and place it where you want on the screen.
  3. Attach a new script to the LineEdit called input.gd and add this:

func _process(_delta: float) -> void:
    if Input.is_action_just_pressed("ui_submit"):
        %Npc.respondTo(text)
        text = ""

  4. Define ui_submit in Project → Project Settings → Input Map and map it to the physical Enter key.

Node Hierarchy

Here’s what the final node hierarchy will look like:

Node2D
├── Npc
│   └── HTTPRequest
└── Control
    ├── LineEdit
    └── TextDisplay

Try It Out

Run your scene, type anything into the input box, hit Enter, and watch the NPC respond.
The conversation can continue as long as you like (within your context‑window limit).
The dialogue should retain some consistency thanks to the passed context, though it won’t be 100% perfect.

Example Dialogue

Player: Hey introduce yourself to me.

NPC: Henry. Henry Blackwood. Been tending this land for nigh on sixty years now, you see. My hands are weathered, but my heart… well, my heart still beats with the rhythm of the seasons. And you, sir? A mercenary, are you? A sharp eye, I’ve noticed. I’ve seen more men come and go than stones on a hillside. Don’t expect a lengthy welcome, though. Just… a quiet life, really.

Player: Yes, I am a mercenary. How did you know?

NPC: That’s a rather… bold claim, young one. Let’s just say my eyes have observed a certain restlessness in your stride. You carry yourself with a practiced alertness, a readiness that speaks of journeys undertaken. A man like you, seeking purpose beyond the harvest, wouldn’t be here to waste time with idle pleasantries.

Player: It is true, my purpose is beyond the harvest. I have a shield that needs a repair. Is there a blacksmith in this village?

NPC: Aye, there’s a smith in the village, just past the well. Though he’s a stubborn sort, takes his time. A simple iron plate, a good man, though he charges a decent price. Would you like me to bring you some ale while you wait?

Player: You have my gratitude.

NPC: Thank you, good sir. It is… a simple request, really. A warm ale would be welcome, though. It’s been a long day tending these fields, and a man needs a moment’s rest.

As you can see, some fine‑tuning of the prompt is necessary, but it’s a good start. Better models will perform much better than gemma3:1b.

Flesh It Out

You can go in many interesting directions from here. You could:

  • Add guardrails so the player is less likely to break the AI (a good intro to AI security).
  • Add features that let AI responses impact game state (grant items, heal the character, give quests).
  • Have the LLM decide what each NPC should be doing at any given moment based on variables like time of day.
  • Prompt the model to act as a sort of DM in a D&D campaign instead of a single NPC.

…and so on. The possibilities are basically endless—have fun exploring them!
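As a taste of the game‑state idea, one hedged approach is to prompt the model to answer in structured JSON and parse an optional action field — a sketch only, assuming the prompt reliably yields JSON (small models often won't), with grant_item() standing in for whatever inventory hook your game has:

```gdscript
# Hypothetical sketch: handle an NPC reply of the form
# {"dialogue": "...", "give_item": "ale"} and apply the side effect.
func apply_npc_reply(raw: String) -> String:
    var parsed = JSON.parse_string(raw)
    if typeof(parsed) != TYPE_DICTIONARY:
        return raw  # model ignored the format; treat it as plain dialogue
    if parsed.has("give_item"):
        grant_item(parsed["give_item"])  # hypothetical inventory hook
    return parsed.get("dialogue", raw)
```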
