# AngleCore / ENGO Core: AI Doesn't Need Better Prompts. It Needs Better Patterns.

Published: (April 17, 2026 at 03:02 PM EDT)
21 min read
Source: Dev.to


A Spatial AI System for Pattern-Based Workflows

What I Built

AngleCore (powered by ENGO Core) is a spatial interface for constructing and interpreting AI workflows through patterns instead of prompts.

Most AI systems today rely on text input. This creates a bottleneck:

- Users must translate intent into language
- Language becomes ambiguous
- Systems respond inconsistently

AngleCore removes that layer. Instead of writing prompts, users construct workflows visually using nodes that represent fundamental computational roles: Input, Process, Branch, Memory, Agent, and Output. By interacting with these nodes in a spatial environment, users generate structured intent patterns that can be interpreted and extended by AI. This turns workflow design into something visual, iterative, and reusable.

At its core, ENGO Core acts as the pattern engine behind this system, allowing workflows to be captured, interpreted, replayed, and shared as templates.

OpenClaw was used as the execution and interpretation backbone of AngleCore. Rather than using AI as a direct output generator, I integrated OpenClaw as a pattern interpreter. Users create a sequence of nodes:

INPUT → PROCESS → BRANCH → OUTPUT

Each node carries semantic meaning. The sequence itself becomes a structured representation of intent. When a pattern is created, it is transformed into a structured instruction:

Pattern sequence: INPUT → PROCESS → BRANCH → OUTPUT

Tasks:

  1. Interpret workflow intent
  2. Name the workflow
  3. Suggest next node
  4. Evaluate coherence

OpenClaw processes this structure and returns:

- Workflow interpretation
- Suggested continuation
- Structural evaluation

This is where OpenClaw shifts from being a tool to being a reasoning layer.

The system enforces structured responses: JSON-based interpretation, named workflows, and coherence scoring. This ensures that outputs are not just readable; they are system-compatible and reusable. Each interpreted pattern becomes a template, a reusable workflow, and a potential automation unit. ENGO Core is designed to evolve these patterns into executable pipelines, AI-assisted decision flows, and shareable system components.

The interaction flow: spawn nodes in the spatial field, connect nodes through interaction, build a traversal pattern, trigger interpretation, and receive structured workflow analysis. The interface provides a dynamic node system, pattern trails, real-time interaction feedback, and an AI interpretation panel. (Video demo or live link recommended here.)

The biggest realization was that most AI limitations are not model-based; they are input design problems. By introducing structured patterning, output quality becomes significantly more consistent. Constructing workflows spatially reduces ambiguity, improves iteration speed, and makes complex logic easier to reason about. Using OpenClaw to evaluate, extend, and validate is far more powerful than using it to simply generate text.

Patterns are not just temporary interactions. They can be stored, reused, shared, and expanded. This introduces the idea of workflow ecosystems, not just applications.

AngleCore is an early exploration of a broader idea: AI systems should not rely on better prompts. ENGO Core is the foundation for making that possible.

This is a submission for the OpenClaw Writing Challenge.

We are not lacking intelligence. We are lacking structure. Most AI systems today operate on a simple loop:

Input → Interpretation → Output
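Enforcing a structured response can be as simple as extracting and validating JSON before accepting it. A minimal sketch, assuming the field names shown later in the article; `parseInterpretation` is a hypothetical helper, not part of the project's code:

```javascript
// Hypothetical sketch of response enforcement: strip code fences, parse JSON,
// and validate the fields the system expects before treating the reply as usable.
function parseInterpretation(raw) {
  const clean = raw.replace(/```json|```/g, '').trim();
  let obj;
  try { obj = JSON.parse(clean); } catch { return null; }
  const valid = typeof obj.workflow_name === 'string'
    && typeof obj.interpretation === 'string'
    && Number.isFinite(obj.coherence)
    && obj.coherence >= 1 && obj.coherence <= 10;
  return valid ? obj : null; // reject anything that is not system-compatible
}

const reply = '```json\n{"workflow_name":"Intake Pipeline","interpretation":"Routes input.","next_node":"MEMORY","coherence":8,"insight":"Add memory."}\n```';
const parsed = parseInterpretation(reply);
```

Rejecting malformed replies (returning `null`) is what makes the output reusable as a template rather than a one-off text blob.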

The assumption is that better prompts lead to better outputs. In reality:

- Prompts are inconsistent
- Interpretations vary
- Outputs are unpredictable

The issue is not the AI. It is the interface layer between humans and AI systems. OpenClaw introduces something important: it allows developers to move beyond raw prompting and into designed workflows. Instead of asking "What should I say to get the right result?", we start asking "What structure produces consistent results?" This is a fundamental shift.

A prompt is a one-time instruction. A pattern is repeatable, structured, and interpretable. In systems like AngleCore (built using OpenClaw as the reasoning layer), patterns are constructed through node relationships, sequential logic, and decision pathways.

Example:

INPUT → PROCESS → BRANCH → OUTPUT

This is not a prompt. It is a workflow definition. Patterns introduce properties prompts lack: the same structure produces predictable results; patterns can be stored and reused across contexts; systems can evolve by combining patterns instead of rewriting prompts; and AI can reason about structure more effectively than free-form language.

One of the most overlooked opportunities is how we treat knowledge. Today, tutorials are static and documentation is passive. In a pattern-based system, tutorials become workflows, workflows become templates, and templates become executable units. With OpenClaw, a tutorial can act like an agent: it can guide, adapt, and respond. It becomes part of a living system.

When patterns are created, shared, and interpreted, you don't just get tools. You get an ecosystem: users contribute workflows, workflows evolve into systems, and systems interact with each other. This is where AI moves from tool to platform, and from platform to ecosystem.

The most important realization for me while building with OpenClaw was this: the bottleneck is no longer computation. It is intent. Once intent is structured properly, AI becomes predictable, systems become scalable, and workflows become composable. We don't need better prompts. We need better structure, better patterning, and better ways of interacting with intelligence.

OpenClaw doesn't just improve AI workflows. It opens the door to redefining how those workflows are created in the first place. The next generation of AI systems will not be built on prompts. They will be built on patterns. And the teams that understand that early will define how these systems evolve.

"How I Used OpenClaw": System Architecture Overview

┌──────────────┐
│  USER INPUT  │
│ (Spatial UI) │
└──────┬───────┘
       ↓
┌────────────────────┐
│  PATTERN BUILDER   │
│  (Node Sequences)  │
└──────┬─────────────┘
       ↓
┌────────────────────┐
│ PATTERN TRANSLATOR │
│ (Structured Prompt)│
└──────┬─────────────┘
       ↓
┌────────────────────┐
│    OPENCLAW AI     │
│  (Interpretation)  │
└──────┬─────────────┘
       ↓
┌────────────────────┐
│ STRUCTURED OUTPUT  │
│ (Workflow Insight) │
└────────────────────┘
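The layered flow above can be expressed as a composition of small functions. A hypothetical sketch: the stage names are mine, and the interpret stage is a stub standing in for the AI call rather than a real integration.

```javascript
// Hypothetical pipeline sketch: each architectural layer is one function,
// and the system is their composition. The AI stage is stubbed.
const buildPattern = clicks => clicks.map(c => c.type);        // PATTERN BUILDER
const translate = steps => ({ steps, tasks: ['interpret'] });  // PATTERN TRANSLATOR
const interpret = instruction => ({                            // OPENCLAW AI (stub)
  workflow_name: 'Stub Workflow',
  steps_seen: instruction.steps.length,
});

const insight = interpret(translate(buildPattern([
  { type: 'INPUT' }, { type: 'PROCESS' }, { type: 'OUTPUT' },
])));
```

Because each layer only consumes the previous layer's output, any stage can be swapped (a different builder, a different model) without touching the others.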

This shows:
- You're not using AI directly
- You built a layered system around it

Spatial Pattern Construction

[INPUT]
   ↓
[PROCESS] ──→ [MEMORY]
   ↓             ↓
[BRANCH] ─────→ [AGENT]
   ↓
[OUTPUT]

This represents:
- Non-linear thinking
- Multi-path workflows
- Real system logic (not linear prompts)

Pattern → AI Translation

Visual Pattern: INPUT → PROCESS → BRANCH → OUTPUT

↓ Translated into ↓

Structured Instruction:

{
  "steps": ["INPUT", "PROCESS", "BRANCH", "OUTPUT"],
  "tasks": ["interpret", "name", "extend", "evaluate"]
}

This is your core innovation visualized.

Interaction Loop

[User Builds Pattern]
        ↓
[System Structures Intent]
        ↓
[OpenClaw Interprets]
        ↓
[User Refines Pattern] ↺

This shows:
- Iteration loop
- Learning system
- Not one-shot prompting

Prompt-Based System (Old Model)

User Thought
     ↓
Text Prompt
     ↓
AI Guess
     ↓
Output (Unstable)

Problems: ambiguity, inconsistency, no structure.

Pattern-Based System (Your Model)

User Intent
     ↓
Pattern Structure
     ↓
AI Interpretation
     ↓
Structured Output

Benefits: clarity, repeatability, scalability.

Prompt vs Pattern (Side-by-Side)

PROMPT SYSTEM              PATTERN SYSTEM
──────────────             ──────────────
"Write code for X"         INPUT → PROCESS → OUTPUT
        ↓                          ↓
Ambiguous intent           Structured intent
        ↓                          ↓
Unpredictable output       Interpretable workflow
        ↓                          ↓
Hard to reuse              Easily reusable

Knowledge Evolution Model

Tutorial (Static)
     ↓
Pattern (Structured)
     ↓
Template (Reusable)
     ↓
Agent (Executable)

This connects directly to your idea: "Tutorials become agents."

Ecosystem Vision

User A → Creates Pattern
     ↓
Stored as Template
     ↓
User B → Reuses + Modifies
     ↓
AI → Enhances Pattern
     ↓
System → Evolves

This shows:
- Network effects
- Platform thinking
- Why this scales

The Shift

Past: Interface → Tool → Output
Now: Interface → Pattern → Intelligence
Next: Pattern → Ecosystem → Autonomous Systems

A formal tutorial and system introduction aligned with the OpenClaw model of building through structured, composable intelligence.

AngleCore is not a visualization tool. It is a spatial programming interface. The HTML system you built is effectively a runtime environment where thought becomes structure, and structure becomes a prompt. Instead of writing linear instructions, the user constructs node sequences in space. That sequence is then transformed into a machine-interpretable workflow.

At its core, AngleCore operates on three principles:

  1. Spatial Encoding of Intent
  2. Pattern as Prompt
  3. Interpretation as Execution Layer

This is the shift: from writing instructions to drawing intelligence paths.

The HTML file defines a full client-side system composed of five tightly coupled layers.

The dual canvas system:
- bg-canvas → static spatial grid (polar + radial logic)
- main-canvas → dynamic node simulation

This is not aesthetic. The polar grid establishes radial symmetry (decision branching), circular recursion (feedback loops), and spatial anchoring (the center is the origin of logic). This turns the screen into a coordinate system for reasoning.

Each node is not just a visual object. It is a typed unit of computational intent. Defined types: INPUT, PROCESS, OUTPUT, BRANCH, MEMORY, AGENT. Each node contains:
- identity (id)
- semantic role (type)
- behavior (velocity, interaction states)
- visual encoding (polygon sides, glyphs)
- pattern metadata (inPattern, patternOrder)

This is effectively a graph-based DSL without text.

User interaction creates meaning through:
- Click → focus + implicit inclusion in pattern
- Shift + Click → explicit pattern building
- Right-click → structural manipulation (node spawning, clustering)

The critical structure:

pattern = [N001, N004, N002, …]
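The click-to-pattern rule can be modeled as a small pure function. An illustrative sketch, not the page's actual handler; the function name is hypothetical:

```javascript
// Illustrative sketch of the interaction rule: clicking a node appends it
// to the pattern, and a node already in the pattern is never added twice.
function addToPattern(pattern, nodeId) {
  return pattern.includes(nodeId) ? pattern : [...pattern, nodeId];
}

let pattern = [];
pattern = addToPattern(pattern, 'N001');
pattern = addToPattern(pattern, 'N004');
pattern = addToPattern(pattern, 'N001'); // duplicate click: ignored
```

Keeping the rule pure makes the pattern array trivially storable, replayable, and shareable, which is what the template layer depends on.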

This array is the core data product of the system. Everything else is scaffolding.

The system uses lightweight physics: node repulsion, boundary constraints, velocity damping, and cluster generation. This does two things:

- Prevents visual collapse
- Encourages emergent topology
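The physics described above reduces to a few lines per frame. A minimal sketch of the repulsion-plus-damping step, assuming simple `{ x, y, vx, vy, radius }` nodes (constants here mirror the source code but the function itself is illustrative):

```javascript
// Minimal sketch of the two forces: pairwise repulsion when nodes overlap,
// then velocity damping. Node shape { x, y, vx, vy, radius } is assumed.
function step(nodes) {
  for (const a of nodes) {
    for (const b of nodes) {
      if (a === b) continue;
      const dx = a.x - b.x, dy = a.y - b.y;
      const d = Math.hypot(dx, dy);
      const minD = a.radius + b.radius + 30;
      if (d < minD && d > 0.5) {            // too close: push apart
        const f = ((minD - d) / minD) * 0.15;
        a.vx += (dx / d) * f;
        a.vy += (dy / d) * f;
      }
    }
    a.vx *= 0.98; a.vy *= 0.98;             // damping
    a.x += a.vx; a.y += a.vy;
  }
}

const nodes = [
  { x: 0,  y: 0, vx: 0, vy: 0, radius: 20 },
  { x: 10, y: 0, vx: 0, vy: 0, radius: 20 },
];
for (let i = 0; i < 100; i++) step(nodes);
// the overlapping pair drifts apart over successive frames
```

Damping is what makes the layout settle instead of oscillating: repulsion injects velocity, and the 0.98 factor bleeds it off each frame.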

Meaning is not just clicked into place; it forms organically. This is where the system aligns directly with the OpenClaw philosophy. The pattern is transformed into a structured prompt:

Step 1: Node N001 [INPUT]
Step 2: Node N004 [PROCESS]
Step 3: Node N002 [OUTPUT]

Pattern: INPUT → PROCESS → OUTPUT

Then it is sent to an AI model. The AI returns:
- workflow name
- interpretation
- next node suggestion
- coherence score
- insight

This is not execution. It is semantic compilation. AngleCore does not run workflows; it discovers them.

Most node-based systems are deterministic, require predefined logic, and operate as visual wrappers for code. AngleCore breaks that model:

Traditional Systems            AngleCore
───────────────────            ─────────
Nodes execute logic            Nodes represent intent
Graph defines output           Graph is interpreted
Static workflows               Emergent workflows
Developer-defined meaning      AI-inferred meaning

This is closer to cognitive mapping, symbolic reasoning, and proto-agent orchestration.

OpenClaw is fundamentally about composability, modular intelligence, structured workflows, and interpretable systems. AngleCore uses OpenClaw not as a backend tool, but as a philosophical layer:

- The node pattern = a composable unit
- The structured prompt = a portable workflow
- The interpretation = a reusable insight object

This makes AngleCore an interface for generating OpenClaw-compatible workflows without writing them manually. Users map out processes visually, and the system names and formalizes them. Instead of writing pipelines, users draw agent interactions. Patterns reveal missing steps, redundant loops, and broken logic chains. Clearing agents, analysts, or operators can simulate input → validation → processing → output chains without writing a single script.

There are three non-obvious innovations in this system.

  1. This is the real breakthrough: you are not prompting AI directly. You are constructing prompts through interaction. That unlocks non-technical users, visual reasoning, and system-level thinking.
  2. Nodes don't execute. They represent intent categories. That allows abstraction, flexibility, and reinterpretation across contexts.
  3. Most systems use AI to generate, automate, and predict. AngleCore uses AI to understand structure, assign meaning, and suggest evolution. This is closer to AI as a reasoning partner, not a tool.

To try it: use the spawn button or right-click to create nodes. Click nodes in sequence, or use Shift + Click for controlled pattern construction. The system displays:

INPUT → PROCESS → OUTPUT

This is your implicit workflow. Click "Interpret Pattern". The system:
- builds a structured prompt
- sends it to the AI
- returns a semantic breakdown

Add nodes, branch paths, test variations. You are not building code. You are exploring a possibility space.

AngleCore is an early form of something larger: spatial programming environments, AI-native interfaces, and intent-driven system design. If extended, this becomes a multi-agent orchestration layer, a real-time workflow compiler, and a visual language for AI systems.

AngleCore is not a product yet. It is a new interaction model: a system where users don't write workflows. They trace them, and AI makes them legible. That aligns directly with the direction OpenClaw is pushing: less syntax, more structure, composable intelligence.

Focus remained on building and formalizing the system architecture. This is Phase I. What matters is not the interface alone; it is that interaction itself can be compiled into intelligence.

AngleCore — Spatial AI System

@import url('https://fonts.googleapis.com/css2?family=Syne:wght@400;600;800&family=Space+Mono:wght@400;700&display=swap');

:root {
  --void: #02040a;
  --deep: #060b14;
  --field: #0a1120;
  --grid: rgba(30,80,160,0.12);
  --node-idle: rgba(20,60,140,0.7);
  --node-hover: rgba(40,120,255,0.9);
  --node-focus: rgba(80,200,255,1);
  --node-active: rgba(0,255,180,1);
  --edge: rgba(40,100,200,0.3);
  --edge-hot: rgba(0,220,180,0.7);
  --text-dim: rgba(100,160,255,0.5);
  --text-bright: rgba(180,220,255,0.95);
  --text-gold: rgba(255,200,80,0.9);
  --accent: rgba(0,255,180,0.8);
  --danger: rgba(255,60,80,0.8);
  --glow: 0 0 20px rgba(40,120,255,0.4);
  --glow-hot: 0 0 40px rgba(0,255,180,0.6);
}

* { margin: 0; padding: 0; box-sizing: border-box; }

body { background: var(--void); font-family: 'Space Mono', monospace; color: var(--text-bright); overflow: hidden; height: 100vh; width: 100vw; cursor: crosshair; user-select: none; }

#field { position: absolute; inset: 0; overflow: hidden; }

#bg-canvas { position: absolute; inset: 0; opacity: 0.6; }

#main-canvas { position: absolute; inset: 0; }

/* HUD */ #hud { position: absolute; top: 0; left: 0; right: 0; display: flex; justify-content: space-between; align-items: flex-start; padding: 20px 28px; pointer-events: none; z-index: 10; }

#logo { font-family: 'Syne', sans-serif; font-weight: 800; font-size: 22px; letter-spacing: 4px; color: var(--text-bright); text-transform: uppercase; pointer-events: auto; } #logo span { color: var(--accent); }

#phase-badge { font-size: 9px; letter-spacing: 3px; color: var(--text-dim); margin-top: 4px; text-transform: uppercase; }

#status-bar { text-align: right; font-size: 10px; color: var(--text-dim); letter-spacing: 1px; line-height: 1.8; }

#status-bar .val { color: var(--accent); }

/* Pattern trail display */ #pattern-display { position: absolute; bottom: 20px; left: 28px; right: 28px; display: flex; align-items: flex-end; gap: 0; pointer-events: none; z-index: 10; }

#pattern-chain { display: flex; align-items: center; gap: 6px; flex-wrap: wrap; flex: 1; }

.pattern-node-badge { font-size: 9px; letter-spacing: 2px; padding: 3px 8px; border: 1px solid rgba(0,255,180,0.3); color: var(--accent); background: rgba(0,255,180,0.05); font-family: 'Space Mono', monospace; transition: all 0.3s; }

.pattern-arrow { color: var(--text-dim); font-size: 9px; }

/* AI Response Panel */ #ai-panel { position: absolute; right: 28px; top: 50%; transform: translateY(-50%); width: 300px; background: rgba(6,11,20,0.95); border: 1px solid rgba(40,100,200,0.3); border-left: 2px solid var(--accent); padding: 20px; z-index: 20; display: none; box-shadow: -10px 0 60px rgba(0,0,0,0.8), inset 0 0 40px rgba(0,20,60,0.5); max-height: 70vh; overflow-y: auto; }

#ai-panel.visible { display: block; animation: slideIn 0.3s ease; }

@keyframes slideIn { from { opacity: 0; transform: translateY(-50%) translateX(20px); } to { opacity: 1; transform: translateY(-50%) translateX(0); } }

#ai-panel-header { font-family: 'Syne', sans-serif; font-size: 10px; letter-spacing: 3px; color: var(--text-dim); text-transform: uppercase; margin-bottom: 12px; display: flex; justify-content: space-between; align-items: center; }

#ai-panel-close { cursor: pointer; color: var(--text-dim); font-size: 14px; pointer-events: auto; transition: color 0.2s; } #ai-panel-close:hover { color: var(--danger); }

#ai-response-text { font-size: 11px; line-height: 1.8; color: var(--text-bright); white-space: pre-wrap; }

#ai-response-text .thinking { color: var(--text-dim); font-style: italic; }

#ai-prompt-preview { margin-top: 16px; padding: 10px; background: rgba(0,30,80,0.5); border-left: 2px solid rgba(40,100,200,0.4); font-size: 9px; color: var(--text-dim); line-height: 1.6; }

/* Context menu */ #context-menu { position: absolute; background: rgba(6,11,20,0.97); border: 1px solid rgba(40,100,200,0.4); padding: 6px 0; z-index: 30; display: none; min-width: 160px; box-shadow: 0 10px 40px rgba(0,0,0,0.8); }

#context-menu.visible { display: block; }

.ctx-item { padding: 7px 16px; font-size: 10px; letter-spacing: 1px; color: var(--text-dim); cursor: pointer; transition: all 0.15s; } .ctx-item:hover { background: rgba(0,255,180,0.08); color: var(--accent); }

.ctx-divider { height: 1px; background: rgba(40,100,200,0.2); margin: 4px 0; }

/* Node label tooltip */ #node-tooltip { position: absolute; background: rgba(6,11,20,0.95); border: 1px solid rgba(40,100,200,0.3); padding: 6px 12px; font-size: 9px; letter-spacing: 2px; color: var(--text-bright); pointer-events: none; z-index: 25; display: none; text-transform: uppercase; }

/* Spawn button */ #spawn-btn { position: absolute; bottom: 70px; right: 28px; background: transparent; border: 1px solid rgba(0,255,180,0.4); color: var(--accent); font-family: 'Space Mono', monospace; font-size: 9px; letter-spacing: 3px; padding: 10px 18px; cursor: pointer; z-index: 10; transition: all 0.2s; text-transform: uppercase; } #spawn-btn:hover { background: rgba(0,255,180,0.1); box-shadow: 0 0 20px rgba(0,255,180,0.3); }

#interpret-btn { position: absolute; bottom: 110px; right: 28px; background: transparent; border: 1px solid rgba(40,120,255,0.4); color: rgba(140,180,255,0.9); font-family: 'Space Mono', monospace; font-size: 9px; letter-spacing: 3px; padding: 10px 18px; cursor: pointer; z-index: 10; transition: all 0.2s; text-transform: uppercase; display: none; } #interpret-btn.visible { display: block; } #interpret-btn:hover { background: rgba(40,120,255,0.1); box-shadow: 0 0 20px rgba(40,120,255,0.3); } #interpret-btn:disabled { opacity: 0.4; cursor: not-allowed; }

#clear-btn { position: absolute; bottom: 20px; right: 28px; background: transparent; border: 1px solid rgba(255,60,80,0.2); color: rgba(255,60,80,0.5); font-family: 'Space Mono', monospace; font-size: 9px; letter-spacing: 3px; padding: 8px 14px; cursor: pointer; z-index: 10; transition: all 0.2s; text-transform: uppercase; } #clear-btn:hover { border-color: rgba(255,60,80,0.5); color: rgba(255,60,80,0.9); background: rgba(255,60,80,0.05); }

#loading-overlay { position: absolute; inset: 0; background: rgba(2,4,10,0.85); z-index: 50; display: flex; flex-direction: column; align-items: center; justify-content: center; gap: 12px; }

#loading-overlay.hidden { display: none; }

.loading-ring { width: 48px; height: 48px; border: 2px solid rgba(40,120,255,0.2); border-top-color: var(--accent); border-radius: 50%; animation: spin 0.8s linear infinite; }

@keyframes spin { to { transform: rotate(360deg); } }

.loading-text { font-size: 9px; letter-spacing: 4px; color: var(--text-dim); text-transform: uppercase; }

/* Scrollbar */ #ai-panel::-webkit-scrollbar { width: 3px; } #ai-panel::-webkit-scrollbar-track { background: transparent; } #ai-panel::-webkit-scrollbar-thumb { background: rgba(40,100,200,0.3); }

AngleCore
Spatial AI System · Phase I


NODES: 0  
PATTERN: —

FOCUS: NONE  
FIELD: OPEN





AI Interpretation

+ Spawn Node Here ⬡ Spawn Cluster

◎ Clear Pattern ↺ Reset Field

⊕ Spawn Node ◈ Interpret Pattern ✕ Clear

Interpreting Pattern

// ============================================================
// ANGLECORE — Spatial AI System — Phase I
// ============================================================

const bgCanvas = document.getElementById('bg-canvas');
const mainCanvas = document.getElementById('main-canvas');
const bgCtx = bgCanvas.getContext('2d');
const ctx = mainCanvas.getContext('2d');

// State
let W, H, cx, cy;
let nodes = [];
let edges = [];
let pattern = []; // sequence of clicked node ids
let focusedNode = null;
let hoveredNode = null;
let mouseX = 0, mouseY = 0;
let panX = 0, panY = 0;
let isPanning = false;
let panStartX, panStartY;
let zoom = 1;
let frameCount = 0;
let contextMenuOpen = false;
let contextX = 0, contextY = 0;

const NODE_TYPES = [
  { label: 'INPUT',   color: '#1e6aff', glyph: '▷', sides: 3 },
  { label: 'PROCESS', color: '#00ffd4', glyph: '◈', sides: 6 },
  { label: 'OUTPUT',  color: '#ffb800', glyph: '◉', sides: 4 },
  { label: 'BRANCH',  color: '#ff3c64', glyph: '⬡', sides: 5 },
  { label: 'MEMORY',  color: '#9b6fff', glyph: '∿', sides: 8 },
  { label: 'AGENT',   color: '#00ff9d', glyph: '⊛', sides: 7 },
];

// ---- Resize ----
function resize() {
  W = window.innerWidth; H = window.innerHeight;
  cx = W/2; cy = H/2;
  bgCanvas.width = mainCanvas.width = W;
  bgCanvas.height = mainCanvas.height = H;
  drawBackground();
}

// ---- Background: polar grid ----
function drawBackground() {
  bgCtx.clearRect(0,0,W,H);

// Deep void gradient
const grad = bgCtx.createRadialGradient(cx,cy,0, cx,cy, Math.max(W,H)*0.7);
grad.addColorStop(0, 'rgba(8,18,40,1)');
grad.addColorStop(0.5, 'rgba(4,10,22,1)');
grad.addColorStop(1, 'rgba(2,4,10,1)');
bgCtx.fillStyle = grad;
bgCtx.fillRect(0,0,W,H);

// Polar rings
bgCtx.strokeStyle = 'rgba(20,60,160,0.12)';
bgCtx.lineWidth = 1;
for (let r = 80; r < Math.max(W,H); r += 80) {
  bgCtx.beginPath();
  bgCtx.arc(cx, cy, r, 0, Math.PI*2);
  bgCtx.stroke();
}
}

// ---- Draw node ----
function drawNode(node) {
  const { x, y } = node;
  const r = node.radius;
  ctx.save();
  // …
  // Pattern-order badge
  if (node.patternOrder >= 0) {
    const bx = x + r*0.7, by = y - r*0.7;
    ctx.fillStyle = 'rgba(0,255,180,0.9)';
    ctx.beginPath(); ctx.arc(bx,by,8,0,Math.PI*2); ctx.fill();
    ctx.font = 'bold 8px Space Mono';
    ctx.fillStyle = '#000';
    ctx.fillText(node.patternOrder+1, bx, by);
  }

ctx.restore(); }

// ---- Draw edge ----
function drawEdge(edge) {
  const a = nodes.find(n=>n.id===edge.from);
  const b = nodes.find(n=>n.id===edge.to);
  if (!a||!b) return;

  const dx = b.x-a.x, dy = b.y-a.y;
  const dist = Math.sqrt(dx*dx+dy*dy);
  // …
}

// ---- Draw pattern trail ----
function drawPatternTrail() {
  ctx.save();
  for (let i = 0; i < pattern.length-1; i++) {
    const a = nodes.find(n=>n.id===pattern[i]);
    const b = nodes.find(n=>n.id===pattern[i+1]);
    if (!a||!b) continue;

const prog = i/(pattern.length-1);
const alpha = 0.4 + 0.4*prog;

ctx.strokeStyle = `rgba(0,255,180,${alpha})`;
ctx.lineWidth = 2 - prog;
ctx.setLineDash([5,8]);
ctx.shadowColor = 'rgba(0,255,180,0.3)';
ctx.shadowBlur = 6;

ctx.beginPath();
ctx.moveTo(a.x, a.y);
ctx.lineTo(b.x, b.y);
ctx.stroke();
ctx.setLineDash([]);

}

ctx.restore(); }

// ---- Main render loop ----
function render() {
  ctx.clearRect(0,0,W,H);
  frameCount++;

// Physics
for (const node of nodes) {
  // Fade in
  node.opacity = Math.min(1, node.opacity + 0.04);

// Rotate
node.angle += node.angleSpeed;

// Gentle float
node.x += node.vx;
node.y += node.vy;

// Boundary repulsion
const margin = 60;
if (node.x < margin) node.vx += 0.08;
if (node.x > W-margin) node.vx -= 0.08;
if (node.y < margin) node.vy += 0.08;
if (node.y > H-margin) node.vy -= 0.08;

// Damping
node.vx *= 0.98;
node.vy *= 0.98;

// Node-node repulsion
for (const other of nodes) {
  if (other.id === node.id) continue;
  const dx = node.x - other.x;
  const dy = node.y - other.y;
  const d = Math.sqrt(dx*dx+dy*dy);
  const minD = node.radius + other.radius + 30;
  if (d < minD && d > 0.5) {
    const f = (minD-d)/minD * 0.15;
    node.vx += (dx/d)*f;
    node.vy += (dy/d)*f;
  }
}

// Mouse proximity
const mdx = mouseX - node.x;
const mdy = mouseY - node.y;
const md = Math.sqrt(mdx*mdx+mdy*mdy);
const expandZone = 100;

if (md < expandZone) {
  // …
}
}

const sorted = [...nodes].sort((a,b) => (a.state==='focused'?1:-1)-(b.state==='focused'?1:-1));

for (const node of sorted) drawNode(node);

// Cursor indicator
ctx.save();
ctx.strokeStyle = hoveredNode ? 'rgba(0,255,180,0.6)' : 'rgba(40,100,200,0.3)';
ctx.lineWidth = 1;
ctx.beginPath();
ctx.arc(mouseX, mouseY, hoveredNode ? 20 : 10, 0, Math.PI*2);
ctx.stroke();
if (!hoveredNode) {
  ctx.strokeStyle = 'rgba(40,100,200,0.15)';
  ctx.beginPath();
  ctx.arc(mouseX, mouseY, 40, 0, Math.PI*2);
  ctx.stroke();
}
ctx.restore();

// Update HUD
updateHUD();

requestAnimationFrame(render);
}

function updateHUD() {
  document.getElementById('stat-nodes').textContent = nodes.length;
  document.getElementById('stat-pattern').textContent = pattern.length > 0 ? pattern.join('→') : '—';
  document.getElementById('stat-focus').textContent = focusedNode ? focusedNode.id : 'NONE';

// Pattern chain display
const chain = document.getElementById('pattern-chain');
if (pattern.length !== chain.children.length / 2 + (chain.children.length > 0 ? 0.5 : 0)) {
  chain.innerHTML = '';
  pattern.forEach((id, i) => {
    if (i > 0) {
      const arr = document.createElement('span');
      arr.className = 'pattern-arrow';
      arr.textContent = '→';
      chain.appendChild(arr);
    }

  const badge = document.createElement('span');
  badge.className = 'pattern-node-badge';
  const n = nodes.find(x=>x.id===id);
  badge.textContent = n ? `${id} ${n.type.glyph}` : id;
  chain.appendChild(badge);
});

} }

// ---- Input ----
mainCanvas.addEventListener('mousemove', e => {
  mouseX = e.clientX;
  mouseY = e.clientY;

  const tooltip = document.getElementById('node-tooltip');
  if (hoveredNode) {
    tooltip.style.display = 'block';
    tooltip.style.left = (e.clientX+16)+'px';
    tooltip.style.top = (e.clientY-10)+'px';
    tooltip.textContent = `${hoveredNode.id} · ${hoveredNode.type.label}`;
  } else {
    tooltip.style.display = 'none';
  }
});

mainCanvas.addEventListener('click', e => {
  if (contextMenuOpen) { closeContextMenu(); return; }

if (hoveredNode) {
  // Add to pattern or toggle focus
  if (e.shiftKey) {
    // Shift+click = add to pattern
    if (!pattern.includes(hoveredNode.id)) {
      pattern.push(hoveredNode.id);
      hoveredNode.inPattern = true;
      hoveredNode.patternOrder = pattern.length-1;
      updateEdgeHotness();
    }
  } else {
    // Click = set focus
    if (focusedNode) { focusedNode.state = 'idle'; focusedNode = null; }
    hoveredNode.state = 'focused';
    focusedNode = hoveredNode;

  // Auto-add to pattern
  if (!pattern.includes(hoveredNode.id)) {
    pattern.push(hoveredNode.id);
    hoveredNode.inPattern = true;
    hoveredNode.patternOrder = pattern.length-1;
    updateEdgeHotness();
  }
}

// Show interpret button if pattern >= 2
document.getElementById('interpret-btn').className = pattern.length >= 2 ? 'visible' : '';

} });

mainCanvas.addEventListener('contextmenu', e => {
  e.preventDefault();
  contextX = e.clientX;
  contextY = e.clientY;
  const menu = document.getElementById('context-menu');
  menu.style.left = contextX+'px';
  menu.style.top = contextY+'px';
  menu.className = 'visible';
  contextMenuOpen = true;
});

function closeContextMenu() {
  document.getElementById('context-menu').className = '';
  contextMenuOpen = false;
}

document.addEventListener('keydown', e => {
  if (e.key === 'Escape') {
    closeContextMenu();
    if (focusedNode) { focusedNode.state = 'idle'; focusedNode = null; }
  }
  if (e.key === 'n' || e.key === 'N') spawnNodeAtMouse();
  if (e.key === 'c' || e.key === 'C') clearPattern();
});

function updateEdgeHotness() {
  // Mark edges along the pattern as hot
  for (const edge of edges) {
    edge.hot = false;
    for (let i = 0; i < pattern.length-1; i++) {
      if ((edge.from === pattern[i] && edge.to === pattern[i+1]) ||
          (edge.to === pattern[i] && edge.from === pattern[i+1])) edge.hot = true;
    }
  }
}

// ---- Interpret pattern ----
async function interpretPattern() {
  const btn = document.getElementById('interpret-btn');
  btn.disabled = true;
  document.getElementById('loading-overlay').className = '';

  const patternNodes = pattern.map(id => {
    const n = nodes.find(x=>x.id===id);
    return n ? { id: n.id, type: n.type.label, glyph: n.type.glyph } : { id };
  });

const structuredPrompt = `You are the AI core of AngleCore, a spatial workflow system.

The user has constructed a node traversal pattern on the spatial field:
${patternNodes.map((n,i)=>`Step ${i+1}: Node ${n.id} [${n.type}] ${n.glyph}`).join('\n')}

Pattern sequence: ${patternNodes.map(n=>`${n.type}`).join(' → ')}

Based on this spatial interaction pattern:

  1. Interpret what workflow or intent this pattern represents (2-3 sentences)
  2. Name this workflow pattern (short, evocative name)
  3. Suggest the next logical node type to add
  4. Rate the pattern coherence 1-10

Respond in this exact JSON format:
{ "workflow_name": "…", "interpretation": "…", "next_node": "…", "coherence": 8, "insight": "…" }`;

const aiPanel = document.getElementById('ai-panel');
const responseText = document.getElementById('ai-response-text');
const promptPreview = document.getElementById('ai-prompt-preview');

try {
  const response = await fetch('https://api.anthropic.com/v1/messages', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      model: 'claude-sonnet-4-20250514',
      max_tokens: 1000,
      messages: [{ role: 'user', content: structuredPrompt }]
    })
  });

const data = await response.json();
const raw = data.content.map(i=>i.text||'').join('');
let parsed;

try {
      const clean = raw.replace(/```json|```/g, '').trim();
      parsed = JSON.parse(clean);
    } catch {
      parsed = null;
    }

    document.getElementById('loading-overlay').className = 'hidden';
    aiPanel.className = 'visible';

    if (parsed) {
      responseText.innerHTML = `
${parsed.workflow_name}
${parsed.interpretation}
NEXT SUGGESTED NODE
${parsed.next_node}
INSIGHT
${parsed.insight}

  COHERENCE
  
    
  
  ${parsed.coherence}/10
`;
    } else {
      responseText.innerHTML = `${raw}`;
    }

    promptPreview.innerHTML = `STRUCTURED PROMPT${patternNodes.map(n=>`${n.type}`).join(' → ')}`;

  } catch (err) {
    document.getElementById('loading-overlay').className = 'hidden';
    aiPanel.className = 'visible';
    responseText.innerHTML = `Connection error. Check API access.

${err.message}`;
  }

  btn.disabled = false;
}

// ---- Buttons & Context ----
document.getElementById('spawn-btn').addEventListener('click', () => spawnNodeAtMouse(cx + (Math.random()-0.5)*200, cy + (Math.random()-0.5)*200));
document.getElementById('interpret-btn').addEventListener('click', interpretPattern);
document.getElementById('clear-btn').addEventListener('click', clearPattern);
document.getElementById('ai-panel-close').addEventListener('click', () => {
  document.getElementById('ai-panel').className = '';
});

document.getElementById('ctx-add-node').addEventListener('click', () => {
  spawnNodeAtMouse(contextX, contextY);
  closeContextMenu();
});
document.getElementById('ctx-add-cluster').addEventListener('click', () => {
  spawnCluster(contextX, contextY);
  closeContextMenu();
});
document.getElementById('ctx-clear-pattern').addEventListener('click', () => { clearPattern(); closeContextMenu(); });
document.getElementById('ctx-reset-field').addEventListener('click', () => { resetField(); closeContextMenu(); });

// ---- Utility ----
function hexToRgba(hex, alpha) {
  const r = parseInt(hex.slice(1,3),16);
  const g = parseInt(hex.slice(3,5),16);
  const b = parseInt(hex.slice(5,7),16);
  return `rgba(${r},${g},${b},${alpha})`;
}

// ---- Init ----
function initField() {
  // Spawn initial nodes in spiral arrangement
  const count = 7;
  for (let i = 0; i < count; i++) {
    const angle = (i / count) * Math.PI * 2;
    const radius = 140 + i * 30;
    spawnNodeAtMouse(cx + Math.cos(angle) * radius, cy + Math.sin(angle) * radius);
  }
}

window.addEventListener('resize', () => { resize(); });
resize();
initField();
render();