Dynamic UI for dynamic AI: Inside the emerging A2UI model
Source: VentureBeat
Introduction
Agentic AI enables businesses to operate more dynamically. Unlike traditional pre‑programmed bots and static rules, agents can “think” and devise alternate paths when faced with unseen conditions. Leveraging a business domain ontology such as FIBO (Financial Industry Business Ontology) helps keep agents within guardrails and prevents unwanted behavior.
The UX Bottleneck
While agents adapt to data drift guided by ontologies, the user experience (UX) layer often remains static. Fixed fields and configurations can restrict the creative freedom of agents. Modern standards like AG‑UI (the Agent‑User Interaction Protocol) streamline communication between the UX layer and agents, but the screens themselves still need to be defined at design time.
A2UI: Agent‑to‑User‑Interface
A newer technology, A2UI (Agent‑to‑User‑Interface), takes this a step further by allowing agents to render user screens dynamically based on content they generate.
- UX schema: A loosely coupled schema defines how components should be rendered.
- Renderer: Agents communicate with an A2UI‑compliant renderer that builds interactive screens from JSON produced on the fly.
- Bidirectional communication: Screens can send events back to the originating agents via AG‑UI.
Companies such as CopilotKit are building A2UI renderers that construct UI from JSON specifications and wire them back to agents through AG‑UI.
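As a sketch of the idea, an agent might emit a component tree as JSON and a renderer walk it into markup while registering which controls can send events back. The schema fields and component names below ("panel", "textField", and so on) are illustrative assumptions, not the official A2UI specification:

```typescript
// Illustrative A2UI-style payload plus a minimal renderer sketch.
// Schema fields here are assumptions, not the official A2UI spec.

type UIComponent =
  | { type: "panel"; title: string; children: UIComponent[] }
  | { type: "textField"; id: string; label: string; value?: string }
  | { type: "button"; id: string; label: string };

// A payload an agent might generate on the fly:
const payload: UIComponent = {
  type: "panel",
  title: "Loan Review",
  children: [
    { type: "textField", id: "borrower", label: "Borrower", value: "Acme Corp" },
    { type: "button", id: "approve", label: "Approve" },
  ],
};

// Minimal renderer: turns the JSON tree into an HTML string and records
// which component ids should be wired back to the agent (via AG-UI).
function render(c: UIComponent, events: string[]): string {
  switch (c.type) {
    case "panel":
      return `<section><h2>${c.title}</h2>${c.children
        .map((child) => render(child, events))
        .join("")}</section>`;
    case "textField":
      return `<label>${c.label}<input id="${c.id}" value="${c.value ?? ""}"></label>`;
    case "button":
      events.push(c.id); // this control can emit events back to the agent
      return `<button id="${c.id}">${c.label}</button>`;
  }
}

const eventIds: string[] = [];
const html = render(payload, eventIds);
```

Because the screen is described as data rather than code, the agent can regenerate it whenever the situation changes, and the renderer stays generic.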
Compression with TOON
Newer compact serialization formats like Token‑Oriented Object Notation (TOON) encode JSON payloads with far fewer tokens, making it practical to include ontologies and A2UI specifications in context prompts. As these formats spread through training corpora, future models may learn during pre‑training to generate screens that comply with A2UI and AG‑UI natively.
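The intuition behind such formats can be sketched with a simple tabular encoding for arrays of flat objects. This is not the official TOON grammar, only an illustration of why declaring field names once shrinks token counts versus verbose JSON:

```typescript
// Illustrative only: a TOON-like tabular encoding. The real TOON grammar
// differs; this just shows why such encodings save tokens over JSON.

type Row = Record<string, string | number>;

function encodeTabular(name: string, rows: Row[]): string {
  const keys = Object.keys(rows[0]);
  // One header line declares the field names once...
  const header = `${name}[${rows.length}]{${keys.join(",")}}:`;
  // ...then each row is a bare comma-separated tuple.
  const body = rows.map((r) => "  " + keys.map((k) => String(r[k])).join(","));
  return [header, ...body].join("\n");
}

const loans: Row[] = [
  { id: 1, borrower: "Acme Corp", rate: 5.2 },
  { id: 2, borrower: "Globex", rate: 4.8 },
];

const compact = encodeTabular("loans", loans);
const verbose = JSON.stringify(loans, null, 2);
// The compact form repeats no field names, so it is much shorter:
console.log(compact.length < verbose.length); // true
```

Multiplied across an ontology with hundreds of entities, that saving is what makes embedding the full spec in a prompt feasible.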
Architectural Overview
At a high level, one view of this architecture looks like this:
- Business ontology (e.g., FIBO) defines core concepts such as loans, parties, interest terms, covenants, and conditions.
- A2UI specification describes how UI components should be rendered for those concepts.
- Agents produce JSON payloads that conform to the A2UI schema.
- Renderer consumes the JSON, builds interactive screens, and maintains a connection to the originating agent via AG‑UI.
- Interaction loop: User actions (button clicks, form submissions) are sent back to the agent, enabling real‑time responses within a single pane of glass—often a traditional chatbot interface.
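The interaction loop at the end of that list can be sketched as a simple event round‑trip. The event shape and handler signature below are assumptions for illustration; AG‑UI defines its own event protocol:

```typescript
// Sketch of the interaction loop: the renderer forwards user actions to
// the originating agent, which responds in real time. The event shape
// and handler names are illustrative, not the AG-UI wire format.

interface UIEvent {
  componentId: string;              // which rendered control fired
  action: "click" | "submit";
  values: Record<string, string>;   // current form state
}

type AgentHandler = (e: UIEvent) => string;

const agentHandler: AgentHandler = (e) => {
  // The agent can react by generating a follow-up screen or a status
  // message for the same chat pane ("single pane of glass").
  if (e.action === "submit") {
    return `Received ${e.componentId} with ${Object.keys(e.values).length} fields`;
  }
  return `Clicked ${e.componentId}`;
};

const reply = agentHandler({
  componentId: "loanForm",
  action: "submit",
  values: { borrower: "Acme Corp", amount: "250000" },
});
```

The key point is that the loop is closed: the same agent that generated the screen receives the events it produces, so it can regenerate or refine the UI in response.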
Benefits
- Single source of truth – Changes are made in the A2UI spec rather than in individual screens, ensuring consistency across the application.
- Reduced UI maintenance – Dynamic screens are generated on demand, lowering the burden on UX designers and UI developers.
- Regulatory compliance – UI components can be automatically rendered to meet standards (e.g., ISO 9241‑110) by embedding compliance rules in the ontology and A2UI spec.
- Scalability – Business‑wide updates (e.g., adding a new logo after an acquisition) propagate automatically through the spec, eliminating manual edits to thousands of forms.
- Improved productivity – Reusable components are built once and reused across multiple contexts, accelerating development cycles.
Example Use Case: Loan Approval
- Ontology layer: Defines entities such as Loan, Borrower, InterestRate, and Covenant.
- A2UI layer: Specifies that a loan‑approval form should render fields for each entity, display status messages in a branded panel, and enforce an ISO‑compliant layout.
- Agent workflow:
- Agent retrieves loan data from disparate source systems.
- Agent generates an A2UI‑compliant JSON payload describing the form layout and pre‑populated values.
- Renderer builds the interactive form in the chat interface.
- User submits the form; the event is sent back to the agent via AG‑UI for processing.
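Step two of that workflow (turning retrieved data into a form description with pre‑populated values) can be sketched as follows. The entity and field names are illustrative and not drawn from FIBO or the A2UI spec:

```typescript
// Sketch of the agent workflow: map retrieved loan data onto a form
// payload with pre-populated values. Names are illustrative only.

interface LoanRecord {            // data the agent retrieved upstream
  borrower: string;
  interestRate: number;
  covenant: string;
}

interface FormField {
  id: string;
  label: string;
  value: string;
}

// Turn the ontology-shaped record into a renderable form description.
function buildLoanForm(loan: LoanRecord): { form: string; fields: FormField[] } {
  return {
    form: "loanApproval",
    fields: [
      { id: "borrower", label: "Borrower", value: loan.borrower },
      { id: "interestRate", label: "Interest Rate (%)", value: loan.interestRate.toFixed(2) },
      { id: "covenant", label: "Covenant", value: loan.covenant },
    ],
  };
}

const formPayload = buildLoanForm({
  borrower: "Acme Corp",
  interestRate: 5.25,
  covenant: "Debt/EBITDA below 3.0",
});
// The renderer would consume JSON.stringify(formPayload) to build the form.
```

Because the mapping lives in one place, a change to the ontology (say, a new Covenant attribute) flows into every generated form without touching screen code.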
Future Outlook
As AI models continue to evolve, we can expect:
- Models that generate A2UI‑compliant screens natively, having learned the format during pre‑training.
- Tighter integration of compression formats like TOON to embed richer context.
- Broader adoption of the A2UI‑AG‑UI stack across industries seeking dynamic, ontology‑driven user experiences.