Gemini, Local and Free with Chrome and Angular
Source: Dev.to
Available APIs
- Language Detector API – Detects the language of a given text.
- Translator API – Translates text from one language to another.
- Prompt API – Sends free-form prompts to the on-device model and can return structured, schema-constrained output.
- Summarization API – Condenses long text into concise summaries.
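Before building anything on top of them, it helps to check whether the current device can actually run the models. The quick console sketch below assumes the global constructors and availability() methods described in Chrome's built-in AI docs; the APIs are still evolving, so treat the exact names as assumptions:

// Each check resolves to 'unavailable' | 'downloadable' | 'downloading' | 'available'
const status = {
  prompt: 'LanguageModel' in self ? await LanguageModel.availability() : 'unavailable',
  summarizer: 'Summarizer' in self ? await Summarizer.availability() : 'unavailable',
  languageDetector: 'LanguageDetector' in self ? await LanguageDetector.availability() : 'unavailable',
  translator: 'Translator' in self
    ? await Translator.availability({ sourceLanguage: 'en', targetLanguage: 'fr' })
    : 'unavailable',
};
console.table(status);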
In this post, I’ll show how you can take advantage of these APIs to add genuinely useful features without paying for tokens.
One thing I find particularly interesting is how Google Search now often gives you an AI‑generated summary or even a direct answer. Sometimes you just want the takeaway, not the entire article. That idea is exactly what we’re going to replicate here.
Google Search Summary feature
Goal: Build a small piece of logic that generates a TL;DR (too long; didn’t read) section in bullet points for a blog post, so readers can instantly understand what it’s about.
Getting Started
- Update Chrome – Make sure you are running the latest version. Older versions may not include the most recent on-device models.
- Enable the required flags and restart Chrome:

chrome://flags/#optimization-guide-on-device-model
chrome://flags/#prompt-api-for-gemini-nano-multimodal-input

- Install the models by running the code below in the DevTools console (or in a script that has access to the Chrome APIs):

const session = await LanguageModel.create({
  monitor(m) {
    m.addEventListener('downloadprogress', (e) => {
      console.log(`Downloaded ${e.loaded * 100}%`);
    });
  },
});

const summarizer = await Summarizer.create({
  monitor(m) {
    m.addEventListener('downloadprogress', (e) => {
      console.log(`Downloaded ${e.loaded * 100}%`);
    });
  },
});

When the download reaches 100%, you are ready to use the APIs.
Implementation
We’ll use Angular for the implementation. If you don’t already have it set up:
# Install the Angular CLI
npm install -g @angular/cli
# Create a new project (replace <name> with your desired name)
ng new <name>
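If you like, the CLI can also scaffold the service we will flesh out below and start the dev server (standard Angular CLI commands; the service name is just a suggestion):

# Scaffold the service used later in this post
ng generate service on-device-ai
# Start the dev server and open the app in the browser
ng serve --open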
UI Prompt (for Antigravity IDE)
Below is the prompt we fed to Google’s new IDE, Antigravity, to generate the UI component:
You are an expert Frontend Developer. Your task is to build a premium, visually stunning blog post page for an Angular application. The project is already initialized.
### Goal
Create a standalone component named `BlogPost` that serves as a static blog post page. The design should be modern, "dark mode" by default, and evoke a high‑tech, futuristic feel suitable for the topic of "Agentic AI".
### Structure & Content Constraints
The page must contain the following specific elements, stacked vertically:
1. **Header Section**
- **Tags**: A row of small pill‑shaped tags: "Artificial Intelligence", "Future", "Tech".
- **Title**: Large, impactful typography: "The Rise of Agentic AI: A New Era of Coding".
- **Subtitle**: A lighter sub‑heading: "How autonomous agents are transforming the software development landscape".
2. **TL;DR Section**
- Placed prominently below the header but before the main content.
- Must clearly stand out (e.g., border, different background tint, or accent color).
- **Heading**: "TL;DR".
- **Content**: A bulleted list summarizing the article (e.g., AI moving from autocomplete to autonomy, changing developer roles to architects).
3. **Main Content**
- Several paragraphs discussing "Agentic AI".
- Explain how it differs from traditional coding assistants.
- Discuss the shift from "writing code" to "guiding agents".
- Use highly readable typography with good line height and contrast.
### Design & Aesthetics (Crucial)
- **Theme**: Dark mode. Background very dark (nearly black), text light grey/white.
- **Typography**: Clean sans‑serif font like *Inter*.
- **Color Palette**: Neon/electric accents.
- *Primary Accent*: Electric teal or cyan (for tags/highlights).
- *Secondary Accent*: Electric purple (for the TL;DR section or links).
- **Visual Style**:
- Blog post container looks like a "card" floating in the center with subtle shadow and rounded corners.
- Subtle gradients for the title if possible.
- Fully responsive (mobile‑friendly).
### Technical Requirements
- Use Angular **Standalone Components**.
- Hardcode all text directly in the template or component class.
- **Do NOT** implement any actual AI calls or backend services; this is purely a UI implementation task.
Resulting UI Mockup
At this point, the focus is purely on UI. No AI calls yet.
Implementing the APIs
Create a new Angular service that will wrap the on‑device APIs. Below is a clean starter template:
import { Injectable } from '@angular/core';

// Types for the Chrome on-device APIs (replace with actual typings if available)
declare const LanguageModel: any;
declare const Summarizer: any;
declare const LanguageDetector: any;

@Injectable({
  providedIn: 'root',
})
export class OnDeviceAiService {
  private languageModel: any;
  private languageDetector: any;
  private summarizer: any;

  constructor() {
    this.initModels();
  }

  /** Load the on-device models */
  private async initModels(): Promise<void> {
    this.languageModel = await LanguageModel.create({
      monitor: (m: any) => {
        m.addEventListener('downloadprogress', (e: any) => {
          console.log(`Language model download: ${e.loaded * 100}%`);
        });
      },
    });

    // Language detection lives in its own API (LanguageDetector), not on LanguageModel
    this.languageDetector = await LanguageDetector.create();

    this.summarizer = await Summarizer.create({
      monitor: (m: any) => {
        m.addEventListener('downloadprogress', (e: any) => {
          console.log(`Summarizer download: ${e.loaded * 100}%`);
        });
      },
    });
  }

  /** Detect the language of a string (returns the most likely BCP 47 tag) */
  async detectLanguage(text: string): Promise<string> {
    if (!this.languageDetector) {
      await this.initModels();
    }
    const results = await this.languageDetector.detect(text);
    return results[0]?.detectedLanguage ?? 'und';
  }

  /** Summarize a block of text */
  async summarize(text: string): Promise<string> {
    if (!this.summarizer) {
      await this.initModels();
    }
    return this.summarizer.summarize(text);
  }
}
You can now inject OnDeviceAiService into any component (e.g., the BlogPost component) and call summarize() to generate the TL;DR bullet list.
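For instance, a minimal wiring could look like the sketch below (the component shape and field names are illustrative; the full version built in the second part of this post uses a dedicated AiService):

import { Component, inject, OnInit, signal } from '@angular/core';
import { OnDeviceAiService } from './on-device-ai.service'; // adjust the path to your project

@Component({
  selector: 'app-blog-post',
  standalone: true,
  template: `<p>{{ tldr() }}</p>`,
})
export class BlogPost implements OnInit {
  private readonly ai = inject(OnDeviceAiService);
  readonly tldr = signal('');

  async ngOnInit() {
    // The hardcoded article text from the UI step would go here
    const postBody = 'The Rise of Agentic AI: A New Era of Coding ...';
    this.tldr.set(await this.ai.summarize(postBody));
  }
}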
Next Steps
- Wire the service into the BlogPost component and display the generated bullets in the TL;DR section.
- Add error handling for cases where the model fails to load (see the sketch below).
- Experiment with the Prompt API for more structured outputs (e.g., JSON-formatted summaries).
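For the error-handling point, here is a hedged sketch of a defensive helper (the availability() method and its return values follow Chrome's docs; adapt as the API stabilizes):

declare const Summarizer: any;

/** Returns a summary, or null when the on-device model is unavailable or fails. */
export async function trySummarize(text: string): Promise<string | null> {
  try {
    const availability = await Summarizer.availability();
    if (availability === 'unavailable') {
      console.warn('On-device summarization is not supported on this device.');
      return null;
    }
    // 'downloadable' / 'downloading' states still resolve once the model finishes downloading
    const summarizer = await Summarizer.create();
    const summary = await summarizer.summarize(text);
    summarizer.destroy();
    return summary;
  } catch (err) {
    console.error('Summarization failed:', err);
    return null;
  }
}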
That’s it! You now have a zero‑cost, on‑device AI pipeline ready to enrich your web apps without burning through token budgets. Happy coding!
AI‑Powered TL;DR Generation with Chrome’s Summarizer & Prompt APIs
Below is a complete, cleaned‑up guide for creating a service that extracts key points from an article and renders them as a TL;DR list in an Angular component. The code uses Chrome’s Summarizer API (stable) and Prompt API (experimental) – both run locally, so no tokens or costs are incurred.
1. Create a Summarizer Session
The Summarizer extracts the most important points from a piece of text.
private async createSummarizerSession() {
return await Summarizer.create({
type: 'key-points', // Determines the summarisation style
length: 'short', // Short summary
expectedInputLanguages: ['en'], // Input language(s)
outputLanguage: 'en', // Output language
expectedContextLanguages: ['en'],
sharedContext: 'About AI and Agentic AI'
});
}
Note: The type field drives how the summary is generated. For a full list of options, see the official docs.
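If you want a different style of summary, the same create() call accepts other values; the sketch below uses option values listed in Chrome's Summarizer docs, which may still change while the API evolves:

const articleText = '...the full blog post body...'; // placeholder

const headlineSummarizer = await Summarizer.create({
  type: 'headline',     // alternatives to 'key-points' include 'tldr', 'teaser', 'headline'
  format: 'plain-text', // or 'markdown'
  length: 'medium',     // 'short' | 'medium' | 'long'
});
const headline = await headlineSummarizer.summarize(articleText);
headlineSummarizer.destroy();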
2. Create a Prompt (Language‑Model) Session
We’ll use the Prompt API to post-process the summarizer’s key points. The session below only sets a simple system prompt; in step 3 we additionally constrain its output to JSON with a schema so the result can be parsed reliably.
private async createLanguageModelSession() {
return await LanguageModel.create({
initialPrompts: [
{
role: 'system',
content: 'Convert these bullets into HTML. Return only the HTML.'
}
],
});
}
3. Combine Both Sessions
The function below ties everything together:
async generateTlDr(content: string): Promise<string[]> {
// 1️⃣ Summarise the article
const summarizer = await this.createSummarizerSession();
const summary = await summarizer.summarize(content, {
context: 'This article is intended for a tech‑savvy audience.',
});
summarizer.destroy();
// 2️⃣ Convert summary to structured output
const lm = await this.createLanguageModelSession();
const result = await lm.prompt(summary, { responseConstraint: schema });
lm.destroy();
// 3️⃣ Parse the JSON response
const parsed = JSON.parse(result);
return parsed?.items ?? [];
}
The schema (shown later) forces the Prompt API to return a JSON object with a type of "bullet_list" and an items array of strings.
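For illustration, a response that satisfies this constraint would parse into something like the object below (the bullet texts are made up):

// Hypothetical parsed response matching the schema
const exampleResponse = {
  type: 'bullet_list',
  items: [
    'AI assistants are moving from autocomplete suggestions to autonomous agents.',
    'Developers increasingly act as architects who guide agents instead of writing every line.',
  ],
};
// generateTlDr() would then return exampleResponse.items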
4. Full Service Code
import { Injectable } from '@angular/core';
declare const Summarizer: any;
declare const LanguageModel: any;
const schema = {
type: 'object',
required: ['type', 'items'],
properties: {
type: {
type: 'string',
enum: ['bullet_list'],
description: 'Identifies the content as a bullet list'
},
items: {
type: 'array',
minItems: 1,
items: {
type: 'string',
minLength: 1
},
description: 'Each entry is one bullet item, without bullet symbols'
}
},
additionalProperties: false
};
@Injectable({
providedIn: 'root'
})
export class AiService {
async generateTlDr(content: string): Promise<string[]> {
const summarizer = await this.createSummarizerSession();
const summary = await summarizer.summarize(content, {
context: 'This article is intended for a tech‑savvy audience.',
});
summarizer.destroy();
const lm = await this.createLanguageModelSession();
const result = await lm.prompt(summary, { responseConstraint: schema });
lm.destroy();
const parsed = JSON.parse(result);
return parsed?.items ?? [];
}
private async createSummarizerSession() {
return await Summarizer.create({
type: 'key-points',
length: 'short',
expectedInputLanguages: ['en'],
outputLanguage: 'en',
expectedContextLanguages: ['en'],
sharedContext: 'About AI and Agentic AI'
});
}
private async createLanguageModelSession() {
return await LanguageModel.create({
initialPrompts: [
{
role: 'system',
content: 'Convert these bullets into HTML. Return only the HTML.'
}
],
});
}
}
At this point the heavy lifting is done.
5. Wire the Service into a Component
Component TypeScript
import { Component, inject, OnInit, signal } from '@angular/core';
import { AiService } from './ai.service';
@Component({
selector: 'app-post',
templateUrl: './post.component.html',
styleUrls: ['./post.component.scss']
})
export class PostComponent implements OnInit {
private aiService = inject(AiService);
// `content` holds the hardcoded article text (imported or defined elsewhere in the component)
public readonly postContent = content;
public readonly tltrContent = signal<string[]>([]);
async ngOnInit() {
const tldr = await this.aiService.generateTlDr(this.postContent);
this.tltrContent.set(tldr);
}
}
Component Template (post.component.html)
<h3>TL;DR</h3>
@if (tltrContent().length > 0) {
<ul>
@for (item of tltrContent(); track $index) {
<li>{{ item }}</li>
}
</ul>
} @else {
Loading...
}
The TL;DR list will be rendered once the local AI finishes processing.
6. Result Preview
Below is a real example generated entirely on‑device (no external tokens, no cost).

7. Support & Stability
| API | Stability | Role in this example |
|---|---|---|
| Summarizer API | Stable | Generates the key‑point summary |
| Prompt API | Experimental | Formats bullets into HTML (or JSON) |

