I almost leaked an API key into ChatGPT, so I built a Chrome extension

Published: (March 10, 2026 at 04:36 PM EDT)
1 min read
Source: Dev.to

Introduction

I use AI chat tools a lot when coding and analyzing logs. A few days ago I almost pasted a real API key into ChatGPT while sharing some logs, but I noticed it just before sending.

The Problem

It’s extremely easy to accidentally leak sensitive data when using AI chats: logs, configs, and stack traces often contain real credentials, and the paste-and-send flow gives you no warning before they’re gone.

The Solution: PasteSafe

I built a small Chrome extension called PasteSafe. When you paste text into AI chats, it scans the content and detects things like:

  • API keys
  • Emails
  • Phone numbers
  • IBANs
  • UUIDs
  • URLs

If something sensitive is detected, PasteSafe automatically masks the values before the message is sent.
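The detect-then-mask step can be sketched with a few regexes. This is a minimal illustration, not PasteSafe's actual rules: the pattern names, the regexes, and the `[LABEL]` masking style here are all assumptions.

```javascript
// Hypothetical detection patterns -- real ones would be broader and more careful.
const PATTERNS = {
  apiKey: /\b(?:sk|pk|api|key)[-_][A-Za-z0-9]{16,}\b/g,
  email: /\b[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}\b/g,
  uuid: /\b[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}\b/gi,
};

// Replace every match with a placeholder so the value never reaches the chat.
function maskSensitive(text) {
  let masked = text;
  for (const [label, re] of Object.entries(PATTERNS)) {
    masked = masked.replace(re, `[${label.toUpperCase()}]`);
  }
  return masked;
}
```

For example, `maskSensitive("auth with sk-abcdefghijklmnop1234")` would yield `"auth with [APIKEY]"`.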

Supported AI Chats

  • ChatGPT
  • Claude
  • Gemini

How It Works

Everything runs locally in the browser:

  • No servers
  • No tracking
  • No data collection
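A local-only extension like this can hook the paste event from a content script, rewrite the text in place, and never touch the network. The sketch below is hypothetical (it is not PasteSafe's source); the simplistic email-only mask and the `execCommand`-based insertion are assumptions for illustration.

```javascript
// Placeholder masker: real logic would cover keys, IBANs, UUIDs, etc.
function maskSensitive(text) {
  return text.replace(/\b[\w.%+-]+@[\w.-]+\.[A-Za-z]{2,}\b/g, "[EMAIL]");
}

// Guarded so the snippet also loads outside a browser (e.g. in Node for testing).
if (typeof document !== "undefined") {
  document.addEventListener("paste", (event) => {
    const original = event.clipboardData.getData("text/plain");
    const masked = maskSensitive(original);
    if (masked !== original) {
      event.preventDefault(); // block the raw paste
      // Insert the masked text instead (deprecated but widely supported API).
      document.execCommand("insertText", false, masked);
    }
  });
}
```

Because everything happens inside the page's event handler, the original clipboard contents are never sent anywhere.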

Try It

PasteSafe – Chrome Extension

Source Code

GitHub repository

Discussion

Have you ever accidentally pasted something sensitive into an AI chat?
