How I Found $300,000 Worth of Secrets in a Download Button

Published: February 4, 2026 at 03:05 AM EST
5 min read
Source: Dev.to

Introduction

It started, as most disasters do, with mild curiosity and a free afternoon.

I downloaded an application. Not because I’m a hacker. Not because I’m conducting corporate espionage. Not because I have any idea what I’m doing. I downloaded it because I wanted to use it.
Revolutionary concept, I know.

The installer was a .exe file. For the uninitiated, this is the software equivalent of a wrapped gift. And like any gift from a stranger on the internet, I decided to unwrap it.

“What’s inside?” I wondered, the way a child wonders what’s inside a clock before destroying it with a hammer.

Every modern desktop application, it turns out, is just a website pretending to be software. It’s like finding out your “homemade” meal came from a freezer bag—technically real, philosophically disappointing.

This particular application was built with Electron, which means somewhere inside was a file called app.asar. Think of it as a zip file that really, really wants you to think it’s not a zip file.

npx asar extract app.asar ./unpacked
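
On most installs, app.asar sits in the application’s resources/ folder, and if you want a peek before committing to a full unpack, the same tool will list the contents:

npx asar list app.asar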

Inside was JavaScript—thousands of lines of minified, obfuscated code that looked like someone had sneezed on a keyboard and called it architecture. And there, sitting in the open like a wallet on a park bench, was a .env file.

For those blissfully unaware, a .env file is where developers store secrets: API keys, database credentials, the sort of things you absolutely, positively, under no circumstances should ship to production.
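
If you’ve never seen one, it’s just KEY=value pairs. A made-up example, with placeholders rather than anything I actually found:

# .env — hypothetical contents, for illustration only
API_KEY=sk_live_xxxxxxxxxxxxxxxx
DATABASE_URL=postgres://admin:hunter2@db.internal.example.com:5432/prod
ANALYTICS_TOKEN=xxxx-xxxx-xxxx-xxxx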

It’s Security 101. Literally. It’s the first thing they teach you:

🚨 Rule #1 of Software Development: Don’t commit your .env file.

This is not advanced knowledge. This is not arcane wisdom passed down through generations of security researchers. This is the “wash your hands after using the bathroom” of software development. And yet… there it was. Gleaming. Unencrypted. Full of credentials.
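
The fix, for the record, is one line in a .gitignore (plus a committed .env.example full of dummy values, if your teammates need to know the shape):

# .gitignore
.env
.env.*
!.env.example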

I won’t name names. I won’t point fingers. I’ll simply describe what I found, in the same way a nature documentary describes a lion eating a gazelle: with clinical detachment and mild horror.

Discovery

| Item | Severity | My Reaction |
| --- | --- | --- |
| API Keys | 🔴 Critical | Multiple. Active. Expensive. |
| Infrastructure URLs | 🔴 Critical | Internal endpoints. Very not public. |
| Service Credentials | 🟠 High | Analytics logging everything. |
| ML Inference Endpoints | 🟠 High | Cloud GPUs go brrrr on their dime. |

The total potential exposure? Let’s just say it was significant enough that I briefly considered a career change.

Now, here’s where it gets personal.

The engineer who shipped this? Based on industry averages, location, and the general state of the tech job market, they’re probably making around $300,000 a year.

Three. Hundred. Thousand. Dollars.

To do the software equivalent of leaving your house keys under the doormat—except the doormat is see‑through and you’ve put up a sign that says “KEYS UNDER HERE.”

I’m not bitter. I’m not bitter at all. I am simply noting, for the record, that I—a person of humble curiosity—managed to find this in approximately 45 minutes of casual investigation while eating leftover pizza. Meanwhile, somewhere, a senior software engineer is collecting stock options.

📍 Plot twist: The pizza was cold. The credentials were not.

Having found the obvious vulnerabilities, I did what any responsible researcher would do: I kept looking.

The JavaScript bundle was minified, but minification is obfuscation in the same way a trench coat is a disguise. It technically conceals things, but anyone who looks for more than five seconds can see what’s underneath. I found:

  • 🗂️ Source‑map hints pointing to internal repositories
  • 🐛 Debug symbols that should have been stripped
  • 📋 Hard‑coded configuration copy‑pasted from dev
  • 📡 gRPC definitions outlining the entire API structure

Each discovery was like opening a nesting doll, except instead of smaller dolls, it was smaller failures.
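
None of this required exotic tooling, either. Source maps, for instance, announce themselves by name, and one grep over the extracted bundle is enough to spot them:

grep -rn "sourceMappingURL" ./unpacked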

If you’ve made it this far, you might be expecting a dramatic conclusion: a confrontation with the company, a bug‑bounty payout, a heartfelt apology from a CEO. Instead, I’ll offer you something more valuable: a lesson.

Your build pipeline is not a security feature.
Electron apps are zip files with extra steps.
Minification is not encryption.

And for the love of all that is holy, check what you’re shipping before you ship it.

Quick “Before‑You‑Ship” Checklist

# Extract the asar archive
npx asar extract your-app.asar ./check-this

# Grep for obvious secrets
grep -r "API_KEY\|SECRET\|PASSWORD" ./check-this
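
And since the star of this story was a .env file, sweep for those too:

# Catch stray .env files before they ship
find ./check-this -name ".env*"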

That engineer you’re paying $300,000? Maybe budget $50 for a security audit. I’ll do it. I’m available. I have pizza.

Every application you download is a mystery box. The mystery is usually “how badly is my data being handled?” The answer is usually “badly.”

Disclaimers

  • I didn’t exploit anything.
  • I didn’t access systems I wasn’t supposed to.
  • I only looked at what was shipped to me, as a user, in an application I downloaded from the official website.

Everything I found was sitting in a package that anyone with fifteen minutes and a search engine could have extracted. The only sophisticated tools I used were npm and a vague sense of disbelief.

This article names no names. It points no fingers that haven’t already been pointed by the act of shipping credentials in a desktop application.

I still use the application. It’s actually quite good. I just use it with the quiet knowledge that somewhere, in a data centre, there’s a server running endpoints I wasn’t supposed to know about, processing requests through an API I could technically call, protected by credentials that are sitting in my Downloads folder.

The download button that started all this sits innocently on their website, cheerfully inviting users to install their app. Beneath it, there should probably be a disclaimer:

“By downloading this software, you agree to receive a free education in application security.”

If You Ship Electron Apps, Please Check

  • No .env files are included in the production bundle
  • No hard‑coded secrets (API keys, passwords, tokens) are present
  • No internal URLs are exposed
  • Source maps are stripped or private
  • Debug symbols are removed
  • All third‑party dependencies are up‑to‑date and vetted
  • You’ve actually inspected your .asar file
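
If you’d rather automate that last item than remember it, a rough release-pipeline guard along these lines can help. This is a sketch; dist/app.asar and the grep patterns are placeholders for whatever your build actually produces:

# Fail the build if the packaged bundle contains secret-looking strings
npx asar extract dist/app.asar /tmp/asar-audit
if grep -rE "API_KEY|SECRET|PASSWORD" /tmp/asar-audit; then
  echo "Possible secrets found in the production bundle" >&2
  exit 1
fi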

If you discovered credentials in your own app while reading this, you’re welcome.

If you’re the $300k engineer who shipped this… we should talk.

The author is a security researcher in the same way that someone who finds a wallet on the ground is a “detective.”

DMs are open. Pizza recommendations are welcome.
