Another ETL: Night Lift Tickets

Published: December 22, 2025 at 02:00 PM EST
4 min read
Source: Dev.to

Background

I live in Chicago, and one thing I like about winter is the chance to go snowboarding. I'm not very good at it, but I enjoy it. I usually end up opening multiple websites until I find a place to go. Whenever I catch myself doing a repetitive manual task (opening several sites with similar content for the same purpose), my brain goes straight to ETL.

I’m interested in night lift tickets because they have the best price, and I want to see the places on a map to find out how far they are from my home.

ETL Structure

All of my ETLs share the same structure:

  • Extract – fetch the HTML of the target page.
  • Transform – parse the HTML and extract the price.
  • Load – save the result to a JSON file.

The extract and load steps are identical for every ski resort, so they live in a shared common.js file. The only custom part is the transform function, which uses a CSS selector to locate the price in the HTML.

Transform (Node.js)

// transform.js
const cheerio = require("cheerio");
const { extract, load, loggerInfo } = require("./common");

async function transform(html) {
  const $ = cheerio.load(html);

  // The night-ticket price lives in the third <li> of this section
  const price = $(
    $(".seasoncontainer.liftcontainer.clsDaynight.clsNight li")[2]
  )
    .find("p")
    .text()
    .replace("$", "");

  return {
    // parseInt keeps the whole-dollar amount (any cents are dropped)
    price: parseInt(price, 10),
  };
}

async function main() {
  const place = {
    id: "cascademountain",
    website: "https://www.cascademountain.com",
    name: "Cascade Mountain",
    lat: 43.502728,
    lng: -89.515996,
    gmaps: "https://maps.app.goo.gl/YWdnQvZiJZwPhj79A",
    url: "https://www.cascademountain.com/lift-tickets/",
  };

  loggerInfo("etl start", { id: place.id });

  const html = await extract(place.url);
  const data = await transform(html);
  await load(place, data);

  loggerInfo("etl done", { id: place.id });
}

main().catch((err) => {
  console.error(err);
  process.exitCode = 1;
});

The CSS selector (.seasoncontainer.liftcontainer.clsDaynight.clsNight li) works for the current site, but if the classes change, the scraper will break. In production you'd add an alert for extraction failures, or even use an AI model (e.g., Gemini) to locate the price.
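As a minimal guard against selector drift, the transform result can be checked before loading. This is a sketch, not part of the original code; `validatePrice` is a hypothetical helper:

```javascript
// Sketch: flag a broken selector before loading (validatePrice is hypothetical)
function validatePrice(price, id) {
  if (!Number.isFinite(price) || price <= 0) {
    // In production this would trigger an alert rather than just a log line
    console.warn(`price extraction failed for ${id}`);
    return null;
  }
  return price;
}
```

Calling it right after `transform` lets the ETL skip the load step (and page someone) when the page layout changes.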

Extract Function

// common.js (excerpt)
async function extract(url) {
  loggerInfo("extracting", { url });

  const response = await fetch(url);
  const html = await response.text();
  return html;
}

A variation of this function uses Puppeteer for sites that block simple fetch requests.
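Plain `fetch` extraction can also fail transiently (timeouts, rate limits). A small retry wrapper keeps the ETL resilient; this is a sketch under my own assumptions, and `withRetry` is not part of the post's `common.js`:

```javascript
// Sketch: retry transient extract failures (withRetry is a hypothetical helper)
async function withRetry(fn, attempts = 3, delayMs = 500) {
  let lastErr;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastErr = err;
      // back off briefly before the next attempt
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
  throw lastErr;
}

// Usage: const html = await withRetry(() => extract(place.url));
```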

Load Function

// common.js (excerpt)
const fs = require("fs").promises;

async function load(place, extraData) {
  const data = { ...place, ...extraData };
  loggerInfo("load", data);

  const filename = `public/sites/${data.id}.json`;
  await fs.writeFile(filename, JSON.stringify(data, null, 2));

  loggerInfo("saved", { filename });
}

For now the data is stored as a JSON file in public/sites/. In a production environment a database such as DynamoDB would be more appropriate.

Logging

// common.js (excerpt)
const loggerInfo = (...args) => {
  console.log(...args);
};

loggerInfo is a thin wrapper around console.log. In a real application you could replace it with New Relic, Datadog, or another logging service.
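A cheap middle step before adopting a full logging service is emitting structured JSON lines, which those services can ingest directly. This is a sketch of what the wrapper could become, not what the post implements:

```javascript
// Sketch: structured JSON logging (an upgrade path, not the post's loggerInfo)
const loggerInfo = (message, meta = {}) => {
  console.log(
    JSON.stringify({ level: "info", time: new Date().toISOString(), message, ...meta })
  );
};
```

The call sites (`loggerInfo("etl start", { id: place.id })`) stay unchanged.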

Front‑end with Next.js

After the ETL runs and the per-resort files are aggregated into all-places.json, the next step is to build a simple site that displays the resorts on a map.

Prompt Used with Copilot

On this Next.js project, build a page with the following requirements:
- Only one page.
- The page should show a map using Google Maps.
- The map should show a marker for each place found in `all-places.json`, displaying the price.
- The page should have Google Analytics.
- The page should let the user click a marker and open a small card with the place information: name, link to Google Maps, and link to the `url` found in the JSON.
- The UI should be simple and use inline styles.

Copilot generates a starter implementation, which can then be tweaked as needed.

Conclusion

The ETL extracts night‑lift ticket prices, stores them in JSON files, and feeds a minimal Next.js front‑end that visualizes the data on a Google Map. The approach works for a small, static list of ski resorts; scaling up would involve automating the place list (e.g., via Google Maps APIs) and moving the data store to a proper database.

Feel free to check the site and let me know what you think!
