My attempt at the Cloud Resume Challenge in 2026
Source: Dev.to
Table of Contents
- About the project
- My suggestions before getting started
- How much time I spent
- Reflection
- Project Architecture
- Lambda Function
- DynamoDB
- API Gateways
- API Throttling
- Git Repo
- More to come
- Development Observations
- Terraform (IaC)
- Security
- Git Repositories
- What’s Next
About the project
Cloud Resume Challenge is a project designed to help you get started in the cloud field if you aspire to become a Cloud/DevOps engineer. The challenge emphasizes real‑world experience, and the end goal is to build something you will actually use—not a toy project.
The project itself is a resume website with a page‑view counter.
You can learn more on the official site and also here, where the creator explains the motivation and goals behind the challenge in more detail.
My suggestions before getting started
If you have limited time and want some practical advice before starting, here are my two cents:
- Skip the HTML/CSS part – it doesn’t really matter what the site looks like before you put it on your resume. The real value lies in the cloud architecture and implementation.
- Start with the web console if you’re not very familiar with AWS (or the cloud provider you choose) instead of jumping straight to Infrastructure as Code (IaC). This lets you move faster and get a working demo early. When you later switch to IaC, you’ll better understand which configurations were implicitly handled by the console.
- Be conservative with the time you spend on each topic. There’s a lot to cover in this project, and there’s always more to explore. Focus on understanding the overall architecture rather than getting stuck in low‑level details.
How much time I spent
I was a frontend engineer transitioning into DevOps. I spent less than a week on this challenge, completing all the requirements listed on the challenge website.
To be honest, the absolute amount of time spent is meaningless. It correlates with how deep someone wants to go and how comprehensive their solution is. Instead, here’s the rough ratio of time I spent on each part:
| Area | Approx. % of total time |
|---|---|
| HTML/CSS | 0–1 % (used Next.js for a static site) |
| Web console setup | ~30 % (creating a fully working demo, learning API Gateway, Lambda, etc.) |
| IaC + CI/CD | 50–60 % (trial‑and‑error, hidden console configurations) |
| Blog writing / reflection | ~10 % (the most important part, in my opinion) |
Reflection
Spoiler alert: Please stop reading if you haven’t completed the challenge yet, as I’m about to reveal my solution.
Project Architecture
The domain name (janice-zhong.com) resolves to a CloudFront distribution, which points to an S3 bucket with static hosting. The S3 bucket serves the website content, which makes calls to API Gateways that trigger Lambda functions to read from and write into DynamoDB.
Two TLS certificates are provisioned:
- In us‑east‑1 for CloudFront
- In ap-southeast-2 for the API Gateways

Lambda Function
Lambda is a managed compute service that lets you run functions on AWS without managing servers. Important points:
- Idempotency – invoking a Lambda multiple times with the same event should produce the same result.
- Statelessness – concurrent invocations start with a fresh environment; you should assume no state persists between invocations.
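As a sketch of what these two properties mean in practice, here is a hypothetical handler for the view counter: the response depends only on the incoming event and on data read from DynamoDB, never on variables left over from a previous invocation. The `table` parameter is an assumption for illustration (in a real deployment it would be a boto3 DynamoDB Table resource); it is injected here to keep the example self-contained.

```python
def handler(event, context=None, table=None):
    """Hypothetical page-view handler.

    Stateless: nothing is cached in module-level variables, so running
    in a fresh execution environment changes nothing. All durable
    state lives in DynamoDB, not in the function.
    """
    page = event.get("page", "index")
    resp = table.get_item(Key={"page": page})
    views = int(resp.get("Item", {}).get("views", 0))
    return {"statusCode": 200, "body": str(views)}
```

Because the handler is a pure function of its inputs, invoking it twice with the same event yields the same response, which is exactly the idempotency property above.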
DynamoDB
Serial writes at the item level
My first mistake was assuming that concurrent Lambda functions would cause race conditions when updating the same record. I initially inserted a new record for each page visit and then counted the total number of records. This quickly became problematic because Scan operations are O(n), which is inefficient for a simple page‑view counter.
What I learned: DynamoDB provides serial writes at the item level by default, so you can safely increment a counter on a single item.
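A minimal sketch of that pattern using boto3's `update_item` with an `ADD` expression, which DynamoDB applies atomically on the server side. The key schema (`page`) and attribute name (`views`) are assumptions; in real code `table` would come from `boto3.resource("dynamodb").Table(...)`.

```python
def increment_views(table, page_id):
    """Atomically bump the view counter for a single item.

    `table` is assumed to be a boto3 DynamoDB Table resource. ADD is
    applied atomically server-side, so concurrent Lambda invocations
    cannot lose updates; no Scan or read-modify-write is needed.
    """
    resp = table.update_item(
        Key={"page": page_id},
        UpdateExpression="ADD views :one",
        ExpressionAttributeValues={":one": 1},
        ReturnValues="UPDATED_NEW",
    )
    return int(resp["Attributes"]["views"])
```

This replaces the insert-then-Scan approach entirely: one item, one atomic write, O(1) instead of O(n).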
Prevent throttled concurrent writes via client retries
DynamoDB measures capacity in RCUs (Read Capacity Units) and WCUs (Write Capacity Units). A newly created on-demand table can serve up to 4,000 write request units per second; exceed that and DynamoDB returns a ProvisionedThroughputExceededException. Handling this exception with exponential back-off retries on the client side mitigates throttling.
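Note that boto3's built-in retry modes already retry throttled calls for you; the sketch below only illustrates the underlying back-off pattern, using a stand-in exception type rather than the real botocore error.

```python
import random
import time


class ThrottledError(Exception):
    """Stand-in for DynamoDB's ProvisionedThroughputExceededException."""


def with_backoff(fn, max_attempts=5, base_delay=0.05):
    """Retry fn with exponential back-off plus jitter.

    Delay grows as base_delay * 2**attempt, scaled by random jitter so
    that many throttled clients do not all retry at the same instant.
    """
    for attempt in range(max_attempts):
        try:
            return fn()
        except ThrottledError:
            if attempt == max_attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt) * random.random())
```

In real code you would catch `botocore.exceptions.ClientError` and inspect its error code instead of a custom exception class.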
API Gateways
I’ll omit many details to keep this post concise, but one key topic worth highlighting is throttling in API Gateways.
API Throttling
Throttling protects the Lambda backend (and your AWS bill) from traffic spikes. I cover the limits, burst capacity, and my stage-level configuration under Development Observations below.
Git Repo
Link to the repository (replace # with the actual URL)
More to come
Stay tuned for deeper dives into:
- CI/CD pipelines
- Monitoring & alerting
- Cost optimisation
Development Observations
During development I noticed a large amount of bot traffic hitting my CloudFront distribution. This prompted me to set up throttling in API Gateway.
API Gateway Throttling
API Gateway throttling can be applied at three levels:
| Level | Description |
|---|---|
| Account (default) | Global defaults for the whole account |
| Stage | Applied to a specific deployment stage |
| Route | Fine‑grained per‑method limits |
I applied throttling at the stage level and configured:
- Burst: 50 requests
- Rate: 200 requests/second
For more advanced per‑IP throttling, AWS WAF can be used.
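API Gateway documents its throttling as a token-bucket scheme: the rate refills tokens each second and the burst is the bucket capacity, so short spikes up to the burst size pass while sustained traffic is capped at the rate. A toy model of that behavior (timestamps are passed in explicitly to keep it deterministic):

```python
class TokenBucket:
    """Rough model of API Gateway stage throttling.

    `rate` tokens are refilled per second up to `burst` capacity;
    each request spends one token or is rejected (HTTP 429).
    """

    def __init__(self, rate, burst):
        self.rate = rate
        self.burst = burst
        self.tokens = float(burst)
        self.last = 0.0

    def allow(self, now):
        # Refill based on elapsed time, capped at the burst capacity.
        self.tokens = min(self.burst, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False
```

With my settings (burst 50, rate 200), up to 50 requests can arrive back-to-back, after which clients are limited to roughly 200 requests per second.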
Terraform (IaC)
Remote State + Locking
- State files should never be committed to version control to avoid credential leaks and state corruption.
- Initially I stored the state in an S3 bucket, then migrated to HCP Terraform for better management.
Always Pin Versions
Pin provider and module versions to prevent unexpected breaking changes from upstream updates.
Use HCP Terraform for Automated Terraforming (CI/CD)
- HCP Terraform provides audit logs and manual-approval workflows out of the box; replicating them with GitHub Actions would require extra configuration.
- This significantly reduces operational overhead for Terraform‑based CI/CD pipelines.
Security
OIDC
OpenID Connect (OIDC) allows GitHub Actions workflows to assume an IAM role in your AWS account and obtain short‑lived credentials.
- Eliminates long‑lived access keys stored in GitHub secrets.
- Reduces the operational burden of credential rotation.
TLS
TLS encrypts traffic between clients and servers.
- Two TLS certificates are provisioned:
- us‑east‑1 – for CloudFront
ap-southeast-2 – for the API Gateways
Adding a Human‑Verification Process
The high volume of bot traffic raised concerns that legitimate users might be blocked by the throttling limits. Two options are available:
| Option | Description | Cost |
|---|---|---|
| CloudFront + AWS WAF | Native AWS solution; includes a managed bot‑control rule group that processes the first 1 million web requests per month and handles CAPTCHA/challenge attempts. | $10 / month (prorated hourly) |
| Cloudflare Turnstile | Free alternative that provides the same human‑verification functionality. | Free |
A tutorial for the AWS WAF approach can be found here.
Git Repositories
- Frontend:
- IaC:
What’s Next
I have the basic workflows running and will soon tackle the Cloud Resume Challenge book to implement optional add‑on challenges.
Stay tuned!
