🪣 AWS 123: Data in Motion - Migrating S3 Buckets via AWS CLI

Published: January 2, 2026 at 10:52 PM EST
3 min read
Source: Dev.to


🔄 Efficient Data Migration: S3 Sync Strategies

Hey Cloud Builders 👋

Welcome to Day 23 of the #100DaysOfCloud Challenge!

Today, the Nautilus DevOps team is tackling a high‑stakes data migration. We need to move a substantial amount of data from an old bucket to a brand‑new one while ensuring 100% data consistency using the power of the AWS CLI.

[Image: Data Migration Diagram]

By using the sync command instead of a simple cp (copy), we can ensure that our migration is both fast and accurate.

🎯 Objective

  • Create a new private S3 bucket named devops-sync-19208.
  • Migrate all data from devops-s3-12582 to the new bucket.
  • Verify that both buckets are perfectly synchronized.

All actions must be performed exclusively via the AWS CLI.

💡 Why S3 Sync Over Copy?

While aws s3 cp is great for single files, aws s3 sync is the professional choice for migrations because it:

  • Recursively copies new and updated files.
  • Compares file sizes and modification times to avoid redundant transfers.
  • Is idempotent – running it multiple times only copies what has changed.
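You can see the sync decision logic before committing to a real transfer. A minimal sketch using this exercise's bucket names: the `--dryrun` flag prints every operation sync *would* perform without copying anything (running it requires live AWS credentials, so treat this as illustrative):

```shell
# Preview the migration: lists each copy sync would perform, moves no data.
aws s3 sync s3://devops-s3-12582 s3://devops-sync-19208 --dryrun

# Re-run after the real sync: empty dry-run output means nothing is left
# to copy — idempotency in action.
```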

🔹 Key Concepts

  • Sync Command – Recursively copies new and updated files from the source to the destination, skipping unchanged objects.
  • Private by Default – New buckets should remain private unless a specific public access requirement exists.
  • Data Integrity – After migration, use listing commands to ensure object counts and total sizes match.
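The "Private by Default" point can also be enforced explicitly. Newer buckets ship with Public Access Block enabled, but setting it yourself documents intent; a sketch using this exercise's bucket name:

```shell
# Explicitly block all public access on the new bucket: these four
# settings disable public ACLs and public bucket policies.
aws s3api put-public-access-block \
  --bucket devops-sync-19208 \
  --public-access-block-configuration \
      BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true
```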

🛠️ Step‑by‑Step: S3 Data Migration

We’ll move logically from Creation → Migration → Verification.

🔹 Phase A: Create the New S3 Bucket

aws s3 mb s3://devops-sync-19208 --region us-east-1

🔹 Phase B: Migrate Data Using Sync

aws s3 sync s3://devops-s3-12582 s3://devops-sync-19208

🔹 Phase C: Verify Data Consistency

Check Source:

aws s3 ls s3://devops-s3-12582 --recursive --human-readable --summarize

Check Destination:

aws s3 ls s3://devops-sync-19208 --recursive --human-readable --summarize

✅ Verify Success

If the “Total Objects” and “Total Size” values match in both command outputs, mission accomplished! Your data has been migrated without any loss.
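Eyeballing the two summaries works, but the comparison is easy to script. A minimal sketch: the `SRC_SUMMARY` and `DST_SUMMARY` variables below stand in for captured output — in a real run you would fill them with something like `$(aws s3 ls s3://devops-s3-12582 --recursive --summarize | tail -2)`:

```shell
#!/bin/sh
# Stand-in summaries; in practice, capture the last two lines of each
# `aws s3 ls --recursive --summarize` run.
SRC_SUMMARY="Total Objects: 42
   Total Size: 1.2 GiB"
DST_SUMMARY="Total Objects: 42
   Total Size: 1.2 GiB"

# Extract object count and total size from each summary.
src_objects=$(printf '%s\n' "$SRC_SUMMARY" | awk '/Total Objects/ {print $3}')
dst_objects=$(printf '%s\n' "$DST_SUMMARY" | awk '/Total Objects/ {print $3}')
src_size=$(printf '%s\n' "$SRC_SUMMARY" | awk '/Total Size/ {print $3, $4}')
dst_size=$(printf '%s\n' "$DST_SUMMARY" | awk '/Total Size/ {print $3, $4}')

if [ "$src_objects" = "$dst_objects" ] && [ "$src_size" = "$dst_size" ]; then
  echo "MATCH: $src_objects objects, $src_size"
else
  echo "MISMATCH: source=$src_objects/$src_size dest=$dst_objects/$dst_size"
fi
```

The same pattern drops straight into a CI job: a non-matching summary can `exit 1` and fail the pipeline.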

📝 Key Takeaways

  • sync is idempotent – you can run it repeatedly; only changed files are transferred.
  • Permissions – Ensure the CLI user has s3:ListBucket and s3:GetObject on the source, and s3:PutObject on the destination.
  • Cross‑Region – Buckets in different AWS regions can be synced; the CLI resolves each bucket’s region automatically, and a --source-region flag exists if resolution fails.
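The permissions bullet above can be made concrete as an IAM policy. A minimal sketch using this exercise's bucket names (note that sync also needs s3:ListBucket on the destination so it can compare what is already there):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ReadSource",
      "Effect": "Allow",
      "Action": ["s3:ListBucket", "s3:GetObject"],
      "Resource": [
        "arn:aws:s3:::devops-s3-12582",
        "arn:aws:s3:::devops-s3-12582/*"
      ]
    },
    {
      "Sid": "WriteDestination",
      "Effect": "Allow",
      "Action": ["s3:ListBucket", "s3:PutObject"],
      "Resource": [
        "arn:aws:s3:::devops-sync-19208",
        "arn:aws:s3:::devops-sync-19208/*"
      ]
    }
  ]
}
```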

🚫 Common Mistakes

  • Missing the s3:// prefix – Always include it before the bucket name.
  • Trailing slashes – Be careful with trailing slashes on S3 paths; s3://bucket/prefix and s3://bucket/prefix/ can nest the copied objects differently at the destination.
  • Bucket name uniqueness – S3 bucket names must be globally unique.

🌟 Final Thoughts

You’ve just executed a fundamental DevOps task: Data Reliability. Mastering the AWS CLI for S3 enables you to automate backups, website deployments, and large‑scale data transfers with a single command. This skill is essential for:

  • Disaster Recovery (DR) setups
  • Moving from development to production environments
  • Periodic data archival
