🪣 AWS 123: Data in Motion - Migrating S3 Buckets via AWS CLI

🔄 Efficient Data Migration: S3 Sync Strategies
Hey Cloud Builders 👋
Welcome to Day 23 of the #100DaysOfCloud Challenge!
Today, the Nautilus DevOps team is tackling a high‑stakes data migration. We need to move a substantial amount of data from an old bucket to a brand‑new one while ensuring 100% data consistency using the power of the AWS CLI.

By using the sync command instead of a simple cp (copy), we can ensure that our migration is both fast and accurate.
🎯 Objective
- Create a new private S3 bucket named devops-sync-19208.
- Migrate all data from devops-s3-12582 to the new bucket.
- Verify that both buckets are perfectly synchronized.
All actions must be performed exclusively via the AWS CLI.
💡 Why S3 Sync Over Copy?
While aws s3 cp is great for single files, aws s3 sync is the professional choice for migrations because it:
- Recursively copies new and updated files.
- Compares file sizes and modification times to avoid redundant transfers.
- Is idempotent – running it multiple times only copies what has changed.
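A quick way to see that idempotency in practice is to run the sync a second time with --dryrun: the first pass copies everything, while the preview of the second pass should list nothing because sizes and timestamps already match (a small sketch using this task's bucket names):
# First pass: copies every object from the source bucket
aws s3 sync s3://devops-s3-12582 s3://devops-sync-19208
# Second pass (preview only): nothing left to transfer
aws s3 sync s3://devops-s3-12582 s3://devops-sync-19208 --dryrun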
🔹 Key Concepts
- Sync Command – Recursively copies new and updated files from the source to the destination, skipping unchanged objects.
- Private by Default – New buckets should remain private unless a specific public access requirement exists.
- Data Integrity – After migration, use listing commands to ensure object counts and total sizes match.
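If you want to make the "private by default" posture explicit, the bucket‑level Block Public Access settings can be applied straight from the CLI (a sketch for the new bucket; newly created buckets generally ship with these enabled already, so this simply documents the intent):
aws s3api put-public-access-block --bucket devops-sync-19208 --public-access-block-configuration BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true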
🛠️ Step‑by‑Step: S3 Data Migration
We’ll move logically from Creation → Migration → Verification.
🔹 Phase A: Create the New S3 Bucket
aws s3 mb s3://devops-sync-19208 --region us-east-1
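Before migrating, you can confirm the bucket actually exists and is reachable (head-bucket exits non‑zero if the bucket is missing or you lack access, which makes it a handy guard in scripts):
aws s3api head-bucket --bucket devops-sync-19208
aws s3 ls | grep devops-sync-19208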
🔹 Phase B: Migrate Data Using Sync
aws s3 sync s3://devops-s3-12582 s3://devops-sync-19208
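Two optional flags worth knowing for larger migrations (a sketch, not required for this task): --only-show-errors keeps the output readable on big transfers, and --delete turns the destination into an exact mirror by removing objects that no longer exist in the source, so use it deliberately.
aws s3 sync s3://devops-s3-12582 s3://devops-sync-19208 --only-show-errors
aws s3 sync s3://devops-s3-12582 s3://devops-sync-19208 --delete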
🔹 Phase C: Verify Data Consistency
Check Source:
aws s3 ls s3://devops-s3-12582 --recursive --human-readable --summarize
Check Destination:
aws s3 ls s3://devops-sync-19208 --recursive --human-readable --summarize
✅ Verify Success
If the “Total Objects” and “Total Size” values match in both command outputs, mission accomplished! Your data has been migrated without any loss.
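If you prefer a scripted check over eyeballing the two summaries, the same numbers can be pulled with a JMESPath query (a rough sketch that assumes neither bucket is empty; it prints the object count and total size in bytes for each bucket):
aws s3api list-objects-v2 --bucket devops-s3-12582 --query '[length(Contents), sum(Contents[].Size)]' --output text
aws s3api list-objects-v2 --bucket devops-sync-19208 --query '[length(Contents), sum(Contents[].Size)]' --output text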
📝 Key Takeaways
- sync is idempotent – you can run it repeatedly; only changed files are transferred.
- Permissions – Ensure the CLI user has s3:ListBucket and s3:GetObject on the source, and s3:PutObject on the destination.
- Cross‑Region – Buckets in different AWS regions can be synced without extra configuration (see the example below).
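For the cross‑region case, the CLI normally resolves each bucket's region on its own, but you can also state the regions explicitly (a sketch assuming a hypothetical source in eu-west-1 and a destination in us-east-1):
aws s3 sync s3://devops-s3-12582 s3://devops-sync-19208 --source-region eu-west-1 --region us-east-1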
🚫 Common Mistakes
- Missing the s3:// prefix – Always include it before the bucket name.
- Trailing slashes – Be careful with slashes at the end of bucket and prefix paths; they can affect folder nesting (see the preview example after this list).
- Bucket name uniqueness – S3 bucket names must be globally unique.
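On the trailing‑slash point: sync also works on a single prefix rather than a whole bucket, and a --dryrun preview is the safest way to see exactly which keys would be written before committing (logs/ here is a hypothetical prefix, not part of the task):
aws s3 sync s3://devops-s3-12582/logs/ s3://devops-sync-19208/logs/ --dryrun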
🌟 Final Thoughts
You’ve just executed a fundamental DevOps task: Data Reliability. Mastering the AWS CLI for S3 enables you to automate backups, website deployments, and large‑scale data transfers with a single line of code. This skill is essential for:
- Disaster Recovery (DR) setups
- Moving from development to production environments
- Periodic data archival