
Backup Your Files! 10x Speed Upload to AWS S3 (Tested 2Gbps)

Super easy guide to backing up files to AWS S3 Deep Archive with optimized multipart upload speeds.

November 14, 2025
Cyrus 365
1 min read
AWS · S3 · Backup · CLI

This guide covers how to upload both single files and entire folders to AWS S3 using the AWS CLI. We'll focus on sending them directly to the DEEP_ARCHIVE storage class for inexpensive, long-term backup.

Step 1: Prerequisites ⚙️

This guide assumes you have already installed and configured the AWS CLI. If you haven't, please complete our first guide:

➡️ How to Install & Configure the AWS CLI
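Before continuing, it's worth a quick sanity check that the CLI is installed and your credentials work. Both commands below are standard AWS CLI calls; the exact output depends on your installation and account:

```bash
# Confirm the CLI is installed and on your PATH
aws --version

# Confirm your credentials are valid; prints your account ID and user ARN
aws sts get-caller-identity
```

If the second command returns an access error, re-run `aws configure` before moving on.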

Step 2: Optimize Upload Speed (Optional but Recommended) ⚡

For large files (e.g., videos, large archives), you can increase speed by configuring multipart uploads. This allows the CLI to upload multiple parts of a file in parallel.

```bash
aws configure set default.s3.max_concurrent_requests 100
aws configure set default.s3.multipart_chunksize 64MB
```

The first setting raises the number of parallel upload requests (the CLI default is 10); the second sets each uploaded part to 64 MB. This is a one-time configuration that will speed up all subsequent large uploads. 🚀
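To see how these two settings work together, you can sketch how a file splits into parts. As a rough illustration (plain shell arithmetic, no AWS calls), a 10 GB file with a 64 MB chunk size yields 160 parts, and with 100 concurrent requests the CLI can keep a large share of them in flight at once:

```bash
# Rough part-count calculation for a multipart upload (illustration only)
file_size=$((10 * 1024 * 1024 * 1024))   # 10 GB in bytes
chunk_size=$((64 * 1024 * 1024))         # 64 MB in bytes

# Ceiling division: number of parts the CLI would create
parts=$(( (file_size + chunk_size - 1) / chunk_size ))
echo "parts: $parts"   # → parts: 160
```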

Step 3: Upload a Single File ⬆️

Use the aws s3 cp (copy) command to upload files. For example, to upload my-backup.zip from your Desktop to a bucket named my-super-safe-bucket and send it directly to DEEP_ARCHIVE:

```bash
aws s3 cp "C:\Users\YourName\Desktop\my-backup.zip" s3://my-super-safe-bucket/ --storage-class DEEP_ARCHIVE
```
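Once the upload finishes, you can double-check that the object landed in the right storage class. `head-object` is a standard `s3api` command; the bucket and key below match the example above:

```bash
# Inspect the uploaded object's metadata; expected to print DEEP_ARCHIVE
aws s3api head-object --bucket my-super-safe-bucket --key my-backup.zip --query StorageClass --output text
```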

Step 4: Upload an Entire Folder 📂

To back up an entire folder (e.g., MyPhotos), use the same cp command with the --recursive flag:

```bash
aws s3 cp "C:\Users\YourName\Documents\MyPhotos" s3://my-super-safe-bucket/MyPhotos/ --recursive --storage-class DEEP_ARCHIVE
```
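Recursive uploads can touch a lot of files, so it's worth a preview first. The --dryrun flag (a standard aws s3 option) lists what would be uploaded without transferring anything:

```bash
# Preview the recursive upload; nothing is actually transferred
aws s3 cp "C:\Users\YourName\Documents\MyPhotos" s3://my-super-safe-bucket/MyPhotos/ --recursive --storage-class DEEP_ARCHIVE --dryrun
```

When the list looks right, re-run the command without --dryrun to perform the real upload.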

Step 5: Verify the Upload ✅

Use the aws s3 ls (list) command to confirm that everything arrived:

```bash
aws s3 ls s3://my-super-safe-bucket/ --recursive
```

Your newly-uploaded files and folders will appear in the list. ✨
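If you want friendly sizes and a total instead of raw keys, the same ls command accepts --human-readable and --summarize (both standard flags):

```bash
# Same listing with readable sizes, plus a total object count and size
aws s3 ls s3://my-super-safe-bucket/ --recursive --human-readable --summarize
```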