AWS S3 vs Google Cloud Storage

Complete command reference with your specific paths

Path Reference
Ubuntu: /var/www/mainweb.com/public/upload/
Windows: C:\Users\akashkumar\Downloads\images
AWS S3: s3://u18-evolute-library/upload/
Google Cloud: gs://u18-evolute-library/upload/

Setup & Configuration

Amazon S3 (AWS CLI)
Installation & Configuration
# Ubuntu installation
$ sudo apt update
$ sudo apt install awscli -y
$ aws --version
# Configure AWS CLI
$ aws configure
AWS Access Key ID [None]: YOUR_ACCESS_KEY
AWS Secret Access Key [None]: YOUR_SECRET_KEY
Default region name [None]: ap-south-1
Default output format [None]: json
# Verify configuration
$ aws sts get-caller-identity
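If the credentials are valid, the command prints the caller's identity as JSON (the values below are placeholders):
{
    "UserId": "AIDAEXAMPLEUSERID",
    "Account": "123456789012",
    "Arn": "arn:aws:iam::123456789012:user/your-user"
}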
Google Cloud Storage (gsutil)
Installation & Configuration
# Install Google Cloud SDK (interactive installer)
$ curl https://sdk.cloud.google.com | bash
$ exec -l $SHELL
# OR install from the apt repository
$ sudo apt install google-cloud-cli -y
# Verify installation
$ gcloud --version
# Configure gcloud and gsutil
$ gcloud auth login
$ gcloud projects list
$ gcloud config set project PROJECT_ID
# Verify configuration
$ gsutil ls

Basic Operations

Basic Bucket Operations (AWS CLI)
# Create bucket
$ aws s3 mb s3://u18-evolute-library --region ap-south-1
Create a new bucket in a specific region
# List buckets
$ aws s3 ls
List all buckets in your account
# List bucket contents
$ aws s3 ls s3://u18-evolute-library/
List objects in a bucket
# Delete empty bucket
$ aws s3 rb s3://u18-evolute-library
Remove an empty bucket
$ aws s3 rb s3://bucket-name --force
rb fails if the bucket is not empty; add --force to delete the bucket together with all of its contents
Basic Bucket Operations (gsutil)
# Create bucket
$ gsutil mb -l asia-south1 gs://u18-evolute-library
Create a new bucket in a specific location
# List buckets
$ gsutil ls
List all buckets in your project
# List bucket contents
$ gsutil ls gs://u18-evolute-library/
List objects in a bucket
# Delete empty bucket
$ gsutil rb gs://u18-evolute-library
Remove an empty bucket

Intermediate Operations

Intermediate File Operations (AWS CLI)
# Single file upload
$ aws s3 cp local-file.txt s3://bucket/path/
Upload a single file

Ubuntu
$ aws s3 cp /var/www/mainweb.com/public/upload/logo.png s3://u18-evolute-library/upload/

Windows
$ aws s3 cp "C:\Users\akashkumar\Downloads\images\logo.png" s3://u18-evolute-library/upload/
# Recursive upload (Upload All Files)
$ aws s3 cp /var/www/mainweb.com/public/upload/ s3://u18-evolute-library/upload/ --recursive
$ aws s3 cp "C:\Users\akashkumar\Downloads\images" s3://u18-evolute-library/upload/ --recursive
Upload folder recursively

# Recursive upload with public access
$ aws s3 cp /var/www/mainweb.com/public/upload/ s3://u18-evolute-library/upload/ --recursive --acl public-read
Upload folder recursively with public read access

# Make single file public read access
$ aws s3api put-object-acl --bucket my-bucket --key path/file.txt --acl public-read
Make single file public read access
# Download file
$ aws s3 cp s3://u18-evolute-library/upload/logo.png logo.png
Download a file from S3 to the local file logo.png
# Upload specific file types
$ aws s3 cp /var/www/mainweb.com/public/upload/ s3://u18-evolute-library/upload/ --recursive --exclude "*" --include "*.jpg" --include "*.png" --include "*.mp4"
# Delete file
$ aws s3 rm s3://u18-evolute-library/upload/logo.png
Delete a specific file
Intermediate File Operations (gsutil)
# Single file upload
$ gsutil cp local-file.txt gs://bucket/path/
Upload a single file

Ubuntu
$ gsutil cp /var/www/mainweb.com/public/upload/logo.png gs://u18-evolute-library/upload/

Windows
$ gsutil cp "C:\Users\akashkumar\Downloads\images\logo.png" gs://u18-evolute-library/upload/
# Upload all files with a wildcard
$ gsutil cp /var/www/mainweb.com/public/upload/* gs://u18-evolute-library/upload/
$ gsutil cp "C:\Users\akashkumar\Downloads\images\*" gs://u18-evolute-library/upload/
Upload every file in the directory (non-recursive)
# Recursive upload
$ gsutil cp -r /var/www/mainweb.com/public/upload/ gs://u18-evolute-library/upload/
Upload folder recursively

# Make files public
$ gsutil -m acl ch -u AllUsers:R gs://u18-evolute-library/upload/**
# Make single file public
$ gsutil acl ch -u AllUsers:R gs://bucket/file.txt
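Note: buckets with uniform bucket-level access enabled reject per-object ACL changes; in that case, one option is to grant read access bucket-wide through IAM instead:
$ gsutil iam ch allUsers:objectViewer gs://u18-evolute-library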
# Download file
$ gsutil cp gs://u18-evolute-library/upload/file.txt local-file.txt
Download a file from GCS
# Upload specific file types
$ gsutil -m cp /var/www/mainweb.com/public/upload/*.{jpg,png,gif} gs://u18-evolute-library/upload/
$ gsutil -m cp /var/www/mainweb.com/public/upload/*.mp4 gs://u18-evolute-library/upload/videos/
# Delete file
$ gsutil rm gs://u18-evolute-library/upload/file.txt
Delete a specific file

Advanced Operations

Advanced Sync & Management (AWS CLI)
# Sync directories
$ aws s3 sync /var/www/mainweb.com/public/upload/ s3://u18-evolute-library/upload/
Sync local directory to S3
# Sync with deletion
$ aws s3 sync /var/www/mainweb.com/public/upload/ s3://u18-evolute-library/upload/ --delete
Sync and delete remote files not present locally
# Preserve permissions
$ aws s3 sync /var/www/mainweb.com/public/upload/ s3://u18-evolute-library/upload/ --acl bucket-owner-full-control
Sync with specific ACL permissions
# Generate presigned URL
$ aws s3 presign s3://u18-evolute-library/upload/logo.png --expires-in 3600
Generate a temporary URL (expires in 1 hour)
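The command prints a URL that anyone can use until it expires; fetching it with curl is a quick test (the URL below is an illustrative placeholder, not a real signed link):
$ curl -o logo.png "https://u18-evolute-library.s3.ap-south-1.amazonaws.com/upload/logo.png?X-Amz-Algorithm=...&X-Amz-Signature=..."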
Advanced Sync & Management (gsutil)
# Sync directories
$ gsutil -m rsync -r /var/www/mainweb.com/public/upload/ gs://u18-evolute-library/upload/
Sync local directory to GCS

# Sync with deletion
$ gsutil -m rsync -r -d /var/www/mainweb.com/public/upload/ gs://u18-evolute-library/upload/
Sync and delete remote files not present locally
# Set permissions
$ gsutil -m acl ch -u AllUsers:R gs://u18-evolute-library/upload/file.txt
Make a file publicly readable
$ gsutil -m acl ch -u AllUsers:R gs://u18-evolute-library/upload/**
Make all files publicly readable
# Generate signed URL
$ gsutil signurl key-file gs://u18-evolute-library/upload/file.txt
Generate a signed URL for temporary access (key-file is a service-account private key in JSON format)
$ gsutil signurl -d 1h service-account.json gs://bucket/file.txt
Generate a signed URL valid for 1 hour

Pro Commands

Pro AWS S3 Advanced Features
# Enable website hosting
$ aws s3 website s3://u18-evolute-library --index-document index.html
Configure bucket for static website hosting
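Website hosting also requires the objects to be publicly readable. A minimal sketch of an accompanying bucket policy (the policy file name is illustrative):
$ cat > website-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [{
    "Sid": "PublicReadGetObject",
    "Effect": "Allow",
    "Principal": "*",
    "Action": "s3:GetObject",
    "Resource": "arn:aws:s3:::u18-evolute-library/*"
  }]
}
EOF
$ aws s3api put-bucket-policy --bucket u18-evolute-library --policy file://website-policy.json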
# Set lifecycle policy
$ aws s3api put-bucket-lifecycle-configuration --bucket bucket-name --lifecycle-configuration file://policy.json
Apply lifecycle rules from JSON file
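As a sketch, a policy.json that moves objects under upload/ to Glacier after 30 days and expires them after a year (the rule ID and day counts are examples):
$ cat > policy.json <<'EOF'
{
  "Rules": [{
    "ID": "archive-then-expire",
    "Status": "Enabled",
    "Filter": { "Prefix": "upload/" },
    "Transitions": [{ "Days": 30, "StorageClass": "GLACIER" }],
    "Expiration": { "Days": 365 }
  }]
}
EOF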
# Cross-region replication
$ aws s3api put-bucket-replication --bucket source-bucket --replication-configuration file://replication.json
Configure replication to another region
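Replication requires versioning enabled on both buckets and an IAM role that S3 can assume. A sketch of replication.json (the role ARN, account ID, and destination bucket are placeholders):
$ cat > replication.json <<'EOF'
{
  "Role": "arn:aws:iam::123456789012:role/s3-replication-role",
  "Rules": [{
    "Status": "Enabled",
    "Priority": 1,
    "Filter": {},
    "DeleteMarkerReplication": { "Status": "Disabled" },
    "Destination": { "Bucket": "arn:aws:s3:::destination-bucket" }
  }]
}
EOF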
# Batch operations
$ aws s3control create-job --account-id 123456789012 --operation '{"S3PutObjectCopy": {...}}' --manifest {...}
Create a batch operation job
# Automation (cron)
# Add to the crontab (crontab -e) to sync every hour:
0 * * * * aws s3 sync "/var/www/mainweb.com/public/upload/" s3://u18-evolute-library/upload/ --acl public-read
Hourly cron job that syncs the upload directory to S3
Pro Tip: Use S3 Transfer Acceleration for faster uploads to distant regions by adding --endpoint-url https://s3-accelerate.amazonaws.com
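Transfer Acceleration must first be enabled on the bucket:
$ aws s3api put-bucket-accelerate-configuration --bucket u18-evolute-library --accelerate-configuration Status=Enabled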
Pro Google Cloud Storage Advanced Features
# Set lifecycle policy
$ gsutil lifecycle set lifecycle.json gs://u18-evolute-library
Apply lifecycle rules from JSON file
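As a sketch, a lifecycle.json that moves objects to Coldline after 30 days and deletes them after a year (the age thresholds are examples):
$ cat > lifecycle.json <<'EOF'
{
  "rule": [
    { "action": { "type": "SetStorageClass", "storageClass": "COLDLINE" }, "condition": { "age": 30 } },
    { "action": { "type": "Delete" }, "condition": { "age": 365 } }
  ]
}
EOF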
# Enable versioning
$ gsutil versioning set on gs://u18-evolute-library
Enable object versioning for the bucket
# Change storage class
$ gsutil rewrite -s coldline gs://u18-evolute-library/object
Change object storage class to Coldline
# Parallel composite uploads
$ gsutil -o GSUtil:parallel_composite_upload_threshold=50M cp largefile.gz gs://u18-evolute-library/
Enable parallel uploads for large files
# Automation (cron)
# Add to the crontab (crontab -e) to sync every hour and re-apply public ACLs:
0 * * * * gsutil -m rsync -r "/var/www/mainweb.com/public/upload/" gs://u18-evolute-library/upload && gsutil -m acl ch -u AllUsers:R gs://u18-evolute-library/upload/**
Hourly cron job that syncs the upload directory to GCS and makes the objects public
Pro Tip: Use the -m flag for multi-threaded operations to significantly speed up transfers: gsutil -m cp -r large-dir gs://bucket/
Warning: Always test destructive operations like sync with delete first: aws s3 sync supports --dryrun and gsutil rsync supports -n to preview what would be transferred or deleted.
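For example, previewing the sync-with-delete commands from above before running them for real:
$ aws s3 sync /var/www/mainweb.com/public/upload/ s3://u18-evolute-library/upload/ --delete --dryrun
$ gsutil -m rsync -r -d -n /var/www/mainweb.com/public/upload/ gs://u18-evolute-library/upload/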

Automation Scripts

Windows Batch Script (sync-to-cloud.bat)
@echo off
echo Starting sync to AWS S3...
aws s3 sync "C:\Users\akashkumar\Downloads\images" s3://u18-evolute-library/upload/ --delete
echo.
echo Starting sync to Google Cloud Storage...
gsutil -m rsync -r -d "C:\Users\akashkumar\Downloads\images" gs://u18-evolute-library/upload
echo.
echo Sync completed!
pause
Ubuntu Shell Script (sync-to-cloud.sh)
#!/bin/bash
echo "Starting sync to AWS S3..."
aws s3 sync /var/www/mainweb.com/public/upload/ s3://u18-evolute-library/upload/ --delete
echo ""
echo "Starting sync to Google Cloud Storage..."
gsutil -m rsync -r -d /var/www/mainweb.com/public/upload/ gs://u18-evolute-library/upload
echo ""
echo "Sync completed!"
Windows Backup Script (backup-to-s3.bat)
@echo off
echo Starting backup to AWS S3...
aws s3 sync "C:\Users\akashkumar\Downloads\images" s3://u18-evolute-library/upload/ --delete
if %errorlevel% neq 0 (
  echo Backup failed!
  exit /b 1
)
echo Backup completed successfully!
pause
Linux Backup Script (backup-to-gcs.sh)
#!/bin/bash

# Configuration
LOCAL_DIR="/var/www/mainweb.com/public/upload/"
BUCKET="gs://u18-evolute-library/upload"
LOG_FILE="/var/log/backup.log"

echo "$(date): Starting backup" >> $LOG_FILE
gsutil -m rsync -r -d $LOCAL_DIR $BUCKET >> $LOG_FILE 2>&1

if [ $? -eq 0 ]; then
  echo "$(date): Backup completed successfully" >> $LOG_FILE
else
  echo "$(date): Backup failed" >> $LOG_FILE
  exit 1
fi
Windows Backup Script (backup-to-cloud.bat)
@echo off
echo Starting backup to AWS S3...
aws s3 sync "C:\Users\akashkumar\Downloads\images" s3://u18-evolute-library/upload/ --delete
if %errorlevel% neq 0 (
  echo AWS backup failed!
  exit /b 1
)
echo.
echo Starting backup to Google Cloud...
gsutil -m rsync -r -d "C:\Users\akashkumar\Downloads\images" gs://u18-evolute-library/upload
if %errorlevel% neq 0 (
  echo GCP backup failed!
  exit /b 1
)
echo Backup completed successfully!
pause
Linux Backup Script (backup-to-cloud.sh)
#!/bin/bash

# Configuration
LOCAL_DIR="/var/www/mainweb.com/public/upload/"
AWS_BUCKET="s3://u18-evolute-library/upload/"
GCP_BUCKET="gs://u18-evolute-library/upload"
LOG_FILE="/var/log/cloud-backup.log"

echo "$(date): Starting backup" >> $LOG_FILE

# AWS Backup
aws s3 sync $LOCAL_DIR $AWS_BUCKET --delete >> $LOG_FILE 2>&1
if [ $? -eq 0 ]; then
  echo "$(date): AWS backup completed" >> $LOG_FILE
else
  echo "$(date): AWS backup failed" >> $LOG_FILE
  exit 1
fi

# GCP Backup
gsutil -m rsync -r -d $LOCAL_DIR $GCP_BUCKET >> $LOG_FILE 2>&1
if [ $? -eq 0 ]; then
  echo "$(date): GCP backup completed" >> $LOG_FILE
else
  echo "$(date): GCP backup failed" >> $LOG_FILE
  exit 1
fi

echo "$(date): All backups completed successfully" >> $LOG_FILE
Note: For production environments, consider using IAM roles (AWS) or Service Accounts (GCP) instead of access keys for better security.

Pro Tips

  • Use the -m flag with gsutil for parallel operations (faster transfers)
  • Always test sync commands with a dry run first (--dryrun for aws s3, -n for gsutil rsync) to see what would be transferred or deleted
  • For production environments, consider using IAM roles (AWS) or Service Accounts (GCP) instead of access keys
  • Set up lifecycle policies to automatically manage object expiration and storage class transitions