AWS S3 vs Google Cloud Storage

Complete command reference with your specific paths

Path Reference
Ubuntu: /var/www/mainweb.com/public/upload/
Windows: C:\Users\akashkumar\Downloads\images
AWS S3: s3://cloudallinone-library/upload/
Google Cloud: gs://cloudallinone-library/upload/

Setup & Configuration

Pro Tip: For production environments, use IAM roles (AWS) or service accounts (GCP) instead of long-lived access keys for better security.
Amazon S3 (AWS CLI)
Installation & Configuration
# Ubuntu installation
$ sudo apt update
$ sudo apt install awscli -y
$ aws --version
# Configure AWS CLI
$ aws configure
AWS Access Key ID [None]: YOUR_ACCESS_KEY
AWS Secret Access Key [None]: YOUR_SECRET_KEY
Default region name [None]: ap-south-1
Default output format [None]: json
# Verify configuration
$ aws sts get-caller-identity
Google Cloud Storage (gsutil)
Installation & Configuration
# Install Google Cloud SDK (installer script)
$ curl https://sdk.cloud.google.com | bash
$ exec -l $SHELL
# OR install via apt
$ sudo apt install google-cloud-cli -y
$ gcloud --version
# Configure gcloud and gsutil
$ gcloud auth login
$ gcloud projects list
$ gcloud config set project PROJECT_ID
# Verify configuration
$ gsutil ls
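Following the pro tip above, servers and CI jobs can authenticate gsutil non-interactively with a service account instead of a personal login. A minimal sketch, assuming a service account and key file you have already created (the account name and key path below are placeholders):
# Authenticate gsutil with a service account key (placeholder names)
$ gcloud auth activate-service-account storage-sync@PROJECT_ID.iam.gserviceaccount.com --key-file=/path/to/service-account.json
$ gsutil ls
List buckets using the service account credentials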

Basic Operations

Basic Bucket Operations
# Create bucket
$ aws s3 mb s3://cloudallinone-library --region ap-south-1
Create a new bucket in specific region
# List buckets
$ aws s3 ls
List all buckets in your account
# List bucket contents
$ aws s3 ls s3://cloudallinone-library/
List objects in a bucket
# Delete empty bucket
$ aws s3 rb s3://cloudallinone-library
Remove an empty bucket
$ aws s3 rb s3://bucket-name --force
rb fails if the bucket is not empty; add --force to delete the bucket along with its contents
Basic Bucket Operations
# Create bucket
$ gsutil mb -l asia-south1 gs://cloudallinone-library
Create a new bucket in specific location
# List buckets
$ gsutil ls
List all buckets in your project
# List bucket contents
$ gsutil ls gs://cloudallinone-library/
List objects in a bucket
# Delete empty bucket
$ gsutil rb gs://cloudallinone-library
Remove an empty bucket

Intermediate Operations

Intermediate File Operations
# Single file upload
$ aws s3 cp local-file.txt s3://bucket/path/
Upload a single file

Ubuntu
$ aws s3 cp /var/www/mainweb.com/public/upload/logo.png s3://cloudallinone-library/upload/

Windows
$ aws s3 cp "C:\Users\akashkumar\Downloads\images\logo.png" s3://cloudallinone-library/upload/
# Recursive upload (Upload All Files)
$ aws s3 cp /var/www/mainweb.com/public/upload/ s3://cloudallinone-library/upload/ --recursive
$ aws s3 cp "C:\Users\akashkumar\Downloads\images" s3://cloudallinone-library/upload/ --recursive
Upload folder recursively

# Recursive upload with public access
$ aws s3 cp /var/www/mainweb.com/public/upload/ s3://cloudallinone-library/upload/ --recursive --acl public-read
Upload folder recursively with public read access

# Make single file public read access
$ aws s3api put-object-acl --bucket my-bucket --key path/file.txt --acl public-read
Make single file public read access
# Download file
$ aws s3 cp s3://cloudallinone-library/upload/logo.png ./logo.png
Download a file from S3
# Upload specific file types
$ aws s3 cp /var/www/mainweb.com/public/upload/ s3://cloudallinone-library/upload/ --recursive --exclude "*" --include "*.jpg" --include "*.png" --include "*.mp4"
# Delete file
$ aws s3 rm s3://cloudallinone-library/upload/logo.png
Delete a specific file
Intermediate File Operations
# Single file upload
$ gsutil cp local-file.txt gs://bucket/path/
Upload a single file

Ubuntu
$ gsutil cp /var/www/mainweb.com/public/upload/logo.png gs://cloudallinone-library/upload/

Windows
$ gsutil cp "C:\Users\akashkumar\Downloads\images\logo.png" gs://cloudallinone-library/upload/
# Upload all files in a folder (wildcard)
$ gsutil cp /var/www/mainweb.com/public/upload/* gs://cloudallinone-library/upload/
$ gsutil cp "C:\Users\akashkumar\Downloads\images\*" gs://cloudallinone-library/upload/
# Recursive upload
$ gsutil cp -r /var/www/mainweb.com/public/upload/ gs://cloudallinone-library/upload/
Upload folder recursively

# Make files public
$ gsutil -m acl ch -u AllUsers:R gs://cloudallinone-library/upload/**
# Make single file public
$ gsutil acl ch -u AllUsers:R gs://bucket/file.txt
# Download file
$ gsutil cp gs://cloudallinone-library/upload/file.txt local-file.txt
Download a file from GCS
# Upload specific file types
$ gsutil -m cp /var/www/mainweb.com/public/upload/*.{jpg,png,gif} gs://cloudallinone-library/upload/
$ gsutil -m cp /var/www/mainweb.com/public/upload/*.mp4 gs://cloudallinone-library/upload/videos/
# Delete file
$ gsutil rm gs://cloudallinone-library/upload/file.txt
Delete a specific file

Advanced Operations

Advanced Sync & Management
# Sync directories
$ aws s3 sync /var/www/mainweb.com/public/upload/ s3://cloudallinone-library/upload/
Sync local directory to S3
# Sync with deletion
$ aws s3 sync /var/www/mainweb.com/public/upload/ s3://cloudallinone-library/upload/ --delete
Sync and delete remote files not present locally
# Preserve permissions
$ aws s3 sync /var/www/mainweb.com/public/upload/ s3://cloudallinone-library/upload/ --acl bucket-owner-full-control
Sync with specific ACL permissions
# Generate presigned URL
$ aws s3 presign s3://cloudallinone-library/upload/logo.png --expires-in 3600
Generate a temporary URL (expires in 1 hour)
Advanced Sync & Management
# Sync directories
$ gsutil -m rsync -r /var/www/mainweb.com/public/upload/ gs://cloudallinone-library/upload/
Sync local directory to GCS

# Make files public
$ gsutil -m acl ch -u AllUsers:R gs://cloudallinone-library/upload/**
# Sync with deletion
$ gsutil -m rsync -r -d /var/www/mainweb.com/public/upload/ gs://cloudallinone-library/upload/
Sync and delete remote files not present locally
# Set permissions
$ gsutil -m acl ch -u AllUsers:R gs://cloudallinone-library/upload/file.txt
Make a file publicly readable
$ gsutil -m acl ch -u AllUsers:R gs://cloudallinone-library/upload/**
Make all files publicly readable
# Generate signed URL
$ gsutil signurl key-file gs://cloudallinone-library/upload/file.txt
Generate a signed URL for temporary access
$ gsutil signurl -d 1h service-account.json gs://bucket/file.txt
Generate a signed URL valid for 1 hour
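signurl needs a service-account private key file. If you do not already have one, a minimal sketch of creating it (the account name is a placeholder; you need permission to create keys, and key files should be stored securely):
# Create a key file for an existing service account (placeholder name)
$ gcloud iam service-accounts keys create service-account.json --iam-account=storage-sync@PROJECT_ID.iam.gserviceaccount.com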

Pro Commands

Pro AWS S3 Advanced Features
# Enable website hosting
$ aws s3 website s3://cloudallinone-library --index-document index.html
Configure bucket for static website hosting
# Set lifecycle policy
$ aws s3api put-bucket-lifecycle-configuration --bucket bucket-name --lifecycle-configuration file://policy.json
Apply lifecycle rules from JSON file
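The lifecycle rules live in the JSON file referenced by file://policy.json. A minimal sketch of such a file, assuming you want objects under upload/ moved to Glacier after 30 days and deleted after a year (the rule ID, prefix, and day counts are illustrative):
# Create an example policy.json, then apply it
$ cat > policy.json <<'EOF'
{
  "Rules": [
    {
      "ID": "archive-then-expire",
      "Filter": {"Prefix": "upload/"},
      "Status": "Enabled",
      "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}],
      "Expiration": {"Days": 365}
    }
  ]
}
EOF
$ aws s3api put-bucket-lifecycle-configuration --bucket cloudallinone-library --lifecycle-configuration file://policy.json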
# Cross-region replication
$ aws s3api put-bucket-replication --bucket source-bucket --replication-configuration file://replication.json
Configure replication to another region
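Replication requires versioning enabled on both buckets and an IAM role that S3 can assume. A minimal sketch of a replication.json, with placeholder role and destination-bucket ARNs to replace with your own:
# Example replication.json (placeholder role and destination bucket)
$ cat > replication.json <<'EOF'
{
  "Role": "arn:aws:iam::123456789012:role/s3-replication-role",
  "Rules": [
    {
      "ID": "replicate-upload",
      "Status": "Enabled",
      "Priority": 1,
      "Filter": {"Prefix": "upload/"},
      "DeleteMarkerReplication": {"Status": "Disabled"},
      "Destination": {"Bucket": "arn:aws:s3:::destination-bucket"}
    }
  ]
}
EOF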
# Batch operations
$ aws s3control create-job --account-id 123456789012 --operation '{"S3PutObjectCopy": {...}}' --manifest {...}
Create a batch operation job
# Automation via cron (add this line to your crontab; runs hourly)
0 * * * * aws s3 sync "/var/www/mainweb.com/public/upload/" s3://cloudallinone-library/upload/ --acl public-read
Hourly cron job that syncs the upload folder to S3
Pro Tip: Use S3 Transfer Acceleration for faster uploads to distant regions by adding --endpoint-url https://s3-accelerate.amazonaws.com
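Acceleration must be enabled on the bucket once before the accelerate endpoint works. A minimal sketch, assuming the bucket from the examples above (largefile.zip is a placeholder):
# Enable Transfer Acceleration on the bucket (one-time setup)
$ aws s3api put-bucket-accelerate-configuration --bucket cloudallinone-library --accelerate-configuration Status=Enabled
# Upload through the accelerate endpoint
$ aws s3 cp largefile.zip s3://cloudallinone-library/upload/ --endpoint-url https://s3-accelerate.amazonaws.com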
Pro Google Cloud Storage Advanced Features
# Set lifecycle policy
$ gsutil lifecycle set lifecycle.json gs://cloudallinone-library
Apply lifecycle rules from JSON file
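gsutil expects the rules in the JSON API's lifecycle format. A minimal sketch of a lifecycle.json that moves objects to Coldline after 30 days and deletes them after a year (the ages are illustrative):
# Create an example lifecycle.json, then apply it
$ cat > lifecycle.json <<'EOF'
{
  "rule": [
    {"action": {"type": "SetStorageClass", "storageClass": "COLDLINE"}, "condition": {"age": 30}},
    {"action": {"type": "Delete"}, "condition": {"age": 365}}
  ]
}
EOF
$ gsutil lifecycle set lifecycle.json gs://cloudallinone-library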
# Enable versioning
$ gsutil versioning set on gs://cloudallinone-library
Enable object versioning for the bucket
# Change storage class
$ gsutil rewrite -s coldline gs://cloudallinone-library/object
Change object storage class to Coldline
# Parallel composite uploads
$ gsutil -o GSUtil:parallel_composite_upload_threshold=50M cp largefile.gz gs://cloudallinone-library/
Enable parallel uploads for large files
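To make this the default instead of passing -o on every command, the threshold can be set in the ~/.boto configuration file (the 150M value is just an example; downloading composite objects may require the crcmod module):
# Persist the setting in ~/.boto
$ cat >> ~/.boto <<'EOF'
[GSUtil]
parallel_composite_upload_threshold = 150M
EOF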
# Automation via cron (add this line to your crontab; runs hourly)
0 * * * * gsutil -m rsync -r "/var/www/mainweb.com/public/upload/" gs://cloudallinone-library/upload && gsutil -m acl ch -u AllUsers:R gs://cloudallinone-library/upload/**
Hourly cron job that syncs the upload folder to GCS and makes the synced objects publicly readable
Pro Tip: Use the -m flag for multi-threaded operations to significantly speed up transfers: gsutil -m cp -r large-dir gs://bucket/
Warning: Always test destructive operations such as sync with delete using a dry run first: --dryrun for aws s3 sync, -n for gsutil rsync.
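For example, using the paths from this guide, these commands only report what would be copied or deleted without changing anything:
# Preview a destructive sync before running it for real
$ aws s3 sync /var/www/mainweb.com/public/upload/ s3://cloudallinone-library/upload/ --delete --dryrun
$ gsutil -m rsync -r -d -n /var/www/mainweb.com/public/upload/ gs://cloudallinone-library/upload/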

Automation Scripts

Windows Batch Script (sync-to-cloud.bat)
@echo off
echo Starting sync to AWS S3...
aws s3 sync "C:\Users\akashkumar\Downloads\images" s3://cloudallinone-library/upload/ --delete
echo.
echo Starting sync to Google Cloud Storage...
gsutil -m rsync -r -d "C:\Users\akashkumar\Downloads\images" gs://cloudallinone-library/upload
echo.
echo Sync completed!
pause
Ubuntu Shell Script (sync-to-cloud.sh)
#!/bin/bash
echo "Starting sync to AWS S3..."
aws s3 sync /var/www/mainweb.com/public/upload/ s3://cloudallinone-library/upload/ --delete
echo ""
echo "Starting sync to Google Cloud Storage..."
gsutil -m rsync -r -d /var/www/mainweb.com/public/upload/ gs://cloudallinone-library/upload
echo ""
echo "Sync completed!"
Windows Backup Script (backup-to-s3.bat)
@echo off
echo Starting backup to AWS S3...
aws s3 sync "C:\Users\akashkumar\Downloads\images" s3://cloudallinone-library/upload/ --delete
if %errorlevel% neq 0 (
  echo Backup failed!
  exit /b 1
)
echo Backup completed successfully!
pause
Linux Backup Script (backup-to-gcs.sh)
#!/bin/bash

# Configuration
LOCAL_DIR="/var/www/mainweb.com/public/upload/"
BUCKET="gs://cloudallinone-library/upload"
LOG_FILE="/var/log/backup.log"

echo "$(date): Starting backup" >> $LOG_FILE
gsutil -m rsync -r -d $LOCAL_DIR $BUCKET >> $LOG_FILE 2>&1

if [ $? -eq 0 ]; then
  echo "$(date): Backup completed successfully" >> $LOG_FILE
else
  echo "$(date): Backup failed" >> $LOG_FILE
  exit 1
fi
Windows Backup Script (backup-to-cloud.bat)
@echo off
echo Starting backup to AWS S3...
aws s3 sync C:\Users\akashkumar\Downloads\images s3://cloudallinone-library/upload/ --delete
if %errorlevel% neq 0 (
  echo AWS backup failed!
  exit /b 1
)
echo.
echo Starting backup to Google Cloud...
gsutil -m rsync -r -d C:\Users\akashkumar\Downloads\images gs://cloudallinone-library/upload
if %errorlevel% neq 0 (
  echo GCP backup failed!
  exit /b 1
)
echo Backup completed successfully!
pause
Linux Backup Script (backup-to-cloud.sh)
#!/bin/bash

# Configuration
LOCAL_DIR="/var/www/mainweb.com/public/upload/"
AWS_BUCKET="s3://cloudallinone-library/upload/"
GCP_BUCKET="gs://cloudallinone-library/upload"
LOG_FILE="/var/log/cloud-backup.log"

echo "$(date): Starting backup" >> $LOG_FILE

# AWS Backup
aws s3 sync $LOCAL_DIR $AWS_BUCKET --delete >> $LOG_FILE 2>&1
if [ $? -eq 0 ]; then
  echo "$(date): AWS backup completed" >> $LOG_FILE
else
  echo "$(date): AWS backup failed" >> $LOG_FILE
  exit 1
fi

# GCP Backup
gsutil -m rsync -r -d $LOCAL_DIR $GCP_BUCKET >> $LOG_FILE 2>&1
if [ $? -eq 0 ]; then
  echo "$(date): GCP backup completed" >> $LOG_FILE
else
  echo "$(date): GCP backup failed" >> $LOG_FILE
  exit 1
fi

echo "$(date): All backups completed successfully" >> $LOG_FILE

Feature Comparison

AWS S3 vs Google Cloud Storage Feature Comparison
Feature | AWS S3 | Google Cloud Storage
Storage Classes | S3 Standard, Intelligent-Tiering, Standard-IA, One Zone-IA, Glacier, Glacier Deep Archive | Standard, Nearline, Coldline, Archive
Availability | 99.99% for Standard | 99.95% for Standard
Durability | 99.999999999% (11 9's) | 99.999999999% (11 9's)
Maximum Object Size | 5 TB | 5 TB
Transfer Acceleration | Yes (S3 Transfer Acceleration) | No (but supports parallel composite uploads)
Versioning | Yes | Yes
Object Lock | Yes | Yes (retention policies)
Requester Pays | Yes | Yes
Static Website Hosting | Yes | Yes (with some limitations)
Pricing Model | Per GB/month, requests, data transfer | Per GB/month, operations, data retrieval, network egress

Cost Estimation

Simple Cost Estimator

Estimate monthly costs for your storage needs (prices are approximate and vary by region)

Note: This is a simplified estimation. Actual costs depend on storage class, region, number of operations, and other factors. Always check the official pricing pages for accurate calculations.
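For a rough back-of-the-envelope number you can compute storage plus egress yourself; the per-GB rates below are placeholders for Standard-class storage and must be replaced with the current list prices for your region:
# Rough monthly estimate (placeholder rates; adjust to current pricing)
$ STORAGE_GB=500; EGRESS_GB=50
$ awk -v s="$STORAGE_GB" -v e="$EGRESS_GB" 'BEGIN { printf "AWS S3 Standard  ~ $%.2f/month\nGCS Standard     ~ $%.2f/month\n", s*0.023 + e*0.09, s*0.020 + e*0.12 }'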

Pro Tips

  • Use the -m flag with gsutil for parallel operations (faster transfers)
  • Always test sync commands with --dryrun (aws s3 sync) or -n (gsutil rsync) first to see what would be transferred
  • For production environments, consider using IAM roles (AWS) or Service Accounts (GCP) instead of access keys
  • Set up lifecycle policies to automatically manage object expiration and storage class transitions
  • Monitor your storage costs regularly using AWS Cost Explorer or Google Cloud Billing reports
  • Use storage classes appropriately based on access patterns to optimize costs
  • Consider using CDN (CloudFront for AWS, Cloud CDN for GCP) for frequently accessed content to reduce egress costs