
Copy Files/Directories from AWS S3 bucket to your local machine.


An AWS S3 bucket can store a virtually unlimited number of objects, each up to 5 TB in size. Buckets let you keep related objects grouped together.

As a CloudOps or DevOps engineer, you may want to automate copying files and directories from an AWS S3 bucket to your local machine or to an EC2 instance.

This post shows you how to find the files and directories uploaded within the last hour and copy them from the AWS S3 bucket to your local machine or AWS EC2 instance using a shell script.

Before you develop the shell script, make sure that you are able to access the AWS S3 buckets from your machine. If you don't have permission, create a new IAM user account and generate an AWS Access Key and Secret Key.
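A quick way to verify the access (once the AWS CLI from the next step is installed) is to list the bucket. The bucket name below is just the example used later in this post, so substitute your own:

$ aws s3 ls s3://cloudishsoft_files/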

Install the AWS CLI package on your machine. Note that the aws-cli comes pre-installed on Amazon Linux 2 instances, in case you want to copy the files there.
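If the CLI is not installed yet, one common way on a 64-bit x86 Linux machine is the official AWS CLI v2 installer, after which you can store the Access Key and Secret Key from the previous step with aws configure:

$ curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"
$ unzip awscliv2.zip
$ sudo ./aws/install
$ aws configure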

So, let's create a new shell file named copy-s3-to-local.sh:

#!/bin/bash

BUCKET_NAME="cloudishsoft_files"
DATE=$(date "+%Y-%m-%d")
HOUR_NOW=$(date "+%H")
# Zero-pad the previous hour so it matches the "HH:" in the listing timestamps.
# (This assumes the script runs at 01:00 or later; at midnight the result is -1.)
HOUR=$(printf "%02d" $((10#$HOUR_NOW - 1)))

# Collect the keys of all objects from today that were modified during the previous hour.
FILENAME=($(aws s3 ls s3://"$BUCKET_NAME"/ --recursive | sort | grep "$DATE" | grep " $HOUR:" | awk '{print $4}'))

for FILE in "${FILENAME[@]}"
do
  sleep 2s
  # Each key refers to a single object, so no --recursive flag is needed here.
  aws s3 cp s3://"$BUCKET_NAME"/"$FILE" /your_destination/
done
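For reference, each line of the aws s3 ls --recursive output carries the date, time, size in bytes, and object key, roughly like the illustrative sample below. The key is the fourth column, which is why the script picks it with awk '{print $4}' (note that keys containing spaces would be cut off by this approach):

2024-05-10 13:05:22       1024 reports/daily/summary.txt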

Make the script executable using the command below,

$ chmod +x copy-s3-to-local.sh

or

$ chmod 755 copy-s3-to-local.sh

Finally, execute the script with the command below,

$ ./copy-s3-to-local.sh
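If you want the copy to run automatically every hour, as mentioned at the beginning, you could add a crontab entry like the sketch below; the script path is just an example, so adjust it to wherever you saved the file:

$ crontab -e

# run at five minutes past every hour
5 * * * * /home/ec2-user/copy-s3-to-local.sh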

I hope everything went fine and your files/directories were copied as expected.
