This brief post will show you how to copy a file or files with the AWS CLI and with Python AWS Lambda functions. It will cover several examples:

  • copy files from S3 to a local machine
  • copy files from a local machine to an S3 bucket
  • AWS Lambda (Python) to copy S3 files

If you need to install the S3 client and download files on Linux Mint or Ubuntu, you can check this article: How to Download Files from S3 Bucket with AWS CLI on Linux Mint

Step 1: Copy files to local with AWS CLI

Once the S3 client is installed and correctly configured (check the link above), you can copy files from an S3 bucket to your local machine with the command:

aws s3 cp <S3 URI> <Local Path>

An example would be:

aws s3 cp s3://some-space_bucket/my-file.txt .

The command above copies the file s3://some-space_bucket/my-file.txt from the bucket to the current working folder.
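The same download can be done from Python. A minimal boto3 sketch, assuming the same bucket and key as above and credentials from the standard AWS configuration:

import boto3

# Download s3://some-space_bucket/my-file.txt to the current folder;
# equivalent to the aws s3 cp command above.
s3 = boto3.client('s3')
s3.download_file('some-space_bucket', 'my-file.txt', 'my-file.txt')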

Step 2: Copy files to local with different AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY

What if you need to get files from a bucket that uses a different AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY from the configured ones, and you don't want to store those keys on your machine?

In this case you can pass the keys inline as environment variables:

AWS_ACCESS_KEY_ID=xxx AWS_SECRET_ACCESS_KEY=xxx aws s3 cp s3://some-space_bucket/my-file.txt .
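If you want to do the same from Python without touching the environment, boto3 accepts the keys directly. A minimal sketch, assuming the xxx placeholders stand for the real keys:

import boto3

# Pass the one-off credentials explicitly instead of reading them
# from the environment or ~/.aws/credentials.
s3 = boto3.client(
    's3',
    aws_access_key_id='xxx',
    aws_secret_access_key='xxx',
)
s3.download_file('some-space_bucket', 'my-file.txt', 'my-file.txt')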

Step 3: Copy files from local to S3 bucket

If you need to copy a file from your local machine to an S3 bucket, you can use the following syntax:

aws s3 cp <Local Path> <S3 URI>

For example:

aws s3 cp ./test.txt s3://some-space_bucket/my-file.txt

This will copy the local file test.txt from the current working folder to the bucket path s3://some-space_bucket/my-file.txt.
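The boto3 counterpart of the upload is upload_file. A minimal sketch, mirroring the command above:

import boto3

# Upload ./test.txt to s3://some-space_bucket/my-file.txt;
# equivalent to the aws s3 cp command above.
s3 = boto3.client('s3')
s3.upload_file('./test.txt', 'some-space_bucket', 'my-file.txt')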

Step 4: Python AWS Lambda to copy S3 file(s)

If you need to create an AWS Lambda function that works with files in S3 buckets, you can check: Using an Amazon S3 trigger to invoke a Lambda function

The Python Lambda handler from that tutorial fetches the S3 object that triggered the event and prints its content type:

import json
import urllib.parse
import boto3

print('Loading function')

s3 = boto3.client('s3')


def lambda_handler(event, context):
    #print("Received event: " + json.dumps(event, indent=2))

    # Get the object from the event and show its content type
    bucket = event['Records'][0]['s3']['bucket']['name']
    key = urllib.parse.unquote_plus(event['Records'][0]['s3']['object']['key'], encoding='utf-8')
    try:
        response = s3.get_object(Bucket=bucket, Key=key)
        print("CONTENT TYPE: " + response['ContentType'])
        return response['ContentType']
    except Exception as e:
        print(e)
        print('Error getting object {} from bucket {}. Make sure they exist and your bucket is in the same region as this function.'.format(key, bucket))
        raise e
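To try the handler outside AWS you can feed it a hand-built event. A hypothetical local test with example bucket and key names (the get_object call still needs valid credentials):

# Hypothetical local test: a minimal S3 put event with example names.
if __name__ == '__main__':
    event = {
        'Records': [{
            's3': {
                'bucket': {'name': 'some-space_bucket'},
                'object': {'key': 'my-file.txt'},
            }
        }]
    }
    print(lambda_handler(event, None))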

Step 5: Python AWS Lambda to copy S3 files from one bucket to another

If you need to copy files from one bucket to another with an AWS Lambda function, you can use the next Python snippet:

import boto3

s3 = boto3.resource('s3')


def lambda_handler(event, context):
    bucket = s3.Bucket('some-space_bucket-1')
    dest_bucket = s3.Bucket('some-space_bucket-2')
    print(bucket)
    print(dest_bucket)

    # Iterate over every object in the source bucket and do a
    # server-side copy into the destination bucket under the same key.
    for obj in bucket.objects.all():
        dest_key = obj.key
        print(dest_key)
        s3.Object(dest_bucket.name, dest_key).copy_from(
            CopySource={'Bucket': obj.bucket_name, 'Key': obj.key})

This will copy all objects from bucket some-space_bucket-1 to bucket some-space_bucket-2.
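Note that copy_from performs a server-side copy, so the Lambda never downloads the object bodies. If you only need a subset of the objects, you can filter by prefix; a hedged variation of the loop above, with reports/ as an example prefix:

import boto3

s3 = boto3.resource('s3')
bucket = s3.Bucket('some-space_bucket-1')
dest_bucket = s3.Bucket('some-space_bucket-2')

# Copy only objects under an example prefix instead of the whole bucket.
for obj in bucket.objects.filter(Prefix='reports/'):
    s3.Object(dest_bucket.name, obj.key).copy_from(
        CopySource={'Bucket': obj.bucket_name, 'Key': obj.key})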