Backing up Data using AWS Lambda

Poojitha Ravuri
Apr 14, 2022

Hey all, let’s talk about a use-case of AWS Lambda in this tutorial!

Objective: If you upload an object in one S3 bucket, the backup of the object will automatically be stored in another bucket. The automatic backup happens with the help of the Lambda service.

Services needed for our use case: S3, Lambda, and IAM.

Let’s start!!

Step 1

Go to https://aws.amazon.com/ and create an account (a credit card is required, for verification purposes only) if you do not have one; otherwise, log in to the console.

Step 2

Search for S3 in the search bar of the console.

Choose S3

When you open the S3 service, you'll see a page listing all of the storage buckets you have created. If this is your first time, the list will naturally be empty.

Create two S3 buckets and give them suitable names. Note that bucket names must be globally unique across all AWS accounts.
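Bucket names must also follow S3's naming rules: 3–63 characters; only lowercase letters, digits, hyphens, and dots; and they must begin and end with a letter or digit. A quick sketch of a name checker (the regex is my own simplification, not an official AWS validator):

```javascript
// Simplified check of S3 bucket naming rules: 3-63 chars, lowercase
// letters, digits, hyphens, and dots; must start and end with a
// letter or digit. (Sketch only, not the full AWS rule set.)
function isValidBucketName(name) {
    return /^[a-z0-9][a-z0-9.-]{1,61}[a-z0-9]$/.test(name);
}

console.log(isValidBucketName("my-backup-bucket")); // true
console.log(isValidBucketName("My_Bucket"));        // false (uppercase and underscore)
```

Running names through a check like this before clicking Create bucket saves a round trip through the console's error messages.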

On clicking Create bucket, you will be taken to a configuration page. This is the most important part, so follow carefully.

Step 3

By default, Object Ownership is set to ACLs disabled. This can lead to problems later, so choose the ACLs enabled option instead.

As stated in the Amazon documentation: Amazon S3 access control lists (ACLs) enable you to manage access to buckets and objects. Each bucket and object has an ACL attached to it as a subresource. It defines which AWS accounts or groups are granted access and the type of access.

On choosing the ACL enabled option, this is what it looks like

By default, public access to the bucket and its objects is blocked. That won't work for us here, as our application may involve users uploading files/folders that have to be stored on S3.

Uncheck Block all public access and acknowledge the warning. Allowing public access to the contents of your bucket is of course risky, but remember that this is only a development setup!

The other options can be left as is; go ahead and create your bucket.

Create another bucket for storing backup files in a similar way.

Check the created buckets on the S3 dashboard.

Step 4 — Creating an IAM role and policy

Search for IAM in the search box and select it.

Now we need to create an IAM role and attach to it the policy that we will create in the following steps.

On the left side of dashboard, choose Policies.

Click on create policy.

Click on JSON and paste the policy below:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:GetObject"
            ],
            "Resource": [
                "YOUR SOURCE BUCKET ARN/*"
            ]
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:PutObject"
            ],
            "Resource": [
                "YOUR DESTINATION BUCKET ARN/*"
            ]
        }
    ]
}
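The same policy document can also be generated programmatically, which helps when you script this setup. A small sketch (the ARNs and the function name are my own illustrations, not part of any AWS API):

```javascript
// Build the backup policy: read (GetObject) on the source bucket's
// objects, write (PutObject) on the destination bucket's objects.
function buildBackupPolicy(sourceArn, destinationArn) {
    return {
        Version: "2012-10-17",
        Statement: [
            { Effect: "Allow", Action: ["s3:GetObject"], Resource: [sourceArn + "/*"] },
            { Effect: "Allow", Action: ["s3:PutObject"], Resource: [destinationArn + "/*"] }
        ]
    };
}

// Example with placeholder bucket ARNs:
console.log(JSON.stringify(buildBackupPolicy(
    "arn:aws:s3:::my-source-bucket",
    "arn:aws:s3:::my-backup-bucket"
), null, 2));
```

Note the `/*` suffix on each Resource: GetObject and PutObject act on objects, so the policy must target the objects inside the bucket, not the bucket ARN itself.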

To get an ARN, go to the S3 dashboard, select your bucket, and click Copy ARN.

After pasting the policy, choose Next: Tags, add tags if required, and click Next: Review.

Give the policy a name and click Create policy.

Attach the policy to the Role:

Click on Roles, and then on Create role.

Select Lambda as the use case.

Click Next and attach the policy which was created above.

Click Next and give the role a name.

Click on Create role.

Step 5 — Creating Lambda function

Click on Create function and fill in the basic details. Under Execution role, choose Use an existing role and select the role which was created above. Use Node.js as the runtime.

Pro tip: Ensure the Lambda function and the S3 buckets are in the same region.

Click on create function.

Paste the code below into the editor. Replace sourceBucketName and destinationBucketName with your S3 bucket names.

var AWS = require("aws-sdk");

exports.handler = (event, context, callback) => {
    var s3 = new AWS.S3();
    var sourceBucket = "sourceBucketName";
    var destinationBucket = "destinationBucketName";
    // Keys in S3 event records arrive URL-encoded (spaces become "+"),
    // so decode before using the key.
    var objectKey = decodeURIComponent(event.Records[0].s3.object.key.replace(/\+/g, " "));
    var copyParams = {
        Bucket: destinationBucket,
        CopySource: encodeURI(sourceBucket + "/" + objectKey),
        Key: objectKey
    };
    s3.copyObject(copyParams, function (err, data) {
        if (err) {
            console.log(err, err.stack);
            callback(err);
        } else {
            console.log("S3 object copied successfully");
            callback(null, data);
        }
    });
};

Now, open your lambda function and add a trigger.

Choose S3 as the trigger, select your source bucket, keep the event type as All object create events, and add the trigger.
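For reference, the event the trigger delivers to the handler looks roughly like the sketch below (bucket and key names are illustrative). Note that the object key arrives URL-encoded, which is why the handler decodes it:

```javascript
// Minimal shape of an S3 "ObjectCreated:Put" event record
// (illustrative values, trimmed to the fields the handler reads).
var sampleEvent = {
    Records: [{
        eventName: "ObjectCreated:Put",
        s3: {
            bucket: { name: "my-source-bucket" },
            object: { key: "backups/my+file.txt", size: 1024 }
        }
    }]
};

// S3 URL-encodes keys in event notifications: spaces become "+",
// other special characters are percent-encoded.
var rawKey = sampleEvent.Records[0].s3.object.key;
var objectKey = decodeURIComponent(rawKey.replace(/\+/g, " "));
console.log(objectKey); // "backups/my file.txt"
```

Skipping the decoding step is a classic pitfall: copies work fine until someone uploads a file with a space in its name, and then the GetObject call fails with a NoSuchKey error.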

Step 6 — Testing our use-case

Go to S3 and open the source bucket. Upload some objects, and here comes the magic….

Added a file in source bucket

Open your backup bucket and, just like that, the objects you added to the source bucket have been copied into the destination bucket.

Backup file created in destination bucket.

Hope you enjoyed the tutorial. Do give claps and follow me for more such insightful tutorials if you like my work.

Thank you!
