Uploading files to AWS S3 Buckets

Well... I said I was going to step away from AWS for a little bit, but it seems like it wasn't that easy. Today, I'm writing about uploading files to S3 using a Node.js backend service.

This will be a short tutorial divided into two parts: first the front-end development and then the backend work. So let's start.

Web site

I'm using an Angular application created with the Angular CLI, with Bootstrap as the front-end framework to style the website; however, in this tutorial I'm not going to focus on how to set all of this up. For UI notifications, we are using ngx-toastr (if you don't know about it, take a look at my review here).

To create the file upload component and give some styles, I used the following code:

<div class="custom-file" style="width: auto;">
  <input type="file" class="custom-file-input" 
         accept="application/pdf"
         (change)="upload($event.target.files, fileInput)" 
         id="customPreReadFile" #fileInput/>
  <label class="custom-file-label" for="customPreReadFile">{{getFileName()}}</label>
</div>

As you can see, we are allowing only PDF files, but this restriction can be disabled or modified to meet your needs.
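For instance, if you also wanted to allow images, both the accept attribute and the type check in the component could be generalized. Here's a minimal sketch of what that validation might look like; allowedTypes and isAllowedType are illustrative names, not part of the actual component:

// Hypothetical helper: returns true if the file's MIME type is in the allowed list.
// Adjust allowedTypes to whatever your application should accept.
const allowedTypes = ['application/pdf', 'image/png', 'image/jpeg'];

function isAllowedType(file: File): boolean {
  return allowedTypes.indexOf(file.type) !== -1;
}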

In the component code, we created two methods: the "upload" method, called on the change event, and "getFileName", which displays an instruction text or the name of the file if one was already selected. The code for both methods is as follows:

upload(files: FileList, fileInput: any) {
  if (files[0].type.indexOf("pdf") === -1) {
    this.toastr.error("The file selected is not a PDF.", "Error");
    fileInput.value = "";
    return;
  }
  // Keep a reference to the selected file so getFileName() can display its name.
  this.file = files[0];
  this.toastr.info("Uploading file...");
  this.uploadService.uploadFile(this.file, this.identifier).subscribe(data => {
    this.toastr.success("File has been uploaded.", "Success");
  }, err => {
    this.toastr.error("The file could not be uploaded.", "Error");
  });
}

getFileName(): string {
  return this.file ? this.file.name : 'Upload File';
}

The service method is the one that prepares the file to be sent to the Node.js service, as follows:

uploadFile(file: File, id: string): Observable<any> {
  const formData: FormData = new FormData();
  // Use backticks so the template literal is actually interpolated;
  // the file ends up stored under a "<id>/<file name>" key.
  formData.append("file", file, `${id}/${file.name}`);
  return this.httpClient.post(`${environment.apiEndPoint}/admin/upload/${id}`, formData);
}

Node.js Service

Having configured all the required parts in the front-end code, we need to adapt our Node.js service to receive the file. The service uses Express to expose the REST API, and a package called formidable to easily process the form data sent from the Angular application. As in the Web site section, I'm not focusing on how to set up the Node service, but rather on the exact code to process the file upload.
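
Since project setup is out of scope, here's just a minimal sketch of how the Express app might be wired up, assuming the route handler shown below lives in a local routes/upload.js module (the file path and port are illustrative, not part of the original project):

// app.js - minimal Express bootstrap (illustrative layout).
var express = require('express');
var app = express();

// The router defined in the upload snippet below; the path is an assumption.
var uploadRouter = require('./routes/upload');
app.use('/', uploadRouter);

app.listen(3000, function () {
  console.log('Service listening on port 3000');
});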

Before digging into the code, I'll explain a little bit about what formidable does. In short, formidable parses the content of the form sent in the request and saves it to a local temporary location; from there, we can grab the file and do any logic we want with it.

The Express endpoint code looks like this:

var express = require('express');
var router = express.Router();
var IncomingForm = require('formidable').IncomingForm;
var fs = require('fs');
// The S3Uploader helper is shown later in this post; here I'm assuming it is
// exported from a local module.
var S3Uploader = require('./s3-uploader');

router.post('/admin/upload/:id', function (req, res) {
    var id = req.params.id; // the id is also embedded in the file name sent from Angular
    var s3Uploader = new S3Uploader(req);
    var form = new IncomingForm();
    var fileName = "";
    var buffer = null;
    // Formidable saves each uploaded file to a temporary location; grab its
    // name (which already includes the id prefix added by the Angular service)
    // and read it into a buffer.
    form.on('file', (field, file) => {
        fileName = file.name;
        buffer = fs.readFileSync(file.path);
    });
    // Once the whole form has been processed, send the buffer to S3.
    form.on('end', () => {
        s3Uploader.uploadFile(fileName, buffer).then(fileData => {
          res.json({
            successful: true,
            fileData
          });
        }).catch(err => {
            console.log(err);
            res.sendStatus(500);
        });
    });
    form.parse(req);
});

module.exports = router;

Before moving on to the part that uploads the file to S3, let's go over what we are doing here. After importing the necessary dependencies, inside the request handler we do several things:

  1. Creating an instance of an "S3Uploader" helper to send the files to S3.
  2. Configuring the "IncomingForm" instance from formidable.
    1. Define an event handler when a file is processed by formidable that retrieves the file name and creates a buffer that we will send to the S3 service.
    2. Define an event handler when the form has been processed to call the upload file method in the S3 helper.
  3. Calling the parse method from Formidable to start the whole process.
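
One thing the snippet above does not cover is a failure while parsing the form itself. formidable also emits an 'error' event, so a hedged sketch of handling that case could look like this:

// Inside the same request handler, before calling form.parse(req):
// if formidable fails to parse the incoming form, reply with a 400
// instead of leaving the request hanging.
form.on('error', (err) => {
    console.log(err);
    res.sendStatus(400);
});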

The "S3Uploader" object has the following code:

var AWS = require('aws-sdk');

function S3Uploader(request) {
  // The Cognito JWT is sent by the Angular app in a custom header.
  var jwtToken = request ? request.headers.cognitoauthorization : null;
  let credentials = {
    IdentityPoolId: "<IDENTITY POOL ID>",
    Logins: {}
  };
  credentials.Logins['cognito-idp.<COGNITO REGION>.amazonaws.com/<USER POOL ID>'] = jwtToken;

  AWS.config.update({
    credentials: new AWS.CognitoIdentityCredentials(credentials, {
      region: "<COGNITO REGION>"
    }),
    region: "<S3 BUCKET REGION>"
  });

  let s3 = new AWS.S3();

  function uploadFile(key, file) {
    var s3Config = {
      Bucket: "<BUCKET NAME>",
      Key: key,
      Body: file
    };
    return new Promise((resolve, reject) => {
      s3.putObject(s3Config, (err, resp) => {
        if (err) {
          console.log(err);
          reject({success: false, data: err});
          return;
        }
        resolve({success: true, data: resp});
      });
    });
  }

  // Expose the upload method so the endpoint can call s3Uploader.uploadFile(...).
  this.uploadFile = uploadFile;
}

module.exports = S3Uploader;

If the first part, about configuring the AWS SDK to use the proper credentials, is not clear, I invite you to read my post on how to manage credentials properly using Cognito, or even an older post where I explain how to use Cognito and Federated Identities to create users with roles that can access AWS resources.

In short, we retrieve the authentication token generated by Cognito when the user logs in so that we can configure the AWS SDK to use that user's permissions.
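
For context, here's a minimal sketch of how the Angular upload service might attach that token to the request; getJwtToken() and the auth service are assumptions, but the cognitoauthorization header name matches what the Node.js code reads above:

// Hypothetical variation of the upload service method that attaches the
// Cognito JWT in the header the Node.js service expects.
uploadFile(file: File, id: string): Observable<any> {
  const formData: FormData = new FormData();
  formData.append("file", file, `${id}/${file.name}`);
  // this.authService.getJwtToken() is an assumed helper returning the
  // token issued by Cognito after the user logs in.
  const headers = new HttpHeaders({
    cognitoauthorization: this.authService.getJwtToken()
  });
  return this.httpClient.post(`${environment.apiEndPoint}/admin/upload/${id}`, formData, { headers });
}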

After all that, we just need to instantiate an object to use the S3 APIs and send the data to the bucket.

If you have any comments, don't hesitate to contact me or leave a comment below. And remember to follow me on @cannyengineer to stay updated on every new post.