This is a follow-on from my previous post on setting up Node.js with TypeScript. In this tutorial, we'll take things a bit further and mix in a bit of fun with the Google Cloud Platform for good measure. You can get the code for the previous tutorial at this GitHub repo.

What we will build

  1. Create a web application that serves an HTML form that allows users to upload a profile with a document.
  2. Structure our application using a feature based architecture.
  3. Save these files to Google Cloud Storage.
  4. Save the resulting URL and form data to MongoDB using mLab.
  5. Create a cloud-first web application, one that delegates things like data and file storage to other services.

Setting up the required accounts

Google Cloud Storage

As we are building our application in a cloud-first manner, it will only have responsibility for handling web requests. That means we need somewhere to store the files we will upload, which is why we are going to use Google Cloud Storage.

So, by following the links below, we need to:

  1. Create a Google Cloud Platform account
  2. Create a Storage bucket (the container for our files)
  3. Set up permissions, choosing to create service account credentials and generating a JSON key

We will also need the following environment variables set:

  1. GCS_BUCKET, the name of the bucket to save to.
  2. GCLOUD_PROJECT, this is your projectId. It can be found in the JSON credentials that you generated.
  3. GCS_KEYFILE, this is the path to the JSON key that you generated.

These are required for the multer-google-storage npm package that we will use to upload our file to Google Cloud Storage.
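As a sketch, on macOS/Linux these could be exported as follows (all three values here are placeholders, substitute your own):

```shell
# Placeholder values: substitute your own bucket name, project id and key path
export GCS_BUCKET="my-upload-bucket"
export GCLOUD_PROJECT="my-gcp-project-id"
export GCS_KEYFILE="/path/to/service-account-key.json"
```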

mLab

To avoid having to set up MongoDB locally, we are going to use the database-as-a-service provider mLab.

By going to the docs located here you will need to do the following:

  1. Create an account with mLab
  2. Create a new database (on Google's Cloud Platform)
  3. Create a user on that database
  4. Take note of the connection string to the database, e.g. mongodb://<dbuser>:<dbpassword>

We will also need to make sure that we have the following environment variable set:

  1. MONGO_CONNECTION //the connection string with username and password
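As with the Google Cloud variables, this could be exported as follows; the user, password, host, port and database name here are placeholders for whatever mLab gives you:

```shell
# Placeholder connection string: substitute the values from your mLab database
export MONGO_CONNECTION="mongodb://dbuser:dbpassword@ds012345.mlab.com:12345/mydb"
```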

Creating the folder structure

As we are going to build more than a one-file application, we will organise our source files in the feature-based folder structure shown below. The folders are relative to the root of our project.

├── src
|  ├── Profiles
|  |  ├── Controllers
|  |  |  └── profiles.controller.ts
|  |  ├── Models
|  |  |  └── profiles.models.ts
|  |  ├── Routes
|  |  |  └── profiles.routes.ts
|  |  ├── Services
|  |  |  └── upload.service.ts
|  |  └── Views
|  |     └── index.handlebars
|  ├── config
|  |  └── config.ts
|  └── index.ts

Why feature based?

So instead of architecting our application using a "vertical" model of separation (i.e. bundling all route handlers/controllers, business logic, and data access together), we aim for a "horizontal" split where the code that represents a piece of functionality is grouped together.

This has the following results:

  1. The structure of our application matches business requirements
  2. File paths for our imports to other files are easier to maintain
  3. This structure allows us to place tests next to the code files they test (without things getting crazy in terms of resolving our module imports).

Updating our compiler and build settings

As our app will be under the "src" folder we need to nominate an output directory for our built web app. This is done by updating our tsconfig.json:

"outDir": "./dist"

running the following command line:

npm install rimraf --save-dev

and also updating our build scripts in the package.json:

    "start": "npm run compile && node ./dist/index.js",
    "compile": "npm run clean && tsc",
    "debug": "npm run compile && node --inspect-brk ./dist/index.js",
    "clean": "rimraf ./dist",

We are using rimraf to delete our previous builds every time we compile our application. This does the equivalent of the "rm -rf" command in Linux to recursively delete our files, but works cross-platform. The previous post goes into more detail on the workings of npm scripts.

Integrating MongoDB

So now we have our basic structure in place, the first thing we are going to do is ensure we can connect to our database.

First, we need to get a couple of packages by running the following commands:

npm install mongoose chalk --save


npm install @types/mongoose @types/chalk --save-dev

Chalk is just a nice utility that allows us to style our calls to "console.log" (useful for differentiating feedback from errors), and "mongoose" offers us a really nice way of interacting with MongoDB.

It is an object document mapper so it allows us to define the schema of an object and work with instances of those objects in our code. It is an implementation of the ActiveRecord pattern and it ensures that instances of our objects have methods to save and update records. For times where we do not have an instance of an object and wish to retrieve objects from our database, the record types that we define expose methods for retrieval from the underlying MongoDB collections. Mongoose is also a pretty good fit for TypeScript's type system.

First, in the file "profiles.models.ts" we need to add:

import * as mongoose from 'mongoose';

export interface Profile extends mongoose.Document {
    title: string;
    description: string;
    fileName: string;
}

const profileSchema = new mongoose.Schema({
    title: String,
    description: String,
    fileName: String
});

export const ProfileRecord = mongoose.model<Profile>('Profile', profileSchema);

As well as defining our mongoose schema which mongoose requires, we have created an interface that represents the instances that we will be working with in our code. So the interface is for us (our code) and the schema declaration is how mongoose will map our object to the underlying database.
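To illustrate the split, here is a standalone sketch (with a simplified, redeclared Profile interface rather than the one from profiles.models.ts) showing what the interface buys us at compile time:

```typescript
// A simplified stand-in for the Profile interface, for illustration only
interface Profile {
    title: string;
    description: string;
    fileName: string;
}

// The compiler checks that we only read and write declared properties
const p: Profile = { title: 'CV', description: 'My CV', fileName: 'cv.pdf' };
console.log(p.fileName);      // OK
// console.log(p.filename);   // compile-time error: no property 'filename'
```

The schema, by contrast, exists at runtime; mongoose uses it to map these properties onto the underlying collection.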

Now we need to write the code that deals with the plumbing required to ensure that our app can connect to our database. So in our config.ts we need to add:

import * as mongoose from 'mongoose';
import * as chalk from 'chalk';

const connString = process.env.MONGO_CONNECTION || '';

export const dbConfig: mongoose.ConnectionOptions = {
    useMongoClient: true
};

export async function connectDb(appStart: () => void) {
    //as we should be using node > 7 we should have native promises
    (<any>mongoose).Promise = global.Promise;

    try {
        //try to connect using our configuration
        const db = await mongoose.connect(connString, dbConfig);
        //so we can see what is running as we develop
        mongoose.set('debug', true);
        //if we get this far launch the app
        if (appStart) appStart();
    } catch (error) {
        //using chalk to give any errors a foreboding red color
        console.error(chalk.red('Could not connect to MongoDB!'), error);
    }
}

NB: You need to make sure you set up the MONGO_CONNECTION environment variable as we are reading from the global Node "process" object. If you do not want to do this (though it is best practice to do so) then you can assign a string value to the "connString" variable.

Here we are:

  1. Setting mongoose to use the ES6 promise internally
  2. Connecting to the database using our connection string and the required options
  3. Setting mongoose to output db activity to the console
  4. Calling a function that we are about to write so as to actually start our Express app
  5. Logging out any errors using the "chalk" library so that they stand out.

If you see a red squiggly line

Depending on when you are doing this, you may get feedback from your editor warning you of missing type information. If you do, see the following post on how to polyfill missing types in TypeScript.

Setting up our view engine

So the first thing we need to do is setup Express to be able to receive web requests. We will need to add a view engine at this point so that we can serve our user interface. Assuming you are following on from the previous tutorial you should already have Express installed so we now need to get our view engine (handlebars) by running the following command line:

npm install express-handlebars --save


npm install @types/express-handlebars --save-dev

We now need to update our index.ts, which will be the entry point of our application.

import * as express from 'express';
import { connectDb } from './config/config';
import * as chalk from 'chalk';
import * as exphbs from 'express-handlebars';
import profiles from './Profiles/Routes/profiles.routes';

//declare our start up logic
const appStart = () => {
    const app: express.Express = express();
    //setup our features (of which there is one)
    const profilesViews = profiles(app);
    //setup our view engine
    app.engine('handlebars', exphbs({ defaultLayout: 'main' }));
    app.set('view engine', 'handlebars');
    app.set('views', profilesViews);
    //start the app
    app.listen('3002', () => console.log('Server listening on port 3002'));
};

//connect to the db then start our app
connectDb(appStart);

So first we declare our startup logic which:

  1. Creates our app
  2. Passes our app to our features (which will register routes)
  3. Sets up a view engine
  4. Starts our app on port '3002'

Then we pass this to the function that connects to the database. Our app start function will run once we are successfully connected to the database.

NB: We have yet to create the required module in the "./Profiles/Routes/profiles.routes" file. Consequently, you will get a red squiggly warning you about this and will not be able to compile at this point.

Creating our front end

So now we need to serve up a form to our users.

In our index.handlebars file we should enter

<!doctype html>
<html lang="en">
<head>
    <meta charset="utf-8">
    <title>Create Application</title>
    <meta name="description" content="File Upload">
    <link rel="stylesheet" href="">
</head>
<body>
    <div class="container-fluid">
        <div class="row">
            <div class="col-sm-6 well">
                <form action="/upload" method="post" enctype="multipart/form-data">
                    <div class="form-group">
                        <label for="title">Title</label>
                        <input type="text" name="title" class="form-control" id="title" placeholder="title">
                    </div>
                    <div class="form-group">
                        <label for="description">Description</label>
                        <textarea class="form-control" name="description" id="description" rows="3"></textarea>
                    </div>
                    <div class="form-group">
                        <label for="image">File input</label>
                        <input type="file" name="image">
                    </div>
                    <button type="submit" class="btn btn-default">Submit</button>
                </form>
            </div>
        </div>
    </div>
</body>
</html>

This is a basic HTML form with a link to the Bootstrap CDN. One thing to note is the names of the form inputs, in particular the name attribute of our file input ("image").
Then we need to add the following to our profiles.controller.ts:

import { Router, RequestHandler, Response, Request, Express } from 'express';
import * as path from 'path';
import { ProfileRecord, Profile } from '../Models/profiles.models';

export const createProfile = (req: Request, res: Response) => {
  res.render('index', { layout: false, title: 'Please upload your application' });
};

This allows us to handle a given request by returning our index view as the response.

So we have our markup as well as the logic to return it to the user; now what we need to do is write the code that maps a URL in our application to the code in our profiles.controller.ts.

In order to do this, we need to add the following to our profiles.routes.ts:

import { Router, Express } from 'express';
import { createProfile }  from '../Controllers/profiles.controller';
import * as path from 'path';

const router = Router();

router.get('/', createProfile);

const profiles = (app: Express) => {
  app.use('/', router);
  //return the location of this feature's views
  return path.resolve(__dirname, '../Views');
};

export default profiles;

Here we are using the Express Router class to register our routes (one at the moment) and registering all routes exported from this feature to the route '/'. So at the moment, the '/' route in our app is going to send the 'index' view in the views folder of our Profiles feature.

The last thing to note is the default export, which is imported by our index.ts file. In that file, if you remember:

const profilesViews = profiles(app);
//setup our view engine
app.engine('handlebars', exphbs({defaultLayout: 'main'}));
app.set('view engine', 'handlebars');
app.set('views', profilesViews);

we register our views. In the function exported from profiles.routes.ts, we link this feature's router object to the main app, compute the location of the views in the Profiles feature, and return that location. This return value is what gets registered in our appStart function.

Updating our build process

Currently, our build process only compiles TypeScript files to JavaScript.

As we will have view files, we need to make sure that these are copied to the correct location.

First, we need to install the following:

npm install copyfiles --save-dev

This will give us a cross platform way of copying files. Then we need to update our "compile" npm script to have the following:

"compile": "npm run clean && tsc && copyfiles -u 1 ./src/**/*.handlebars ./dist",

This will copy all files with the "handlebars" extension to our dist folder. The "-u 1" flag strips the first path segment ("src"), so, for example, src/Profiles/Views/index.handlebars ends up at dist/Profiles/Views/index.handlebars.

Integrating Multer and Google Cloud Storage

Now that we have created the route to serve our users the upload form, we need to give them the ability to upload files.

We are going to use a wrapper around the "multer" library, which is a great way to handle file uploads in Express: it will handle the files in our form posts and automatically upload them to Google Cloud Storage (as long as you have created the required accounts, credentials, and environment variables).

First, we need to run the following command line:

npm install multer-google-storage --save


npm install @types/multer --save-dev

Then, we need to update our profiles.controller.ts to:

import { Router, RequestHandler, Response, Request, Express } from 'express';
import * as path from 'path';
import { ProfileRecord, Profile } from '../Models/profiles.models';

export const createProfile = (req: Request, res: Response) => {
  res.render('index', { layout: false, title: 'Please upload your application' });
};

export const uploadProfile = async (req: Request, res: Response) => {
    //the path of the file that has just been uploaded to Google Cloud Storage
    const fileName = req.file.path;
    const { title, description } = req.body as Profile;

    const newProfile: Profile = Object.assign(new ProfileRecord(), { fileName, title, description });

    //save the record and return it to the user
    const savedProfile = await newProfile.save();
    res.json(savedProfile);
};

Our "uploadProfile" method reads the path of the file that has just been uploaded, creates a new instance of a Profile record, and combines the form values (title and description) with the path of our file in Google Cloud Storage.

Note that we destructure only the properties we want to save from the request body. This technique helps us avoid mass assignment, though that topic is beyond the scope of this tutorial. For the number of properties we are using, the current approach is fine.
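To see why, here is a minimal sketch of the mass-assignment risk (the "isAdmin" field is hypothetical, purely to illustrate the point):

```typescript
// Suppose a malicious client posts an extra field we never asked for
const body = { title: 'CV', description: 'My CV', isAdmin: true } as any;

// Mass assignment: everything in the body ends up on our object
const unsafe = Object.assign({}, body);

// Destructuring whitelists only the fields we expect
const { title, description } = body;
const safe = { title, description };

console.log('isAdmin' in unsafe); // true: the extra field slipped through
console.log('isAdmin' in safe);   // false: only title and description survive
```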

We then call our Profile record's save function, which will save our record to the collection called Profiles in the underlying database, and then we return the saved record to our user.

Whilst we are here, we should also add the following function:

export const viewProfiles = async (req: Request, res: Response) => {
  const profiles = await ProfileRecord.find({});
  //return all the records to the user
  res.json(profiles);
};
This will allow users to view all uploaded Profile records.

Adding our file handler and routes

So in the last section, we wrote logic to save the location of our uploaded file and you might have been wondering how this happens. Well, fear not as we are about to cover this now!

In our upload.service.ts we need to create the following:

import * as multer from 'multer';
import MulterGoogleCloudStorage from 'multer-google-storage';

const createUploadHandler = multer({
  storage: new MulterGoogleCloudStorage()
});

export { createUploadHandler };

Here we are configuring "multer" to use the "MulterGoogleCloudStorage" engine to upload our files. This package will read the environment variables that we set earlier and use them to provide our credentials to Google's authentication mechanism. The function we are exporting will now be used in our profiles.routes.ts.

So we need to update profiles.routes.ts to have the following imports:

import { Router, Express } from 'express';
import { createProfile, uploadProfile, viewProfiles }  from '../Controllers/profiles.controller';
import { createUploadHandler } from '../Services/upload.service';
import * as path from 'path';

and the following routes:

router.post('/upload', createUploadHandler.single('image'), uploadProfile);

router.get('/profiles', viewProfiles);

By placing the request handler returned by "createUploadHandler.single('image')" in front of our "uploadProfile" function, we are telling Express to use it to handle HTTP POST requests to the URL "/upload" first and then to pass the request on to the next function.

The methods for handling HTTP requests on the router object are variadic functions. This means that they take a variable number of arguments: the first argument is the part of the URL that we want to handle, and the subsequent arguments are request handler functions. Each handler receives a reference to the current request and response, as well as a "next" function that, when invoked, causes the next request handler to run. This means request handlers can:

  1. Read information about the current request
  2. Do something based on the request.
  3. Then either form a response based on this, or invoke the next function that will cause the next request handler to be invoked.
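The chain described above can be sketched with simplified local types (this is an illustration of the idea, not the real Express implementation):

```typescript
// Simplified stand-in for Express's request handler signature
type Handler = (req: any, res: any, next: () => void) => void;

// Run handlers in order; each one decides whether to call next()
const runHandlers = (handlers: Handler[], req: any, res: any) => {
    let i = 0;
    const next = () => {
        const handler = handlers[i++];
        if (handler) handler(req, res, next);
    };
    next();
};

// Two toy handlers: the first annotates the request (much as multer
// populates req.file), the second forms a response based on it
const attachFile: Handler = (req, _res, next) => {
    req.file = { path: 'gs://bucket/cv.pdf' };
    next();
};
const respond: Handler = (req, res, _next) => {
    res.body = `saved ${req.file.path}`;
};

const res: any = {};
runHandlers([attachFile, respond], {}, res);
console.log(res.body); // saved gs://bucket/cv.pdf
```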

By passing a parameter with a value of "image" to our call to the "single" method, we are creating a request handler that will:

  1. Look for a file uploaded from a form element (or HTML5 FormData) called "image" (we set this in our form earlier)
  2. Take this file and upload it to Google Cloud Storage
  3. Ensure that the location of the file is available in the "req" parameter of any subsequent request handler function that is invoked (accessible via the "req.file.path" property)
  4. Invoke the next function to ensure that the next request handler (our "uploadProfile" function) is run.

The second route that we are registering simply wires up the functionality to read all Profiles in the database to the URL '/profiles' via a GET request.

Now we should run our app using:

npm start

And voila! We should now be able to upload data from our Profile form.

Final thoughts

So we have now created an Express.js web app with a Profile upload form, using a feature-based architecture. It uses external cloud-based file storage (though, given the way we structured our modules, it would be easy to swap in another cloud provider or other means of storage), and we used another service to quickly get up and running with MongoDB.

We also used "mongoose" to work with our models using TypeScript to gain the benefits of strong typing as well as the ability to reliably use some of the latest features in JavaScript in our code.