Using docker with NodeJS for local development

Mayank Dubey · Sep 11, 2019

Setting up a project on different machines can be tedious: it takes time, and version mismatches often creep in. New team members end up spending hours tuning their environment before they can start contributing. Wouldn't it be great if you and your team members could work on the same code base without worrying about everyone's system configuration?

This is where Docker can help. Docker lets you run the same environment locally on every machine, and with a few tweaks the same setup can also be used for production. In this blog, we are going to create a NodeJS web app with PostgreSQL using Docker.

What is Docker?

Docker is a tool used to create, deploy, and run applications using containers. Containers let us package our application together with all the libraries and environment dependencies it needs. An application shipped as a container will run on any other Linux machine, regardless of custom settings that might differ from the machine used to write and test the code.

So, you can share your application with anyone and they will be able to run and work on your application without much effort.

Dockerfile, Docker Image & Containers

Dockerfile, Docker Images & Docker Containers are three important terms that you need to understand while using Docker.

Dockerfile: A Dockerfile is a text file containing the set of commands used to build an image. These are the same commands you could run on the command line to build an image yourself.

Docker Image: Docker Images can be thought of as templates used to create a container. Docker Images can be shared through the docker registry. Docker Hub is a public registry which allows multiple users to use and collaborate on the images.

Docker Container: In simple terms, a docker container is a running instance of an image.
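
To make these three terms concrete, here is a rough sketch of the usual Docker CLI flow (the image name my-app is only a placeholder):

# Build an image from the Dockerfile in the current directory
docker build -t my-app .

# Start a container from that image, mapping port 3000 on the host to port 3000 in the container
docker run -p 3000:3000 my-app

# List running containers
docker ps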

Docker Compose

Docker Compose is a tool that lets us easily define and run multiple containers. You write a YAML file known as a compose file (docker-compose.yml) which contains details about the services, networks, and volumes that make up the Docker application. You can create separate containers, run them, and have them communicate with each other.

Let's get a quick overview of services, networks, and volumes:

Service: A service describes one part of a larger application, defined by the image it runs. When you create a service, you specify which container image to use and which command to execute inside the running container. Examples of services include an HTTP server, a database, or any other executable program you wish to run in a distributed environment.

Network: A Docker network allows the services to communicate with each other or with other non-Docker workloads. We are using a network to connect the server container with the database container.

Volumes: Volumes are directories (or files) that live outside the Docker filesystem and exist as normal directories (or files) on the host filesystem (your machine). This lets us bind our working directory to the server container.

Let's create a NodeJS app using Docker

We will create our node app the usual way, with npm:

npm init

This will prompt you for several things, such as the name and version of your application. You can accept the defaults for most of them, except the entry point; for the entry point, enter app.js.

entry point: (index.js) app.js

This will create a package.json file which will look like this:

{
  "name": "app-name",
  "version": "1.0.0",
  "description": "",
  "main": "app.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "author": "",
  "license": "ISC"
}

Now install express:

npm install express --save

Add the following code to the app.js file:

const express = require('express');

const app = express();

app.get('/', (req, res) => {
  res.send(
    `<h1>Docker is fun!</h1>`
  )
})

app.use((err, req, res, next) => {
  if (err) {
    console.error(err);
    res
      .status(err.statusCode || err.status || 500)
      .send(err || {});
  } else {
    next();
  }
});

const server = app.listen(3000, () => {
  console.log('App listening on port', server.address().port);
});

Add a start script to package.json:

...
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1",
    "start": "node app.js"
  },
...

To have this run on your local machine, run the following command:

npm start

The application will start at localhost:3000.
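
If you want a quick check from the command line, you can hit the endpoint with curl (or just open localhost:3000 in a browser):

curl http://localhost:3000
# <h1>Docker is fun!</h1>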

Now that our Node application is up and running, we can bring in Docker. We will create services for the Node.js app and the database.

Create a Dockerfile and place the following code in it:

# Official docker image for Node.js
FROM node:10

# Create app directory
WORKDIR /app

# Install app dependencies
# A wildcard is used to ensure both
# package.json AND package-lock.json are copied
# where available (npm@5+)
COPY package*.json ./

RUN npm install
RUN npm install -g nodemon

# Bundle app source
COPY . .

EXPOSE 3000
CMD [ "nodemon" ]

Now we will create a compose file, local.yml, with details about our services, in this case web and db.

version: '3'

services:
  web:
    build:
      context: .
      dockerfile: ./Dockerfile
    ports:
      - 3000:3000
    volumes:
      - .:/app
    networks:
      - app-network
    depends_on:
      - db

  db:
    image: library/postgres:11.3-alpine
    restart: unless-stopped
    ports:
      - 10000:5432
    networks:
      - app-network
    environment:
      - POSTGRES_USER=db_user
      - POSTGRES_PASSWORD=db_password

networks:
  app-network:
    driver: bridge

Let's understand what is going on in this file:

  • We defined two services, web and db. The web service uses a Dockerfile to build its image, whereas the db service uses the postgres image directly with the configuration provided.
  • The containers use a network named app-network to communicate with each other. Because they share this network, the web app can reach the database using the service name db as the hostname (see the connection sketch after this list).
  • As we will use these containers for local development, any change in the local directory should be reflected inside the container. To achieve this we mount a volume that maps the local directory to the working directory in the container. (Because we already ran npm install locally, node_modules exists on the host too; otherwise the bind mount would hide the dependencies installed in the image.)
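
As a rough sketch of what that connection could look like from the web service, here is a minimal db.js (the file name is only illustrative, and it assumes you add the pg package with npm install pg):

// db.js - connect to the db service over app-network
const { Pool } = require('pg');

const pool = new Pool({
  host: 'db',            // the compose service name doubles as the hostname inside app-network
  port: 5432,            // Postgres port inside the network (10000 is only the host-side mapping)
  user: 'db_user',
  password: 'db_password',
  database: 'db_user',   // the official postgres image defaults POSTGRES_DB to POSTGRES_USER
});

// Quick check that the web container can reach the database
pool.query('SELECT NOW()')
  .then((result) => console.log('Connected to Postgres at', result.rows[0].now))
  .catch((err) => console.error('Database connection failed', err));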

Now that we have a compose file, run this command to start our app:

docker-compose -f local.yml up

This command will build the images and then run them.
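
A couple of compose commands that come in handy during day-to-day development (run them from the project root):

# Stop and remove the containers and the network
docker-compose -f local.yml down

# Rebuild the web image after changing the Dockerfile or dependencies
docker-compose -f local.yml up --build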

That's it! Now you can share your project with anyone, and all they have to do is run the above command and start coding.
