Node.js, MongoDB, Elasticsearch and PostgreSQL: The Compose Grand Tour


Welcome to the Node.js segment of the Compose Grand Tour. If you're just joining the series, we're on a trip to visit all the databases of Compose with as many languages and drivers as possible.

More importantly, we're doing this with a functional, if somewhat abstract, little web application which provides the same code base for all the example connections. This is so you, the reader, can easily see what's the web application and what's the driver rubber hitting the database road. It's an application which lets you enter a word and a definition; it stores the pair and then updates the display of the current list of stored values.

The examples use a JavaScript front end which is shared between all the Grand Tour examples. For each language, the code is essentially a web server which serves up static web pages and responds to requests to PUT a word and definition into the store and GET the current words and definitions out as a JSON object. All super simple, and to make it easier to see where the database code is, we isolate those webby actions out into their own functions.

So, let's quickly show you the example we've built for Node.js. It's an Express framework application which, as well as serving up static pages, has addWord and getWords functions. These functions are called by routes to "/words" for the GET and PUT methods. The two functions return promises so that the web-facing code can handle their execution more predictably. If you look at the code for, say, GET /words, you'll see the function getWords() being called, with a .then() handling successful execution and a .catch() handling errors:

app.get("/words", function(request, response) {  
  getWords()
    .then(words => {
      response.send(words);
    })
    .catch(err => {
      console.log(err);
      response.status(500).send(err);
    });
});
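The PUT route is the mirror image of that. Here's a sketch of it, assuming the Express body parsing middleware has already populated request.body (the repository's version may differ in detail):

app.put("/words", function(request, response) {
  addWord(request.body.word, request.body.definition)
    .then(resp => {
      response.send(resp);
    })
    .catch(err => {
      console.log(err);
      response.status(500).send(err);
    });
});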

You'll find this and the rest of the code in the example repository in compose-grandtour/node.

Now, let's get our engines running and see that example connected to MongoDB, Elasticsearch, and PostgreSQL.

MongoDB

The MongoDB example sits in the example-mongodb directory. As with all the Node.js examples, download the code, cd into the directory, and run npm install to get the library packages in place.

The MongoDB Driver

This example uses the official MongoDB Node.js driver. The latest documentation for that driver can be found here. This is the most widely used driver, and other drivers tend to use it for all the low-level connectivity. Loading the client library in the code is basically:

const MongoClient = require("mongodb").MongoClient;  

The MongoDB Connection

The preamble to the connection involves us pulling two environment variables: COMPOSE_MONGODB_URL, a connection string for the database, and PATH_TO_MONGODB_CERT, the path to a self-signed certificate for the database.

let connectionString = process.env.COMPOSE_MONGODB_URL;  
let connectionCertPath = process.env.PATH_TO_MONGODB_CERT;  
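The code bails out early if the connection string isn't set. A minimal sketch of that guard (the exact message here is our own):

if (connectionString === undefined) {
  console.error("Please set the COMPOSE_MONGODB_URL environment variable");
  process.exit(1);
}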

With the connection string checked, we branch on the presence of a setting for PATH_TO_MONGODB_CERT. If it's set, we add in the options needed to create a TLS-secured connection: reading in that certificate, setting the sslCA option, and turning on ssl and sslValidate:

let options = {};

if (connectionCertPath) {
  let ca = [fs.readFileSync(connectionCertPath)];
  options = {
    ssl: true,
    sslValidate: true,
    sslCA: ca
  };
}

The actual connection is at the end of the code. To ensure the database is connected before we start the local web server, we wrap the app.listen() call in the then of the connection to the database:

MongoClient.connect(connectionString, options)  
  .then(db => {
    mygrandtour = db.db("grand_tour");
    app.listen(port, function() {
      console.log("Server is listening on port " + port);
    });
  })
  .catch(err => {
    console.error(err);
  });

We call connect(), which returns a promise. That promise resolves with a connected client; we use its db() method to get a handle on the grand_tour database, copying that to a global variable, mygrandtour. Once we have that, we call app.listen() and the web server is running.

The MongoDB Read

Recall we said the getWords() function returns a promise. Well, here we make full use of that:

function getWords() {  
  return mygrandtour
    .collection("words")
    .find()
    .toArray();
}

It's a one-liner really, but we use Prettier to make it more readable. OK: we start with mygrandtour, our database handle, then we call .collection("words") to get a handle on the collection we want to query. Next we call the collection method .find() with no parameters - we just want all the data. find() returns a cursor, with which you'd normally traverse the results.

We just want to grab all the data, so we'll use toArray(), a cursor method which gathers all the results into an array and returns a promise that resolves with that array when done. So all we have to do is return toArray()'s promise.
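If you did want to walk the cursor rather than buffer everything, the cursor has its own forEach() method, taking an iterator and an end callback. A quick sketch for contrast - this isn't part of the example app:

mygrandtour
  .collection("words")
  .find()
  .forEach(
    doc => {
      // Called once per document
      console.log(doc.word + ": " + doc.definition);
    },
    err => {
      // Called when the cursor is exhausted, or on error
      if (err) console.error(err);
    }
  );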

The MongoDB Write

Adding a word is even simpler. There's no cursor involved this time:

function addWord(word, definition) {  
  return mygrandtour.collection("words").insertOne({
    word: word,
    definition: definition
  });
}

As with the read, mygrandtour.collection("words") gets us to the collection. The collection method insertOne() is what we want to call on the collection; this inserts one document into the collection. We define that document with a simple JSON/JavaScript object. The insertOne() method returns a promise which, if successful, resolves with a result that includes the number of documents inserted, so we return that promise and leave the caller to handle the outcome.
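To see what that resolved value carries, you could log it - a hypothetical snippet, not in the repository:

addWord("serendipity", "finding something good without looking for it")
  .then(result => {
    // insertedCount should be 1 on success
    console.log(result.insertedCount + " document inserted");
  })
  .catch(err => {
    console.error(err);
  });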

We've connected, read and written to MongoDB from Node.js now, so let's move on. Next up, Elasticsearch.

Elasticsearch

The Elasticsearch example sits in the example-elasticsearch directory.

The Elasticsearch Driver

For this example, we're going with the official JavaScript/Node.js driver, elasticsearch-js. It's a one-to-one mapping to the low-level REST API of Elasticsearch, and you'll find all the official documentation, including the API by Elasticsearch version, on the Elastic elasticsearch.js site. Getting it loaded is simply:

let elasticsearch = require("elasticsearch");  

The Elasticsearch Connection

There's a client object to create for Elasticsearch. With Elasticsearch being one of the Let's Encrypt-secured Compose databases, there's no need for a certificate; all you need is a URL. We'll get that from the COMPOSE_ELASTICSEARCH_URL environment variable.

let connectionString = process.env.COMPOSE_ELASTICSEARCH_URL;  

We can actually put a comma-separated list of URLs in this environment variable. The client doesn't handle lists like that so we'll split it up into an array of hosts before creating the client.

let hostList = connectionString.split(",");

let client = new elasticsearch.Client({  
  hosts: hostList
});

Now we have the client, it's important to realize it has been created passively; it doesn't touch the server until the first call is made on it. We don't want a bad connection surfacing lazily like that, so, if we skip to the end of the example, you'll see we test the connection up front.

client  
  .ping()
  .then(resp => {
    // Check for an existing index
    checkIndices();
    // Listen for a connection.
    app.listen(port, function() {
      console.log("Server is listening on port " + port);
    });
  })
  .catch(err => {
    console.log("Elasticsearch server not responding");
    process.exit(1);
  });

We use the client's ping() method to do a no-op round trip to the server. If we are successful, we check that the index we need exists and start our local web server. If not, an error is caught and we exit. Let's have a quick look at that index check.

function checkIndices() {  
  client.indices
    .exists({
      index: "grand_tour"
    })

It starts off by calling the indices.exists() method, checking for a "grand_tour" index. This returns a promise which we'll handle with a .then():

    .then(exists => {
      if (exists === false) {

If the index doesn't exist, we go to create it.

        client.indices
          .create({
            index: "grand_tour",
            body: {
              mappings: {
                words: {
                  properties: {
                    word: { type: "text" },
                    definition: { type: "text" },
                    added: { type: "date" }
                  }
                }
              }
            }
          })

We're creating an index called "grand_tour", which has a mapping called "words", and that has three properties: word, definition, and added, the last being a date. Again this returns a promise, but we're mostly interested in errors from this and its surrounding then() clause, so:

          .catch(err => {
            console.error(err);
          });
      }
    })
    .catch(err => {
      console.error("Problem checking indices exist");
    });
}

Sometimes the preparation takes more code than the operating code. Let's move on to writing to this, potentially new, index.

The Elasticsearch Write

Our addWord function is going to return a promise, and the client.index() method provides one that resolves when the new data has been added. There's a small difference between this and the other examples: we have a date field, added, which we populate with the current time. We call client.index() specifying the index and type we want to store this word and definition against. There's also the body parameter, a JSON object with our values, including the date.

function addWord(word, definition) {  
  let now = new Date();
  return client.index({
    index: "grand_tour",
    type: "words",
    body: {
      word: word,
      definition: definition,
      added: now
    },
    refresh: "wait_for"
  });
}

Finally, there's that refresh: "wait_for". That makes sure that, before returning, the update is visible on all the shards, by waiting for a refresh to happen. Read about that in the Elastic documentation for refresh.

The Elasticsearch Read

Now we come to read our words and definitions using our getWords() function. We start this by returning a promise generated by the [client.search()](https://www.elastic.co/guide/en/elasticsearch/client/javascript-api/current/api-reference.html#api-search) method. This, like other Elasticsearch calls, takes a JSON object of parameters. In this case, it passes values for the index, type, fields we want to be returned (_source) and a body. The body can specify search query parameters and more. Here we only want to make sure things are sorted by the added date field.

function getWords() {  
  return client
    .search({
      index: "grand_tour",
      type: "words",
      _source: ["word", "definition"],
      body: {
        sort: {
          added: {
            order: "desc"
          }
        }
      }
    })

But it's not quite over; we still have to extract the words and definitions from the results. We'll catch the results in a .then() and then make and return a new promise. This promise runs through the results and pushes all the words and definitions into an array. When it's done, it resolves with those words. By the magic of promises, this will be what gets returned to the caller of getWords().

    .then(results => {
      return new Promise((resolve, reject) => {
        let words = [];
        results.hits.hits.forEach(function(hit) {
          words.push({
            word: hit._source.word,
            definition: hit._source.definition
          });
        });
        // Resolve once, after every hit has been copied over
        resolve(words);
      });
    })
    })
    .catch(err => {
      console.error(err);
    });
}

There's a catch() on the end to pick up any failure by any of the promises. And that's reading some data from Elasticsearch, with an extra "promisey" flavor.

PostgreSQL

This time, the PostgreSQL example sits in the example-postgresql directory.

The PostgreSQL Driver

We're using the pg library, also known as node-postgres, to manage our connections. An npm install pg is sufficient to get it added to your project. We also use pg-connection-string directly; it's a library "pg" uses, so it should come in as a dependency. Nothing else is needed.

The PostgreSQL Connection

To start our connections, we're going to pull in the COMPOSE_POSTGRESQL_URL and PATH_TO_POSTGRESQL_CERT environment variables. The former will be the connection string from the Compose overview, the latter the self-signed certificate available from the same page.

const pg = require("pg");

let connectionString = process.env.COMPOSE_POSTGRESQL_URL;  
let connectionCertPath = process.env.PATH_TO_POSTGRESQL_CERT;  

We exit if the URL isn't set. The next thing to do is read in that certificate and turn the connection string into a connection configuration.

let caCert = fs.readFileSync(connectionCertPath);

const parse = require("pg-connection-string").parse;  
let config = parse(connectionString);

config.ssl = {  
  ca: caCert
};

let client = new pg.Client(config);  

As you can see, we pull in the pg-connection-string package to handle parsing the connection string. The pg library uses it internally to turn connection strings into configuration objects. We do that here so we can add in other TLS/SSL settings on the fly - specifically, the self-signed certificate. With that in place, we can create a new pg Client instance using the config.
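To picture what parse() hands back, here's a rough illustration with a made-up connection string:

const parse = require("pg-connection-string").parse;

let example = parse("postgres://admin:secret@example.com:5432/compose");
// example.user     => "admin"
// example.password => "secret"
// example.host     => "example.com"
// example.port     => "5432"
// example.database => "compose"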

Skip to the end of the code and you'll see where we use that client to connect. Let's take this step by step.

client  
  .connect()
  .then(() => {
    return client.query(
      "CREATE TABLE IF NOT EXISTS words (word varchar(256) NOT NULL, definition varchar(256) NOT NULL)"
    );
  }) 

Here, we call client.connect() to kick off the connection process. This returns a promise, so when that's done, we then run a client.query() to create the table we need - if it doesn't already exist, of course. This also returns a promise, so we return it, and by the magic of promises, that lets us chain another .then() on...

  .then(resp => {
    // Listen for a connection.
    app.listen(port, function() {
      console.log("Server is listening on port " + port);
    });
  })

So that when the query successfully executes, we start up our local web server. If there were any errors, we'll catch them down below.

  .catch(err => {
    console.error("Connection error", err.stack);
    process.exit(1);
  });

So now we are connected and serving web pages. On to reading the data from the database.

The PostgreSQL Read

This is going to be short and sweet, partly thanks to SQL. Here's the entire function for getWords().

function getWords() {  
  return client.query("SELECT * FROM words ORDER BY word ASC").then(results => {
    return new Promise((resolve, reject) => {
      resolve(results.rows);
    });
  });
}

We send our SQL query to retrieve all the fields from our words table, sorted in ascending alphabetical order. We capture the results from this promise with a then(). The results we want are in the .rows field of the results, so to return them in this scenario, we create a promise that resolves to results.rows.
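Strictly speaking, that new Promise wrapper isn't needed: returning a plain value from a .then() hands the caller an already-resolved promise. This slightly leaner sketch behaves the same:

function getWords() {
  return client
    .query("SELECT * FROM words ORDER BY word ASC")
    .then(results => results.rows);
}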

The PostgreSQL Write

The write for PostgreSQL is really short.

function addWord(word, definition) {  
  let queryText = "INSERT INTO words(word,definition) VALUES($1, $2)";
  return client.query(queryText, [word, definition]);
}

It could be shorter if we in-lined the query text, but we haven't, to keep things clear. The SQL query inserts two values into our words table. Those values are passed to client.query() in an array and bound to the $1 and $2 placeholders, which keeps the query safely parameterized. Again, the promise generated by the query is returned by the function.

And that's it for our Grand Tour PostgreSQL example.

Checking in

So, that's MongoDB, Elasticsearch and PostgreSQL examples for Node.js applications. We've gone for modern JavaScript, using promises, but not too modern - we've not used async and await, mainly because not everyone can use the latest Node.js in production. On the next stage of the Grand Tour, we'll take on Redis and RabbitMQ. See you in part two!

