Getting Started with Elasticsearch using Compose


Hi again! In this latest part of our tour through getting started with the multiple, or singular, database joys of Compose we'll be looking at setting up Elasticsearch. In the first part, we covered MongoDB on Compose and the sign up process is basically the same for both. If you didn't read that article, have a browse now...

There is, of course, one difference: you'll need to select Elasticsearch in the Choose a Database panel, and note that Elasticsearch pricing is different from MongoDB's. If you're on the thirty-day trial, that won't affect you immediately, but do keep it in mind.

Anyway, back to the Elasticsearch database setup. Once you have signed in, you'll be greeted by the Jobs screen showing that Compose has created your Elasticsearch database.

Your next stop should be the Overview option in the sidebar. This will show you various useful items of information, but right now it'll be showing a message at the top of the page...

But I already have a user!

Yes, but what you've got is an account: the email address and password you use to log in to the Compose Dashboard. Those credentials are only ever used for the Dashboard itself. As we explained in the MongoDB walkthrough, each database has its own credential system to be managed. For Elasticsearch, those credentials are usernames and passwords for the HTTP/TCP access portal which protects your Elasticsearch cluster.

There are actually two HTTP/TCP access portals, for redundancy in accessing the cluster, and both are automatically updated with the same credentials. All you need to do is create a username/password pair and you can connect to either of the portals (or both, if your driver supports multiple URLs). Click on Users in the left-hand sidebar and then click Add user.

You'll be prompted to enter a username and password here; make them different from your Compose account login at the very least.

After you have clicked Add user in this screen, you'll be redirected to the Jobs display, where you should see the user being added to each of the access portals. Once that's done, it is time to connect. The next question is...

Where to connect?

That can be answered on the Overview page. If you go there you'll find a number of "Connect Strings" listed in the first panel. As Elasticsearch deployments have two access portals, there are two URLs listed under the "HTTP connection" section.

Either one of them will work – we'll just use the first one shown here for this example. We then substitute in our user name and password to get:
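Since the actual connect string is specific to each deployment, here's a sketch of that substitution in code; the host, port, and credentials below are placeholders, not a real Compose deployment:

```javascript
// Hypothetical helper: splice a username and password into a connect
// string of the form https://host:port/ to give https://user:pass@host:port/
// The host below is a placeholder, not a real Compose deployment.
function withCredentials(url, user, pass) {
  return url.replace('://', '://' + encodeURIComponent(user) + ':' +
      encodeURIComponent(pass) + '@');
}

console.log(withCredentials('https://portal.example.com:10000/', 'example', 'examplepass'));
// → https://example:examplepass@portal.example.com:10000/
```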

And we're ready to go. Well, at least once we have an application that can use that URL. If we just want to test connectivity to the Elasticsearch cluster, we can use the example Cluster Health call. This uses the curl command-line utility, which is widely available; if your operating system doesn't have curl, you can download it from the curl project's website. Then you can substitute in the user name and password and ask the cluster about its health like so:

$ curl --user example:examplepass ''
{
  "cluster_name" : "runstate-elasticsearch",
  "status" : "green",
  "timed_out" : false,
  "number_of_nodes" : 3,
  "number_of_data_nodes" : 3,
  "active_primary_shards" : 0,
  "active_shards" : 0,
  "relocating_shards" : 0,
  "initializing_shards" : 0,
  "unassigned_shards" : 0,
  "number_of_pending_tasks" : 0
}

Now, you may notice that we used a slightly different URL there, without the username:password embedded in it. We can use the URL we created earlier by appending /_cluster/health?pretty like so:

$ curl ''

Remember to wrap the URL in quotes so that the shell doesn't see the ? and try to match filenames against it. If you don't have curl but do have wget, you can use the URL with that command like so:

$ wget -O - ''

Anyway, assuming this worked, you have a connection to your Elasticsearch database. The "Cluster Health" URL is a query on Elasticsearch's REST API and you could, if you really wanted to, access it entirely from the command line. But you will probably want to go for the far less taxing route of using a library.
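Whichever tool makes the request, an application would typically parse the JSON the health endpoint returns and act on the status field. A minimal sketch, using a sample of the response above trimmed to the fields we check:

```javascript
// Sample response body, trimmed to the fields checked below.
var body = '{"cluster_name":"runstate-elasticsearch","status":"green","number_of_nodes":3}';
var health = JSON.parse(body);

// "green" means all shards are allocated; "yellow" means all primaries
// are allocated but some replicas are not; "red" means some primaries
// are missing.
if (health.status === 'green') {
  console.log(health.cluster_name + ' is healthy');
}
// → runstate-elasticsearch is healthy
```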

A bit of Node.js

We'll now create a small Node.js application to show how you can connect to your Elasticsearch cluster. Create a Node.js project with npm init and just press return to accept all the defaults. Then run npm install elasticsearch --save to install the Elasticsearch module. Now we can write some code...

var elasticsearch = require('elasticsearch');

var client = new elasticsearch.Client({
  hosts: [
    // placeholders - substitute your own "Connect Strings" from Overview
    'https://example:examplepass@portal1.example.compose.io:10000/',
    'https://example:examplepass@portal2.example.compose.io:10000/'
  ]
});
First, this code "requires" the elasticsearch module. Then it moves on to create an Elasticsearch client. The Client only takes one parameter, but that parameter wraps up a lot of options for creating connections to the Elasticsearch cluster. If you check the documentation you'll see the full extent. Here, though, we're keeping it simple and just passing the hosts key with an array of URLs to connect to as its value. The first URL is the one we've been using and the second is the second one from the Overview's "Connect Strings". Let's go query the cluster health in JavaScript...

client.cluster.health({}, function (err, resp, status) {
  console.log(resp);
});

OK, we can issue the call to get cluster health, and when we get the response we run the callback, which just prints it out. The {} at the start is where the options get placed, and looking at the API entry for the cluster health call we can see there are some useful options to be harnessed, like the ability to wait for particular statuses or availabilities. This, of course, is just an example. Now we have a client, we can use the Elasticsearch Quick Start examples and the rest of the documentation to master talking with Elasticsearch.
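As a sketch of those options: waitForStatus and timeout are parameters documented for the cluster health API, though the values below are just illustrative.

```javascript
// Options for the cluster health call; waitForStatus and timeout are
// health API parameters - the values here are illustrative.
var healthOptions = {
  waitForStatus: 'yellow', // block until the cluster is at least yellow
  timeout: '30s'           // stop waiting after 30 seconds
};

// These would be passed to the client from the snippet above like so:
//   client.cluster.health(healthOptions, function (err, resp, status) {
//     console.log(resp);
//   });

console.log(Object.keys(healthOptions).join(','));
// → waitForStatus,timeout
```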

A note on site plugins

There is another way to interact with the Elasticsearch cluster, and that's through the web-based site plugins. You'll find them by selecting Plugins from the sidebar. There are Kibana, ElasticHQ, Bigdesk, Head, Paramedic and Kopf, which all have different capabilities, from monitoring cluster health to helping with query creation and execution.

Moving on with Elasticsearch

Your next steps with Elasticsearch should be to...

Dj Walker-Morgan
Dj Walker-Morgan is Compose's resident Content Curator, and has been both a developer and writer since Apples came in II flavors and Commodores had Pets. Love this article? Head over to Dj Walker-Morgan’s author page and keep reading.