Algolia indexing with Serverless Webhooks

June 29, 2018

Creating a 'quick script' to achieve a task has, for a long time, been the domain of whatever you can achieve on your local machine.

Most of the time this is fine, but should you need your handy little script-o available outside of your 'on' hours, you have to get busy making sure it's deployed somewhere that will always be available.

For years, Heroku was my choice for this. Their toolbelt was always installed on my machine and the related commands to get something live baked forever into my brain from extremely regular usage.

Yet this approach always felt like overkill for something simple, especially since the free dynos would eventually spin down in low-traffic situations and then need a little while to wake up again. Regardless, I continued.

I continued until Functions as a Service became a thing, and matured to a point where I could use a CLI toolbelt to achieve the creation and deployment of a small project and be able to choose which infrastructure I deployed it on.

Enter... Serverless

If you've been keeping even half an eye on the effort that Google and Microsoft have made to catch up with Amazon's dominance over developer-targeted cloud infrastructure, you'll likely have come across the Serverless project as well.

It's great having so much choice when it comes to cloud infrastructure for functions, but having to install and then learn the commands for all the CLI tools for Amazon AWS, Google Cloud, Microsoft Azure and others is more overhead than I'm sure most of us could do without.

Serverless means you don't have to. It lets you not only create projects for any of the above services but deploy them as well, all using the exact same set of CLI commands.

Let's build a thing!

One of the main uses I have for something like this is to build webhooks that I can use to store information in my Algolia indexes.

In this example we'll do the following:


  • Use Serverless to create a project that will take any JSON object that is sent to it via POST request and store it in a specific Algolia index.

  • Deploy our code to Google Cloud and check that it works using curl.

Prerequisites

If you're going to follow along, you will need:


  • NodeJS (and more specifically, NPM)

  • The Serverless CLI installed

  • An account with Google Cloud

Scaffolding

In a folder of your choice, start by initialising a new project with Serverless:

serverless create --template google-nodejs --path serverless-indexer 

Above we're telling Serverless to create a new project and to use the template for a NodeJS project hosted on Google Cloud. The path flag at the end specifies the name of our project.

Next, go into the newly created folder and install the dependencies:
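
Something along these lines should do it (the folder name matches the --path flag we passed earlier):

cd serverless-indexer
npm install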

You will now need to set up the authentication credentials for your Google Cloud account. It's not a long process, and it is well documented in the Serverless guide for Google Cloud credentials.
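
Once the key file is in place, the provider section of your serverless.yml needs to point at it. Roughly speaking (the project ID and key file path below are placeholders; yours will differ):

provider:
  name: google
  project: your-gcp-project-id
  credentials: ~/.gcloud/keyfile.json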

Adding the code

Open the index.js in your editor; it will look like this:

'use strict';

exports.http = (request, response) => {
  response.status(200).send('Hello World!');
};

exports.event = (event, callback) => {
  callback();
};

Here you have the base code for a function that will respond to HTTP requests and another that will respond to event-based invocation.

We'll only be dealing with HTTP for this project so you can remove the second function.

Next, we'll add the code we need to work with Algolia starting with their SDK for JavaScript.

Install it using NPM (or Yarn, if that's your bag):

npm install --save algoliasearch 

As with any NodeJS project, this adds the Algolia library to the package.json, which will also be deployed to Google Cloud later on.
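
If you peek inside package.json afterwards, the dependencies section will contain an entry along these lines (the exact version number depends on when you run the install):

"dependencies": {
  "algoliasearch": "^3.29.0"
}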

Initialise the library like so:

'use strict';

const algolia = require('algoliasearch');
const algoliaApp = algolia('your-app-id', 'your-admin-api-key');
const index = algoliaApp.initIndex('serverless-demo');

// Your function continues here...
exports.http...

Right now, we're exporting a function called http which isn't particularly descriptive. Let's change it to updateIndex.

'use strict';

const algolia = require('algoliasearch');
const algoliaApp = algolia('your-app-id', 'your-admin-api-key');
const index = algoliaApp.initIndex('serverless-demo');

// Change the function name from http to updateIndex
exports.updateIndex = (request, response) => {

};

Next, we'll add the code to take our incoming JSON object and write it to our Algolia index.

From the Algolia documentation, the method we need is addObject. We're going to write it using Promises rather than the callback style used in the docs.

exports.updateIndex = (request, response) => {
  index.addObject(request.body)
    .then(newAlgoliaObject => {
      response.status(200).json(newAlgoliaObject);
    })
    .catch(err => console.log(err));
};

Here we're adding the body of any incoming request to our Algolia index, and if that happens successfully we return the object data back as a JSON response.

If it fails, we write the error out to the console.

Our finished script will now look like this:

'use strict';

const algolia = require('algoliasearch');
const algoliaApp = algolia('your-app-id', 'your-admin-api-key');

// Remember you will need to create a new index in Algolia
const index = algoliaApp.initIndex('serverless-demo');

exports.updateIndex = (request, response) => {
  index.addObject(request.body)
    .then(newAlgoliaObject => {
      response.status(200).json(newAlgoliaObject);
    })
    .catch(err => console.log(err));
};

This is a bare-bones script that doesn't do much error checking or sanitising of the incoming data. For real production use you would want to add more of this to ensure you get what you expect.
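
Purely as an illustrative sketch (not part of the finished script above), you could reject anything that isn't a non-empty JSON object and return an error status to the caller instead of only logging it:

exports.updateIndex = (request, response) => {
  // Reject anything that isn't a non-empty JSON object
  if (typeof request.body !== 'object' || request.body === null || Object.keys(request.body).length === 0) {
    return response.status(400).json({ error: 'Expected a JSON object in the request body' });
  }

  index.addObject(request.body)
    .then(newAlgoliaObject => response.status(200).json(newAlgoliaObject))
    .catch(err => {
      console.log(err);
      response.status(500).json({ error: 'Failed to write to the index' });
    });
};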

Time to deploy

Along with your index.js, Serverless will have also created a file called serverless.yml. Open this file in your editor and we'll make the changes needed to deploy the script to Google Cloud.

The section that needs updating is functions. This is where you declare the functions you've exported from your index.js and how they're triggered.

The boilerplate that Serverless generates looks like this:

functions: # Defines the functions section
  first: # The name of your function
    handler: http # The exported function
    events: # Defines the events that you call
      - http: path # Sets this function as an HTTP function

With all that in mind, make the following changes:

functions:
  updateIndex: # Change the function name
    handler: updateIndex # Change the handler
    events:
      - http: path

If you followed the steps in the Serverless documentation to set up your keyfile.json that contains the authentication keys for Google Cloud, you'll now be ready to deploy:
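
serverless deploy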

If all goes well, you'll get a URL for your newly deployed function that will look something like this: https://us-central1-serverless-indexer.cloudfunctions.net/updateIndex

Time to test it

Open up a new terminal window. Then modify the following command to include the URL that you were given when you deployed your function to Google Cloud using Serverless.

curl -X POST \
  -H "Content-Type: application/json" \
  -d '{"product":"iPhone X", "price":"1199", "capacity":"256gb"}' \
  https://us-central1-serverless-indexer.cloudfunctions.net/updateIndex

Note the Content-Type header: it ensures the request body is parsed as JSON before it reaches our function.

The response you can expect is what the Algolia API would typically respond with when a new object has been written to an index. Something like this:

{ "createdAt":"2018-06-25T19:27:59.353Z", "taskID":107402530, "objectID":"120246040" 
}

Done!

There are a lot of other great combinations of serverless functions and Algolia that you could build with Serverless, making use of much more of the available APIs, so dig deep and see what you can come up with.

If you're looking for a little extra inspiration, check out the Essential Mix API, a project I created last year that uses Amazon's Lambda functions and an Algolia index as a database.

