Create and Deploy Your First JavaScript Serverless Worker | by Tommaso De Ponti | Jul, 2022

Beginner guide to building and deploying serverless workers for free

Photo by Viktorija Lankauskaitė on Unsplash

Last week I wrote an article about the tools I use the most when working with JavaScript. Cloudflare Workers and Wrangler were on that list. Today, you’ll get to know why and how I use them.

You’ll learn to deploy a serverless worker, which is extremely useful in many situations.

For example, you can wrap IPFS pinning in a simple worker request and allow users to upload to IPFS while still masking your API keys, as you’d do in a traditional backend.

I also plan to make this a series to cover creating APIs on CF workers and explain topics like dealing with CORS, interacting with the native KV store, and so on. If you are serious about building scalable serverless APIs, stay tuned.

Let’s start with the basics: what are CF workers, and how do they work?

CF workers provide an environment to create applications without having to manage the infrastructure. They are run directly by the Cloudflare network, which consists of servers distributed all across the globe. In addition, the runtime offers most of the APIs implemented in modern browsers.

As a direct result of being serverless, they are incredibly easy to set up, and thanks to Cloudflare’s design, worker scripts are also easy to build.

Whenever a request is made to a worker domain (or cf-managed domain), Cloudflare passes the event to the worker’s handler. From there, the worker script can compute a response as per the code in the script.

The first thing you’ll need is a Cloudflare account. Then you have to install Wrangler (preferably version > 2.0: npm install -g wrangler). Wrangler is the CLI tool we will use to test and publish our serverless apps (among other things, like managing KV stores).

You’re all set now and just have to create a new project. For the sake of simplicity (and efficiency, in this case), we are going to start from a blank project. Open your terminal and type wrangler init serverlessworker, then move into the directory. The wrangler init command will ask a few questions; you can answer as you wish, or use the configuration I used for this article: use git? > n, create package.json? > Y, use TypeScript? > n, create a worker at index.js? > Fetch handler.
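For reference, here is what the whole setup looks like in a terminal (serverlessworker is simply the project name used throughout this article):

npm install -g wrangler         # install the Wrangler CLI globally
wrangler init serverlessworker  # scaffold the new worker project (answer the prompts as above)
cd serverlessworker             # move into the project directory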

As mentioned in the introduction, upon each request an event is passed to our worker’s handler. The index.js file is where you handle this event. By default, the handler looks like this:

export default {
  async fetch(request) {
    return new Response("Hello World!");
  },
};

You can see the handler takes the request as a parameter and then returns a Response. You will almost always use the request parameter; the only exception is a worker that performs the exact same task on every request, which is an unlikely scenario.

Since we want to build a worker that does two tasks, we will need data like the URL pattern, headers (if you plan on using authentication/authorization), body, method, and so on.
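As a quick illustration (a generic sketch using the standard Request API available in the Workers runtime, not the code of the final example), this is how those pieces can be read from the incoming request:

export default {
  async fetch(request) {
    const { pathname } = new URL(request.url);              // URL pattern, e.g. "/info"
    const method = request.method;                           // "GET", "POST", ...
    const auth = request.headers.get("Authorization");       // header, useful for auth checks
    const body = method === "POST" ? await request.json() : null; // JSON body, if any

    return new Response(JSON.stringify({ pathname, method, hasAuth: auth !== null, body }), {
      headers: { "Content-Type": "application/json" },
    });
  },
};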

For example, let’s say we wanted to have an endpoint that returns information about our worker.
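Here is a minimal sketch of such a handler. The hard-coded URL and the info payload are placeholders: http://localhost:8787 is where wrangler dev serves the worker, and you would swap in your own worker URL once deployed.

export default {
  async fetch(request) {
    // Explicitly match the whole URL (no URL parsing, simplest possible check)
    if (request.url === "http://localhost:8787/info") {
      return new Response(JSON.stringify({ name: "serverlessworker", description: "example worker info" }), {
        headers: { "Content-Type": "application/json" },
      });
    }
    // Anything else falls back to the default response
    return new Response("Hello World!");
  },
};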

You can see we are checking whether the URL of the request matches the endpoint (for the sake of simplicity, I explicitly matched the whole URL instead of constructing a URL object and extracting only the path).

If it does match the /info endpoint, it returns the worker info; if not, a default hello world. We are not going to build our own router here, but this example shows what a very basic CF worker handler looks like. You can go ahead and test it with wrangler dev (remember to change the hard-coded URL so it matches your worker’s address).
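For instance, with the dev server running in one terminal, you could hit both routes from another (curl is used here purely as a generic HTTP client; any client works):

wrangler dev
# in a second terminal:
curl http://localhost:8787/info   # returns the worker info JSON
curl http://localhost:8787/       # returns "Hello World!"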

In this section, we will create a very simple worker function and ship it in an API-like format. We will create an endpoint that accepts POST requests with an array as a JSON body and then returns the array normalized to its sum. Of course, having a worker only to perform such a task is not efficient, but it is a great example of how a very simple worker handles two things:

  • receiving user data
  • responding with modified data

The first thing to do is to route our worker’s traffic: for example, we want the /info endpoint to return worker information and a /normalize endpoint to return the normalized data. To do that, we will use itty-router, which you can install with npm install itty-router. This router was built for CF workers, so it fits our goal perfectly.
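Below is a minimal sketch of the routed worker. It assumes itty-router’s Router()/router.handle() API; the /info payload and the normalized() helper are illustrative, not something the library prescribes:

import { Router } from "itty-router";

const router = Router();

// GET /info — basic information about the worker
router.get("/info", () =>
  new Response(JSON.stringify({ name: "serverlessworker", endpoints: ["/info", "/normalize"] }), {
    headers: { "Content-Type": "application/json" },
  })
);

// POST /normalize — read the JSON body and normalize body.array to its sum
router.post("/normalize", async (request) => {
  const body = await request.json();
  const array = normalized(body.array);
  return new Response(JSON.stringify({ array }), {
    headers: { "Content-Type": "application/json" },
  });
});

// Wildcard: any other route or method gets a 404
router.all("*", () => new Response("Not found", { status: 404 }));

// Divide each element by the sum of the array
function normalized(arr) {
  const sum = arr.reduce((acc, n) => acc + n, 0);
  return arr.map((n) => n / sum);
}

export default {
  // Respond to the fetch event with whatever the router resolves
  async fetch(request) {
    return router.handle(request);
  },
};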

The first thing to note is that, at the bottom of the script, we listen for the fetch event we discussed in the introductory paragraph and respond with the output of the router. Near the top, we create the router, which we then use to build URL and method patterns to match the request. Notice the /info GET endpoint, and the wildcard endpoint that returns a 404 error for every other route and method.

The /normalize endpoint only accepts POST requests, and its callback function receives the request event as a parameter. We read the request’s JSON body and then pass body.array (the array we will send along with the request) to the normalized() function. Lastly, we return a Response object with the normalized array.

Go ahead and test it by running wrangler dev to launch the worker on localhost, and craft a POST request with the following JSON body:

{"array":[1,2,3,4,5]}

I will use httpie (as I explain in this article):

~/.../serverlessworker ❯ http localhost:8787/normalize array=[1,2,3,4,5]
HTTP/1.1 200 OK
...
{
    "array": [
        0.0625,
        0.125,
        0.1875,
        0.25,
        0.3125
    ]
}

Now it’s time to deploy our worker to the Cloudflare network, and unless you need lower latency or other custom features, you’ll be able to do that for free.

This is the easiest step; in fact, you only need to do two things:

  1. log in to Cloudflare with wrangler: wrangler login
  2. publish the worker: within the worker’s root directory, wrangler publish
~/.../medium/serverlessworker ❯ wrangler publish
Delegating to locally-installed version of wrangler @ v2.0.17
⛅️ wrangler 2.0.17 (update available 2.0.19)
-------------------------------------------------------
Retrieving cached values for account from node_modules/.cache/wrangler
Total Upload: 1.61 KiB / gzip: 0.88 KiB
Uploaded serverlessworker (2.73 sec)
Published serverlessworker (6.65 sec)
serverlessworker.tdep.workers.dev

Now you’re all set. If you wish, you can try out my worker and normalize an array with the following httpie command:

http serverlessworker.tdep.workers.dev/normalize array=[1,2,3,4,5]
