Cloudflare Worker

This module is designed to run on Cloudflare, using Workers.

Installation

Prerequisites

Protect your traffic

  1. Connect to your Cloudflare console and go to the Workers & Pages section.
Cloudflare console tab.

  2. Click on Create application.
Create application.

  3. Click on Create Worker.
Create Worker.

  4. Choose a name for the Worker, for example worker/datadome.js, and click on Deploy.
Name Worker.

  5. After your DataDome Worker has been deployed, click on Edit code.
  6. Download our Cloudflare Module and paste the code from datadome.js in the Script Editor.
Add DataDome script in the Script Editor.

  7. Fill in the server-side key variable (DATADOME_LICENSE_KEY) with the server-side key from your DataDome dashboard.
  8. Fill in the client-side key variable (DATADOME_JS_KEY) with the client-side key from your DataDome dashboard (see the sketch after this list).
  9. Click on Save and deploy.
Save and deploy Worker's script.

  10. Confirm by clicking on Save and deploy in the popup window.
  11. Go back to the Worker overview.
  12. Inside the Triggers section, add your Custom Domains and/or Routes on which you want the DataDome Worker to be set. Refer to Cloudflare documentation on Domains and Routes.
Configure domains and routes.
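
For reference, here is a minimal sketch of what the two key variables look like once filled in inside datadome.js (the values are placeholders; the exact declarations in the module may differ slightly):

// Illustrative placeholders only -- replace with the keys from your DataDome dashboard.
var DATADOME_LICENSE_KEY = "your-server-side-key"; // server-side key (step 7)
var DATADOME_JS_KEY = "your-client-side-key";      // client-side key (step 8)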

Congrats! You can now see your traffic in your DataDome dashboard.

Configuration

Configuration is done by changing DataDome variables directly inside the datadome.js script.

Settings

Each setting is listed with its description, whether it is required, and its default value.

DATADOME_LICENSE_KEY: your DataDome server-side key. Required. Default: ""
DATADOME_JS_KEY: your DataDome client-side key. Optional (but recommended). Default: ""
DATADOME_JS_TAG_OPTIONS: JSON object describing JS tag options (see here for more documentation). Optional. Default: '{ "ajaxListenerPath": true }'
DATADOME_TIMEOUT: the request timeout for the DataDome API, in milliseconds. Optional. Default: 300
DATADOME_URL_REGEX: processes matching URLs only. Optional. Default: null
DATADOME_URL_REGEX_EXCLUSION: ignores all matching URLs. Optional. Default: null
DATADOME_URI_REGEX_EXCLUSION: will not send traffic associated with static assets. Optional. Default: /\.(avi|flv|mka|mkv|mov|mp4|mpeg|mpg|mp3|flac|ogg|ogm|opus|wav|webm|webp|bmp|gif|ico|jpeg|jpg|png|svg|svgz|swf|eot|otf|ttf|woff|woff2|css|less|js|map)$/i
DATADOME_IP_FILTERING: will not send server-side traffic associated with these IPs to DataDome. Optional. Default: null
DATADOME_JS_URL: URL of the JS tag. Can be changed to include the tag as a first party. Optional. Default: 'https://js.datadome.co/tags.js'
DATADOME_JS_ENDPOINT: URL of the JS tag endpoint. Optional. Default: ""
DATADOME_ENABLE_GRAPHQL_SUPPORT: extracts the GraphQL operation name and type on requests to a /graphql endpoint to improve protection. Optional. Default: false
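
As an illustration, here are a few of these variables set directly in datadome.js (only the variable names come from the table above; the timeout and regex values are placeholders to adapt to your own traffic):

// Illustrative values only -- adapt to your own policy.
var DATADOME_TIMEOUT = 300;                               // DataDome API timeout in milliseconds
var DATADOME_URL_REGEX = null;                            // null = analyze every URL
var DATADOME_URL_REGEX_EXCLUSION = /\/(health|ping)$/i;   // example: ignore URLs ending in /health or /ping
var DATADOME_JS_TAG_OPTIONS = '{ "ajaxListenerPath": true }';
var DATADOME_ENABLE_GRAPHQL_SUPPORT = false;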

Caching policy

The DataDome module doesn't change the default caching policy.

However, the module adds a tracking cookie on all requests, which may impact some custom policies.

You can use the Worker TTL feature to force a specific caching TTL.
Feel free to contact our support for any specific needs.
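
As an illustration of forcing a specific TTL from the Worker itself, here is a minimal sketch using Cloudflare's cf caching options on fetch (fetchWithCacheTtl is just an illustrative helper name and the 60-second value is a placeholder):

// Sketch: ask Cloudflare's cache to keep the origin response for 60 seconds.
// cacheTtl and cacheEverything are standard Cloudflare Workers fetch options.
async function fetchWithCacheTtl(request) {
  return fetch(request, { cf: { cacheTtl: 60, cacheEverything: true } });
}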

FAQ

Can I enable DataDome only for specified IPs?

Yes, you can. You need to update the code at the beginning of the handleRequest function, similarly to the example below:

async function handleRequest(request) {
  try {

    // Bypass DataDome for every client IP except the two listed below.
    if (request.headers.get('cf-connecting-ip') != "1.2.3.4" && request.headers.get('cf-connecting-ip') != "2606:4700:30::681b:938f") {
      return await fetch(request);
    }

    const url = new URL(request.url);
...

DataDome will then only process requests coming from IP 1.2.3.4 or 2606:4700:30::681b:938f.
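
If you need to allow more than a couple of addresses, the same check can be written against a set of client IPs (a sketch of the same idea; DATADOME_ONLY_IPS is a hypothetical name, not a module setting):

// Sketch: run DataDome only for requests coming from an allowlist of client IPs.
const DATADOME_ONLY_IPS = new Set(["1.2.3.4", "2606:4700:30::681b:938f"]);

async function handleRequest(request) {
  const clientIp = request.headers.get('cf-connecting-ip');
  if (!DATADOME_ONLY_IPS.has(clientIp)) {
    // Not in the allowlist: forward the request untouched, bypassing DataDome.
    return fetch(request);
  }
  // ... continue with the DataDome logic from datadome.js
}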

How do I get DataDome logs using Logpush?

You can use Workers Trace Events with Logpush to send logs to a destination supported by Logpush (Datadog, Splunk, S3 Bucket…).

❗️

Logpush is available to customers on Cloudflare’s Enterprise plan.

🚧

Logging from Workers is available only for Workers using version 1.14 (or later) of our module. It’s not possible to get logs from our app.

This can be done only through Cloudflare’s API.

Gather your credentials

You will need:

X-Auth-Email: the email address of your account

X-Auth-Key: the value of “Global API Key” from “My Profile” → “API tokens”

Global API Key inside My Profile - API tokens

ACCOUNT_ID: the value of the account ID seen in the Websites Overview page

Zone and Account ID inside Websites Overview page

SERVICE_NAME: the name of the EXISTING service in which you set up DataDome

Service name inside Workers & Pages - Overview

Update the worker’s script

Fill in the DATADOME_LOG_VALUES variable with the names of the values you want, as an array of strings.
The possible values are:
X-DataDome-isbot, X-DataDome-botname, X-DataDome-captchapassed, X-DataDome-ruletype, X-DataDome-requestid, X-DataDome-matchedmodels, x-datadomeresponse, x-dd-type, X-DataDome-score and X-DataDome-Traffic-Rule-Response

For example:

var DATADOME_LOG_VALUES = ["X-DataDome-botname", "X-DataDome-isbot", "x-datadomeresponse"];

Create a Logpush trace worker

Trace Workers are used to send logs to Logpush.

The “service” field should be set to the name of your Worker; its environment defaults to production.

curl -X POST \
-H "X-Auth-Email: <EMAIL>" \
-H "X-Auth-Key: <AUTH_KEY>" \
"https://api.cloudflare.com/client/v4/accounts/<ACCOUNT_ID>/workers/traces" \
-d '{
  "producer": {
    "service": "<SERVICE_NAME>"
  },
  "type": "log_push"
}' | jq .

Make sure the traces were created:

curl -X GET \
-H "X-Auth-Email: <EMAIL>" \
-H "X-Auth-Key: <AUTH_KEY>" \
"https://api.cloudflare.com/client/v4/accounts/<ACCOUNT_ID>/workers/traces" | jq .

🚧

This step has to be done every time the service is deployed.

Create a job to send data to your destination

❗️

Worker trace events is an ACCOUNT-scoped dataset.
Make sure that all the API requests you make are on client/v4/accounts and not on client/v4/zones.

To set up your destination properly, you can follow the documentation here.
Do not forget that all actions have to be made to client/v4/accounts and not to client/v4/zones.

curl -s "https://api.cloudflare.com/client/v4/accounts/<ACCOUNT_ID>/logpush/jobs" \
-X POST -d '
{
  "name": "datadome-logs",
  "logpull_options": "fields=Event,EventTimestampMs,Exceptions,Logs,ScriptName",
  "destination_conf": "<YOUR_DESTINATION>",
  "max_upload_bytes": 5000000,
  "max_upload_records": 1000,
  "dataset": "workers_trace_events",
  "enabled": true
}' \
-H "X-Auth-Email: <EMAIL>" \
-H "X-Auth-Key: <API_KEY>"

Receive data

You will receive the logs at the destination you chose.
The output will look like this:

{"Event":{"RayID":"780a1e5f7b3f2a33","Request":{"URL":"https://mydomain.co/","Method":"GET"},"Response":{"Status":403}},"EventTimestampMs":1672228648886,"Exceptions":[],"Logs":[{"Level":"log","Message":["1;TestBlock;403"],"TimestampMs":1672228648902}],"ScriptName":"datadome"}

The information sent by DataDome is a Log Message, composed of the values set in DATADOME_LOG_VALUES, in the same order, separated by semicolons.
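
As an illustration, here is a small helper (hypothetical, not part of the module) that maps such a Message back to the names configured in DATADOME_LOG_VALUES, following the example configuration above:

var DATADOME_LOG_VALUES = ["X-DataDome-botname", "X-DataDome-isbot", "x-datadomeresponse"];

// Split a Message like "TestBlock;1;403" back into named fields, in the
// order configured in DATADOME_LOG_VALUES. A "-" entry is treated as undefined.
function parseDataDomeMessage(message) {
  const parts = message.split(";");
  return Object.fromEntries(
    DATADOME_LOG_VALUES.map((name, i) => [name, parts[i] === "-" ? undefined : parts[i]])
  );
}

// parseDataDomeMessage("TestBlock;1;403")
// -> { "X-DataDome-botname": "TestBlock", "X-DataDome-isbot": "1", "x-datadomeresponse": "403" }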

A - is set when a value is undefined. With the example DATADOME_LOG_VALUES above, the corresponding Logs entry looks like:

"Logs":[{"Level":"log","Message":["TestBlock;1;403"],"TimestampMs":1672228648902}]