Cloudflare Worker

DataDome Cloudflare integration detects and protects against bot activity.

This module is designed to be used on Cloudflare, as a Worker.



Protect your traffic

  1. Connect to your Cloudflare console and go to the Workers & Pages section.
Cloudflare console tab.

  2. Click on Create application.
Create application.

  3. Click on Create Worker.
Create Worker.

  4. Choose a name for the Worker, for example worker/datadome.js, and click on Deploy.
Name Worker.

  5. After your DataDome Worker has been deployed, click on Edit code.
  6. Download our Cloudflare module and paste the code from datadome.js into the Script Editor.
Add DataDome script in the Script Editor.

  7. Fill in the server-side key variable (DATADOME_LICENSE_KEY) with the server-side key from your DataDome dashboard.
  8. Fill in the client-side key variable (DATADOME_JS_KEY) with the client-side key from your DataDome dashboard.
  9. Click on Save and deploy.
Save and deploy Worker's script.
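Once filled in, the key variables at the top of datadome.js should look something like the sketch below. The values shown are placeholders, not real keys; the actual keys come from your DataDome dashboard.

```javascript
// Placeholder values -- replace with the real keys from your DataDome dashboard.
var DATADOME_LICENSE_KEY = "your-server-side-key"; // server-side key
var DATADOME_JS_KEY = "your-client-side-key";      // client-side key (recommended)
```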

  10. Confirm by clicking on Save and deploy in the popup window.
  11. Go back to the Worker overview.
  12. Inside the Triggers section, add the Custom Domains and/or Routes on which you want the DataDome Worker to run. Refer to the Cloudflare documentation on Domains and Routes.
Configure domains and routes.

Congrats! You can now see your traffic in your DataDome dashboard.


Configuration is done by changing the DataDome variables directly inside the datadome.js script.


  • DATADOME_LICENSE_KEY: your DataDome server-side key. Required. Default: ""
  • DATADOME_JS_KEY: your DataDome client-side key. Optional (but recommended). Default: ""
  • DATADOME_JS_TAG_OPTIONS: JSON object describing the JS tag options (see here for more documentation). Optional. Default: '{ "ajaxListenerPath": true }'
  • DATADOME_TIMEOUT: the request timeout for the DataDome API, in milliseconds. Optional. Default: 300
  • DATADOME_URL_REGEX: processes matching URLs only. Optional. Default: null
  • DATADOME_URL_REGEX_EXCLUSION: ignores all matching URLs. Optional. Default: null
  • DATADOME_URI_REGEX_EXCLUSION: will not send traffic associated with static assets. Optional. Default: /\.(avi|flv|mka|mkv|mov|mp4|mpeg|mpg|mp3|flac|ogg|ogm|opus|wav|webm|webp|bmp|gif|ico|jpeg|jpg|png|svg|svgz|swf|eot|otf|ttf|woff|woff2|css|less|js|map)$/i
  • DATADOME_IP_FILTERING: will not send server-side traffic associated with these IPs to DataDome. Optional. Default: null
  • DATADOME_JS_URL: URL of the JS tag; can be changed to include the tag as a first party. Optional. Default: ''
  • DATADOME_JS_ENDPOINT: URL of the JS tag endpoint. Optional. Default: ""
  • DATADOME_ENABLE_GRAPHQL_SUPPORT: extracts the GraphQL operation name and type on requests to a /graphql endpoint to improve protection. Optional. Default: false
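As an illustration of the URL filters, the sketch below shows how the default DATADOME_URI_REGEX_EXCLUSION value behaves. The regex is the module's documented default; the test URLs are made-up examples.

```javascript
// Default static-asset exclusion regex from the module's settings.
var DATADOME_URI_REGEX_EXCLUSION = /\.(avi|flv|mka|mkv|mov|mp4|mpeg|mpg|mp3|flac|ogg|ogm|opus|wav|webm|webp|bmp|gif|ico|jpeg|jpg|png|svg|svgz|swf|eot|otf|ttf|woff|woff2|css|less|js|map)$/i;

// Requests whose URI matches are NOT sent to DataDome.
console.log(DATADOME_URI_REGEX_EXCLUSION.test("/assets/logo.png")); // true: excluded
console.log(DATADOME_URI_REGEX_EXCLUSION.test("/api/login"));       // false: protected
```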

Caching policy

The DataDome module doesn't change the default caching policy.

However, the module adds a tracking cookie on all requests, which may impact some custom policies.

You can use the Worker TTL feature to force a specific caching TTL.
Feel free to contact our support for any specific needs.
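If you do need to force a TTL, a subrequest made inside a Worker can set one via Cloudflare's cf request options, along these lines. This is a minimal sketch: the cf options only have an effect when the code runs on Cloudflare, and the 300-second value is an arbitrary example.

```javascript
// Sketch: force a 300-second edge-cache TTL on a Worker subrequest.
// The `cf` options object is Cloudflare-specific; the values are examples.
const CACHE_OPTIONS = { cf: { cacheTtl: 300, cacheEverything: true } };

async function fetchWithTtl(request) {
  // Inside a deployed Worker, this `fetch` is Cloudflare's global fetch.
  return fetch(request, CACHE_OPTIONS);
}
```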


Can I enable DataDome only for specific IPs?

Yes, you can. Update the code at the beginning of the handleRequest function, similar to the example below:

async function handleRequest(request) {
  try {

    // Forward the request without DataDome processing unless it comes from the allowed IP
    if (request.headers.get('cf-connecting-ip') != "" && request.headers.get('cf-connecting-ip') != "2606:4700:30::681b:938f") {
        return await fetch(request);
    }

    const url = new URL(request.url);
    // ... rest of the original handleRequest code

DataDome will then only process requests coming from the IP 2606:4700:30::681b:938f.
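The bypass condition can be factored into a small helper to make the intent explicit. This is a sketch: the helper name and the allow-list are illustrative, not part of the module.

```javascript
// Hypothetical helper (not part of the module): returns true when the
// request should bypass DataDome, i.e. the client IP is known and is not
// in the allow-list.
function shouldBypassDataDome(clientIp, allowedIps) {
  return clientIp !== "" && clientIp !== null && !allowedIps.includes(clientIp);
}

const ALLOWED_IPS = ["2606:4700:30::681b:938f"];
console.log(shouldBypassDataDome("192.0.2.1", ALLOWED_IPS));               // true: forwarded directly
console.log(shouldBypassDataDome("2606:4700:30::681b:938f", ALLOWED_IPS)); // false: processed by DataDome
```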

How do I get DataDome logs using Logpush?

You can use Workers Trace Events with Logpush to send logs to a destination supported by Logpush (Datadog, Splunk, S3 Bucket…).


Logpush is available to customers on Cloudflare’s Enterprise plan.


Logging from Workers is available only with version 1.14 (or later) of our module. It's not possible to get logs from our app.

This can be done only through Cloudflare’s API.

1. Gather your credentials

  • X-Auth-Email: the email address of your account
  • X-Auth-Key: the value of Global API Key from My Profile > API tokens
Global API Key inside My Profile - API tokens

  • ACCOUNT_ID: the value of the account ID seen in the Websites Overview page
Zone and Account ID inside Websites Overview page

  • <SERVICE_NAME>: the name of the existing service that holds the DataDome script
Service name inside Workers & Pages - Overview

2. Configure Enriched Headers in DataDome Worker script

Fill in the DATADOME_LOG_VALUES variable with the names of the Enriched Headers, as an array of strings.


var DATADOME_LOG_VALUES = ["X-DataDome-botname", "X-DataDome-isbot", "x-datadomeresponse"];

3. Create a Logpush job to send data to your destination

  • Use the cURL command below to create a Logpush job.
    • Replace <ACCOUNT_ID>, <API_KEY>, and <EMAIL> with the values from step 1.
    • Set up the DESTINATION: follow the documentation here or find an example for sending logs to R2 in Cloudflare's documentation.
curl -X POST 'https://api.cloudflare.com/client/v4/accounts/<ACCOUNT_ID>/logpush/jobs' \
-H 'X-Auth-Key: <API_KEY>' \
-H 'X-Auth-Email: <EMAIL>' \
-H 'Content-Type: application/json' \
-d '{
"name": "datadome-logs",
"logpull_options": "fields=Event,EventTimestampMs,Outcome,Exceptions,Logs,ScriptName",
"destination_conf": "<DESTINATION>",
"dataset": "workers_trace_events",
"enabled": true
}' | jq .

4. Enable logging on DataDome Worker

  • Enable logging on your DataDome Worker by adding the property logpush = true to your wrangler.toml file.
# Top-level configuration

name = "<SERVICE_NAME>"
main = "src/index.js"
compatibility_date = "2022-07-12"

workers_dev = false
logpush = true
route = { pattern = "*", zone_name = "" }

An alternative is to set this property using cURL:

curl -X PUT "https://api.cloudflare.com/client/v4/accounts/<ACCOUNT_ID>/workers/scripts/<SERVICE_NAME>" \
-H 'X-Auth-Key: <API_KEY>' \
-H 'X-Auth-Email: <EMAIL>' \
--form 'metadata={"main_module": "<SERVICE_NAME>.js", "logpush": true}' \
--form '"<SERVICE_NAME>.js"=@./<SERVICE_NAME>.js;type=application/javascript+module'

5. Receive data

Enriched Headers are now sent to your Logpush destination.
The information sent by DataDome is a Log Message composed of the values set in DATADOME_LOG_VALUES, in the same order, separated by semicolons.

A - is set when a value is undefined.
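For example, a Log Message built from the DATADOME_LOG_VALUES set in step 2 can be split back into named fields like this. The sample message values are made up for illustration.

```javascript
// Split a semicolon-separated DataDome Log Message back into named fields.
// A "-" means the header was undefined for that request.
var DATADOME_LOG_VALUES = ["X-DataDome-botname", "X-DataDome-isbot", "x-datadomeresponse"];

function parseLogMessage(message) {
  const parts = message.split(";");
  const fields = {};
  DATADOME_LOG_VALUES.forEach((name, i) => {
    fields[name] = parts[i] === "-" ? undefined : parts[i];
  });
  return fields;
}

// Illustrative message: botname "curl", isbot "1", response "403".
console.log(parseLogMessage("curl;1;403"));
```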


How do I chain DataDome Worker with another Cloudflare Worker?

You can bind DataDome Worker with another service using Cloudflare's HTTP Service Bindings.

DataDome Worker calls Worker B

  1. Have a functioning Worker B. Script example for Worker B:
export default {
  async fetch(request, env, ctx) {
    return new Response("Hello World!");
  },
};
  2. Bind your DataDome Worker to Worker B using the interface (Workers & Pages > service-name > Settings > Variables > Service Binding) or using Wrangler in the wrangler.toml file:
name = "worker_datadome"
main = "worker.js"
services = [
  { binding = "WORKER_B", service = "worker_b" }
]
  3. Inside the DataDome Worker code, replace activateDataDome(); with activateDataDome(globalThis.WORKER_B.fetch.bind(globalThis.WORKER_B));.

Worker A calls DataDome Worker

  1. Inside the DataDome Worker, modify the line activateDataDome(); with the fetch function you want to use.
  2. Bind Worker A to the DataDome Worker service, with the name DATADOME_WORKER using the interface (Workers & Pages > service-name > Settings > Variables > Service Binding) or using Wrangler in the wrangler.toml file.
  3. Call the DataDome Worker inside the Worker A code:
export default {
  async fetch(request, env, ctx) {
    return await env.DATADOME_WORKER.fetch(request);
  },
};