Integrate via Wrangler

Installation with Wrangler

Prerequisites

Protect your traffic

  1. Download the latest version of our Cloudflare Worker script.
  2. Copy the file datadome.ts into your Wrangler project.
  3. Create a Cloudflare Worker secret named DATADOME_SERVER_SIDE_KEY:
wrangler secret put DATADOME_SERVER_SIDE_KEY

A prompt appears. Enter the value of your DataDome server-side key, which you can find in your DataDome dashboard.

  4. Create a Cloudflare Worker secret named DATADOME_CLIENT_SIDE_KEY:
wrangler secret put DATADOME_CLIENT_SIDE_KEY

A prompt appears. Enter the value of your DataDome client-side key, which you can find in your DataDome dashboard.

  5. Update your wrangler.toml file:
  • Change the value of main to the path of the datadome.ts file,
  • Add the route key to configure the domains and routes you want to run the DataDome Worker on.
name = "datadome-worker"
main = "src/datadome.ts"
route = { pattern = "example.org/*", zone_name = "example.org" }
  6. Deploy your Worker (you will be asked to authenticate with Cloudflare if you are not already logged in):
wrangler deploy

Congrats! You can now see your traffic in your DataDome dashboard.
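Putting the steps above together, a minimal wrangler.toml for this setup could look like the following sketch (the compatibility_date value is illustrative; use the one from your own project):

```toml
# Minimal example wrangler.toml for the DataDome Worker.
# compatibility_date is illustrative -- keep your project's own value.
name = "datadome-worker"
main = "src/datadome.ts"
compatibility_date = "2024-01-01"
route = { pattern = "example.org/*", zone_name = "example.org" }
```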

Configuration

The configuration is done inside the script, using constants.
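As an illustration, the constants at the top of the script take roughly this shape (names come from the settings tables below; the values here are examples, not recommendations):

```typescript
// Illustrative sketch of the configuration constants in datadome.ts.
// Values shown are examples only.
const DATADOME_TIMEOUT = 300; // request timeout to the DataDome API, in ms
const DATADOME_URL_PATTERN_INCLUSION = /login/i; // only analyze matching URLs
const DATADOME_IP_EXCLUSION = ["192.168.0.1", "192.168.0.2"]; // skip these client IPs
const DATADOME_ENABLE_GRAPHQL_SUPPORT = false; // GraphQL operation extraction off
const DATADOME_ENABLE_DEBUGGING = false; // errors-only logging
```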

Server-side settings

| Setting name in Worker's code | Description | Required | Default value | Example |
|---|---|---|---|---|
| DATADOME_SERVER_SIDE_KEY | Your DataDome server-side key, found in your dashboard. | Yes | - | - |
| DATADOME_TIMEOUT | Request timeout to the DataDome API, in milliseconds. | No | 300 | 350 |
| DATADOME_URL_PATTERN_EXCLUSION | Regular expression to exclude URLs from the DataDome analysis. | No | List of excluded static assets below | - |
| DATADOME_URL_PATTERN_INCLUSION | Regular expression to only include matching URLs in the DataDome analyzed traffic. | No | - | /login*/i |
| DATADOME_IP_EXCLUSION | List of IPs whose traffic will be excluded from the DataDome analysis. | No | - | ["192.168.0.1", "192.168.0.2"] |
| DATADOME_LOGPUSH_CONFIGURATION | List of enriched header names to log inside Logpush. | No | - | ["X-DataDome-botname", "X-DataDome-captchapassed", "X-DataDome-isbot"] |
| DATADOME_ENABLE_GRAPHQL_SUPPORT | Extract the GraphQL operation name and type on requests to a /graphql endpoint to improve protection. | No | false | true |
| DATADOME_ENABLE_DEBUGGING | Log detailed information about the DataDome process in Workers logs. | No | false | true |

Default static assets exclusion regular expression:

/\.(avi|flv|mka|mkv|mov|mp4|mpeg|mpg|mp3|flac|ogg|ogm|opus|wav|webm|webp|bmp|gif|ico|jpeg|jpg|png|svg|svgz|swf|eot|otf|ttf|woff|woff2|css|less|js|map)$/i
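As a rough sketch of how the exclusion and inclusion patterns interact, consider the following hypothetical helper (shouldAnalyze is illustrative, not part of the DataDome script; the exclusion list is abridged):

```typescript
// Hypothetical helper showing how exclusion/inclusion regexes could decide
// whether a request URL goes through the DataDome analysis.
const URL_PATTERN_EXCLUSION =
  /\.(avi|mp4|mp3|gif|ico|jpeg|jpg|png|svg|woff|woff2|css|js|map)$/i; // abridged default list
const URL_PATTERN_INCLUSION: RegExp | null = null; // null means "analyze everything not excluded"

function shouldAnalyze(url: string): boolean {
  const path = new URL(url).pathname;
  if (URL_PATTERN_EXCLUSION.test(path)) return false; // static asset: skip analysis
  if (URL_PATTERN_INCLUSION && !URL_PATTERN_INCLUSION.test(path)) return false; // not in scope
  return true;
}
```

Under these assumptions, shouldAnalyze("https://example.org/login") returns true, while shouldAnalyze("https://example.org/styles/app.css") returns false.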

Client-side settings

| Setting name in Worker's code | Description | Required | Default value | Example |
|---|---|---|---|---|
| DATADOME_CLIENT_SIDE_KEY | Your DataDome client-side key, found in your dashboard. | Yes | - | - |
| DATADOME_JS_URL | URL of the DataDome JS Tag, which can be changed to include the tag as a first party. | No | https://js.datadome.co/tags.js | https://ddfoo.com/tags.js |
| DATADOME_JS_ENDPOINT | Endpoint of the DataDome JS Tag. | No | | |
| DATADOME_JS_TAG_OPTIONS | JSON object describing DataDome JS Tag options. | No | { "ajaxListenerPath": true } | { "ajaxListenerPath": "example.com", "allowHtmlContentTypeOnCaptcha": true } |
| DATADOME_JS_URL_REGEX_EXCLUSION | Regular expression to NOT set the DataDome JS Tag on matching URLs. | No | - | - |
| DATADOME_JS_URL_REGEX_INCLUSION | Regular expression to set the DataDome JS Tag on matching URLs. | No | - | /login*/i |
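To see how the client-side settings fit together, here is a hypothetical sketch of assembling the tag snippet that gets injected into HTML responses (buildTagSnippet and the global variable names are illustrative, not DataDome's actual implementation):

```typescript
// Hypothetical sketch: build a <script> snippet from the client-side settings.
// The key below is a placeholder, and the globals are assumptions for illustration.
const DATADOME_CLIENT_SIDE_KEY = "YOUR_CLIENT_SIDE_KEY"; // placeholder value
const DATADOME_JS_URL = "https://js.datadome.co/tags.js"; // default tag URL
const DATADOME_JS_TAG_OPTIONS = { ajaxListenerPath: true }; // default options

function buildTagSnippet(): string {
  const options = JSON.stringify(DATADOME_JS_TAG_OPTIONS);
  return (
    `<script>window.ddjskey="${DATADOME_CLIENT_SIDE_KEY}";window.ddoptions=${options};</script>` +
    `<script src="${DATADOME_JS_URL}"></script>`
  );
}
```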

Update with Wrangler

  1. Download the latest version of our Cloudflare Worker script.
  2. Note the values you may have changed from their defaults when configuring your DataDome Worker.
  3. Paste the content of the datadome.ts file into the file used as the main source for the script of your Worker project.
  4. Configure the settings with the values from step 2.
  5. Deploy with Wrangler:
wrangler deploy

Uninstallation with Wrangler

To delete the DataDome Worker, run from your Worker project:

wrangler delete

Logging

DataDome custom logging

  1. Inside your wrangler.toml file add:
    [observability.logs]
    enabled = true
    
  2. Deploy with
    wrangler deploy
    
  3. By default, DataDome logs errors only (such as configuration errors). If you want detailed logs for debugging, modify the DataDome Worker script and set DATADOME_ENABLE_DEBUGGING to true.

DataDome logs format

The DataDome custom logs have the following format:

{
  "step": "string",
  "result": "string", 
  "reason": "string",
  "details": {
    "key": "value"
  },
  "company": "DataDome",
  "line": 123
}
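Because every entry carries the fixed "company": "DataDome" field, DataDome entries are easy to separate from other Worker output. A hypothetical sketch (filterDataDomeLogs is illustrative, not part of the DataDome script):

```typescript
// Hypothetical sketch: pick DataDome entries out of mixed Worker log lines,
// e.g. lines captured from `wrangler tail`.
interface DataDomeLog {
  step: string;
  result: string;
  reason: string;
  details: Record<string, string>;
  company: string;
  line: number;
}

function filterDataDomeLogs(rawLines: string[]): DataDomeLog[] {
  const out: DataDomeLog[] = [];
  for (const raw of rawLines) {
    try {
      const parsed = JSON.parse(raw);
      if (parsed && parsed.company === "DataDome") out.push(parsed as DataDomeLog);
    } catch {
      // line is not JSON: skip it
    }
  }
  return out;
}
```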

Logpush

You can use Logpush to send logs to a destination supported by Logpush (Datadog, Splunk, S3 Bucket…).

❗️

Cloudflare plan

Logpush is available to customers on Cloudflare’s Enterprise plan.

Update the Worker’s script

  1. Fill the DATADOME_LOGPUSH_CONFIGURATION value with the names of the headers you want to log, as an array of strings.
    The possible values are available in the Enriched headers page.

Example:

DATADOME_LOGPUSH_CONFIGURATION = ["X-DataDome-botname", "X-DataDome-isbot", "x-datadomeresponse"]

Enable Logpush

  1. Inside your wrangler.toml file, add:
logpush = true
  2. Deploy with wrangler deploy.