Integrate via npm

Installation with npm

Protect your traffic

  1. Inside your existing Worker project directory, run one of the following commands:

    npm install @datadome/module-cloudflare-worker
    
    or, with Yarn:
    
    yarn add @datadome/module-cloudflare-worker
    
  2. Replace your existing src/index.ts (or main worker file) with:

    import { activateDataDome } from '@datadome/module-cloudflare-worker';
    
    // Your custom handler
    const myHandler: ExportedHandler<Env> = {
        async fetch(request: Request, env: Env, ctx: ExecutionContext): Promise<Response> {
            // Your custom logic here
            return new Response('Hello from my worker!', {
                headers: { 'content-type': 'text/plain' },
            });
        },
    };
    
    // The handler that wires up DataDome with env-based config
    const handler: ExportedHandler<Env> = {
        async fetch(request: Request, env: Env, ctx: ExecutionContext): Promise<Response> {
            // Pass env values to DataDome at runtime
            const dataDomeHandler = activateDataDome(myHandler, {
                serverSideKey: env.DATADOME_SERVER_SIDE_KEY,
                clientSideKey: env.DATADOME_CLIENT_SIDE_KEY,
                // ...other options
            });
    
            if (dataDomeHandler.fetch) {
                // Cast the request to the type expected by DataDome
                return dataDomeHandler.fetch(request as any, env, ctx);
            }
    
            return new Response('Handler not available', { status: 500 });
        },
    };
    
    // Export as the worker's default export
    export default handler;
    
  3. Create a Cloudflare Worker secret named DATADOME_SERVER_SIDE_KEY:

    wrangler secret put DATADOME_SERVER_SIDE_KEY
    
    A prompt appears. Enter the value of your DataDome server-side key, which you can find in your DataDome dashboard.

  4. Create a Cloudflare Worker secret named DATADOME_CLIENT_SIDE_KEY:

    wrangler secret put DATADOME_CLIENT_SIDE_KEY
    
    A prompt appears. Enter the value of your DataDome client-side key, which you can find in your DataDome dashboard.

  5. Deploy your updated Worker:

    wrangler deploy
    

Congrats! You can now see your traffic in your DataDome dashboard.
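Conceptually, activateDataDome wraps your handler: it returns a new handler that inspects each request before delegating to yours. The pattern can be sketched with a simplified stand-in (protect and blockPath below are illustrative, not the module's API):

```typescript
// Simplified stand-in for activateDataDome's wrapping pattern: `protect`
// returns a new handler that runs a check before delegating to the inner
// handler. `protect` and `blockPath` are illustrative, not the module's API.
type Handler = (request: Request) => Promise<Response>;

function protect(inner: Handler, options: { blockPath: string }): Handler {
  return async (request: Request): Promise<Response> => {
    const url = new URL(request.url);
    if (url.pathname === options.blockPath) {
      // In the real module, DataDome's verdict would be applied here.
      return new Response('Blocked', { status: 403 });
    }
    return inner(request);
  };
}

const myHandler: Handler = async () =>
  new Response('Hello from my worker!', {
    headers: { 'content-type': 'text/plain' },
  });

const wrapped = protect(myHandler, { blockPath: '/blocked' });
```

Requests to the blocked path receive a 403, everything else falls through to myHandler; the real module makes this decision from the DataDome API response instead of a fixed path.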

Configuration

The configuration is done inside the code, using constants.

Server-side settings

| Setting name | Description | Required | Default value | Example |
| --- | --- | --- | --- | --- |
| serverSideKey | Your DataDome server-side key, found in your dashboard. | Yes | - | - |
| timeout | Request timeout to the DataDome API, in milliseconds. | No | 300 | 350 |
| URLPatternExclusion | Regular expression to exclude URLs from the DataDome analysis. | No | List of excluded static assets below | - |
| URLPatternInclusion | Regular expression to only include URLs in the DataDome analyzed traffic. | No | - | /login*/i |
| IPExclusion | List of IPs whose traffic will be excluded from the DataDome analysis. | No | - | ["192.168.0.1", "192.168.0.2"] |
| logpushConfiguration | List of enriched header names to log inside Logpush. | No | - | ["X-DataDome-botname", "X-DataDome-captchapassed", "X-DataDome-isbot"] |
| enableGraphQLSupport | Extract the GraphQL operation name and type on requests to a /graphql endpoint to improve protection. | No | false | true |
| enableDebugging | Log detailed information about the DataDome process in Workers logs. | No | false | true |

Default URLPatternExclusion value:

    /\.(avi|flv|mka|mkv|mov|mp4|mpeg|mpg|mp3|flac|ogg|ogm|opus|wav|webm|webp|bmp|gif|ico|jpeg|jpg|png|svg|svgz|swf|eot|otf|ttf|woff|woff2|css|less|js|map)$/i
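A quick check of which paths the default exclusion pattern skips (the sample paths are illustrative):

```typescript
// Default URLPatternExclusion: requests whose path ends in one of these
// static-asset extensions are skipped by the DataDome analysis.
const defaultExclusion =
  /\.(avi|flv|mka|mkv|mov|mp4|mpeg|mpg|mp3|flac|ogg|ogm|opus|wav|webm|webp|bmp|gif|ico|jpeg|jpg|png|svg|svgz|swf|eot|otf|ttf|woff|woff2|css|less|js|map)$/i;

console.log(defaultExclusion.test('/assets/logo.PNG')); // true: excluded (case-insensitive)
console.log(defaultExclusion.test('/checkout.css'));    // true: excluded
console.log(defaultExclusion.test('/api/login'));       // false: analyzed
```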

Client-side settings

| Setting name | Description | Required | Default value | Example |
| --- | --- | --- | --- | --- |
| clientSideKey | Your DataDome client-side key, found in your dashboard. | Yes | - | - |
| jsURL | URL of the DataDome JS Tag, which can be changed to include the tag as a first party. | No | https://js.datadome.co/tags.js | https://ddfoo.com/tags.js |
| jsEndpoint | Endpoint of the DataDome JS Tag. | No | - | - |
| jsTagOptions | JSON object describing DataDome JS Tag options. | No | { "ajaxListenerPath": true } | { "ajaxListenerPath": "example.com", "allowHtmlContentTypeOnCaptcha": true } |
| jsURLRegexExclusion | Regular expression to NOT set the DataDome JS Tag on matching URLs. | No | - | - |
| jsURLRegexInclusion | Regular expression to set the DataDome JS Tag on matching URLs. | No | - | /login*/i |
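Taken together, the server- and client-side settings form the options object passed to activateDataDome. A sketch using setting names from the tables above (the interface below is an assumption for illustration, not the module's published typings; the keys are placeholders):

```typescript
// Assumed shape for illustration only; not the module's published typings.
interface DataDomeOptions {
  serverSideKey: string;
  clientSideKey: string;
  timeout?: number;
  URLPatternInclusion?: RegExp;
  IPExclusion?: string[];
  enableGraphQLSupport?: boolean;
  enableDebugging?: boolean;
}

const options: DataDomeOptions = {
  serverSideKey: 'YOUR_SERVER_SIDE_KEY', // placeholder; use env.DATADOME_SERVER_SIDE_KEY in practice
  clientSideKey: 'YOUR_CLIENT_SIDE_KEY', // placeholder; use env.DATADOME_CLIENT_SIDE_KEY in practice
  timeout: 300,                          // default shown in the table above
  URLPatternInclusion: /login*/i,
  IPExclusion: ['192.168.0.1', '192.168.0.2'],
  enableGraphQLSupport: false,
  enableDebugging: false,
};
```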

Update

  1. Run:

    npm update @datadome/module-cloudflare-worker
    
  2. Deploy with Wrangler:

    wrangler deploy

Uninstallation

  1. Remove this line from your worker file:
    import { activateDataDome } from '@datadome/module-cloudflare-worker'
    
  2. Replace this:
    const handler = activateDataDome(myHandler, options);
    
    With your original handler:
    const handler = myHandler;
    
  3. Delete these from your wrangler.toml:
    [vars]
    DATADOME_SERVER_SIDE_KEY = "..."
    DATADOME_CLIENT_SIDE_KEY = "..."
    
  4. Uninstall the package:
    npm uninstall @datadome/module-cloudflare-worker
    

Logging

DataDome custom logging

  1. Inside your wrangler.toml file add:
    [observability.logs]
    enabled = true
    
  2. Deploy with
    wrangler deploy
    
  3. By default, DataDome logs errors only (such as errors in the configuration). If you want detailed logs for debugging, you can modify the DataDome Worker script and set DATADOME_ENABLE_DEBUGGING to true.

DataDome logs format

The DataDome custom logs have the following format:

{
  "step": "string",
  "result": "string", 
  "reason": "string",
  "details": {
    "key": "value"
  },
  "company": "DataDome",
  "line": 123
}
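When post-processing exported Workers logs, entries in this shape can be picked out by their company field. A sketch, using the placeholder values from the format above:

```typescript
// Assumed TypeScript shape for the DataDome log format shown above.
interface DataDomeLogEntry {
  step: string;
  result: string;
  reason: string;
  details: Record<string, string>;
  company: string;
  line: number;
}

// Sample raw log line using the placeholder values from the format above.
const raw =
  '{"step":"string","result":"string","reason":"string","details":{"key":"value"},"company":"DataDome","line":123}';

const entry = JSON.parse(raw) as DataDomeLogEntry;
const isDataDomeEntry = entry.company === 'DataDome';
```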

Logpush

You can use Logpush to send logs to a destination supported by Logpush (Datadog, Splunk, S3 Bucket…).

❗️

Cloudflare plan

Logpush is available to customers on Cloudflare’s Enterprise plan.

Update the Worker’s script

  1. Fill the DATADOME_LOGPUSH_CONFIGURATION value with the names of the enriched headers you want to log, as an array of strings.
    The possible values are available in the Enriched headers page.

E.g.:

DATADOME_LOGPUSH_CONFIGURATION = '["X-DataDome-botname", "X-DataDome-isbot", "x-datadomeresponse"]'
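Since the value is a JSON array serialized into a string, it parses back into the list of header names to log (a sketch):

```typescript
// The Logpush configuration is a JSON array serialized as a string,
// so it can be parsed back into the list of header names to log.
const DATADOME_LOGPUSH_CONFIGURATION =
  '["X-DataDome-botname", "X-DataDome-isbot", "x-datadomeresponse"]';

const headerNames: string[] = JSON.parse(DATADOME_LOGPUSH_CONFIGURATION);
console.log(headerNames.length); // 3
```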

Enable Logpush

  1. Inside your wrangler.toml file, add:

    logpush = true
    
  2. Deploy with Wrangler:

    wrangler deploy