Integrate via npm
Installation with npm
Prerequisites
- DATADOME_SERVER_SIDE_KEY, available in your DataDome dashboard
- DATADOME_CLIENT_SIDE_KEY, available in your DataDome dashboard
- a Worker project
Protect your traffic
- Inside your existing Worker project directory, run:
npm install @datadome/module-cloudflare-worker
or, with Yarn:
yarn add @datadome/module-cloudflare-worker
- Replace your existing src/index.ts (or main worker file) with:
import { activateDataDome } from '@datadome/module-cloudflare-worker';

// Your custom handler
const myHandler: ExportedHandler<Env> = {
  async fetch(request: Request, env: Env, ctx: ExecutionContext): Promise<Response> {
    // Your custom logic here
    return new Response('Hello from my worker!', {
      headers: { 'content-type': 'text/plain' },
    });
  },
};

// The handler that wires up DataDome with env-based config
const handler: ExportedHandler<Env> = {
  async fetch(request: Request, env: Env, ctx: ExecutionContext): Promise<Response> {
    // Pass env values to DataDome at runtime
    const dataDomeHandler = activateDataDome(myHandler, {
      serverSideKey: env.DATADOME_SERVER_SIDE_KEY,
      clientSideKey: env.DATADOME_CLIENT_SIDE_KEY,
      // ...other options
    });
    if (dataDomeHandler.fetch) {
      // Cast the request to the type expected by DataDome
      return dataDomeHandler.fetch(request as any, env, ctx);
    }
    return new Response('Handler not available', { status: 500 });
  },
};

// Export as the worker's default export
export default handler;
- Create a Cloudflare Worker secret named DATADOME_SERVER_SIDE_KEY:
wrangler secret put DATADOME_SERVER_SIDE_KEY
A prompt appears. Enter the value of your DataDome server-side key, which you can find in your DataDome dashboard.
- Create a Cloudflare Worker secret named DATADOME_CLIENT_SIDE_KEY:
wrangler secret put DATADOME_CLIENT_SIDE_KEY
A prompt appears. Enter the value of your DataDome client-side key, which you can find in your DataDome dashboard.
- Deploy your updated Worker:
wrangler deploy
Congrats! You can now see your traffic in your DataDome dashboard.
Configuration
The configuration is done inside the code, using constants.
Server-side settings
Setting name | Description | Required | Default value | Example |
---|---|---|---|---|
serverSideKey | Your DataDome server-side key, found in your dashboard. | Yes | - | - |
timeout | Request timeout to DataDome API, in milliseconds. | No | 300 | 350 |
URLPatternExclusion | Regular expression to exclude URLs from the DataDome analysis. | No | List of excluded static assets below | - |
URLPatternInclusion | Regular expression to only include URLs in the DataDome analysed traffic. | No | - | /login*/i |
IPExclusion | List of IPs which traffic will be excluded from the DataDome analysis. | No | - | ["192.168.0.1", "192.168.0.2"] |
logpushConfiguration | List of Enriched headers names to log inside Logpush. | No | - | ["X-DataDome-botname", "X-DataDome-captchapassed", "X-DataDome-isbot"] |
enableGraphQLSupport | Extract GraphQL operation name and type on request to a /graphql endpoint to improve protection. | No | false | true |
enableDebugging | Log in Workers logs detailed information about the DataDome process. | No | false | true |
Default value of URLPatternExclusion (the list of excluded static assets):
/\.(avi|flv|mka|mkv|mov|mp4|mpeg|mpg|mp3|flac|ogg|ogm|opus|wav|webm|webp|bmp|gif|ico|jpeg|jpg|png|svg|svgz|swf|eot|otf|ttf|woff|woff2|css|less|js|map)$/i
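As a sketch, the server-side settings above can be collected into a single options object. The option names below are taken from the settings table; the exact types accepted by activateDataDome may differ, and the key value is a placeholder.

```typescript
// Illustrative sketch only: option names come from the server-side settings
// table above; the exact option shape accepted by the package may differ.
const serverSideOptions = {
  serverSideKey: "YOUR_DATADOME_SERVER_SIDE_KEY", // required, from your dashboard
  timeout: 350, // DataDome API timeout in milliseconds (default: 300)
  IPExclusion: ["192.168.0.1", "192.168.0.2"], // skip analysis for these IPs
  logpushConfiguration: ["X-DataDome-botname", "X-DataDome-isbot"], // headers to log
  enableGraphQLSupport: false, // extract GraphQL operation name/type on /graphql
  enableDebugging: false, // detailed DataDome process logs in Workers logs
};
```

Options you omit keep their defaults from the table above.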
Client-side settings
Setting name | Description | Required | Default value | Example |
---|---|---|---|---|
clientSideKey | Your DataDome client-side key, found in your dashboard. | Yes | - | - |
jsURL | URL of the DataDome JS tag that can be changed to include the tag as a first party. | No | https://js.datadome.co/tags.js | https://ddfoo.com/tags.js |
jsEndpoint | Endpoint of the DataDome JS Tag. | No | - | - |
jsTagOptions | JSON object describing DataDome JS Tag options. | No | { "ajaxListenerPath": true } | { "ajaxListenerPath": "example.com", "allowHtmlContentTypeOnCaptcha": true } |
jsURLRegexExclusion | Regular expression to NOT set the DataDome JS Tag on matching URLs. | No | - | - |
jsURLRegexInclusion | Regular expression to set the DataDome JS Tag on matching URLs. | No | - | /login*/i |
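Similarly, the client-side settings can be sketched as an options object. The option names below come from the client-side settings table; the jsURL and jsTagOptions values are the table's defaults, and the key value is a placeholder.

```typescript
// Illustrative sketch only: option names come from the client-side settings
// table above; the exact option shape accepted by the package may differ.
const clientSideOptions = {
  clientSideKey: "YOUR_DATADOME_CLIENT_SIDE_KEY", // required, from your dashboard
  jsURL: "https://js.datadome.co/tags.js", // default JS Tag URL (change for first-party)
  jsTagOptions: { ajaxListenerPath: true }, // default JS Tag options
  jsURLRegexInclusion: /login*/i, // only inject the JS Tag on matching URLs
};
```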
Update
- Run
npm update @datadome/module-cloudflare-worker
- Deploy with Wrangler:
wrangler deploy
Uninstallation
- Remove this line from your worker file:
import { activateDataDome } from '@datadome/module-cloudflare-worker'
- Replace this:
const handler = activateDataDome(myHandler, options);
with your original handler:
const handler = myHandler;
- Delete these from your wrangler.toml:
[vars]
DATADOME_SERVER_SIDE_KEY = "..."
DATADOME_CLIENT_SIDE_KEY = "..."
- Uninstall the package:
npm uninstall @datadome/module-cloudflare-worker
Logging
DataDome custom logging
- Inside your wrangler.toml file, add:
[observability.logs]
enabled = true
- Deploy with
wrangler deploy
- By default, DataDome logs errors only (such as configuration errors). If you want detailed logs for debugging, modify the DataDome Worker script and set DATADOME_ENABLE_DEBUGGING to true.
DataDome logs format
The DataDome custom logs have the following format:
{
"step": "string",
"result": "string",
"reason": "string",
"details": {
"key": "value"
},
"company": "DataDome",
"line": 123
}
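A log entry in this format can be consumed as plain JSON. The sketch below is illustrative: the interface mirrors the format shown above, and the field values in `raw` are made up for the example.

```typescript
// Minimal sketch of parsing a DataDome custom log entry in the format above.
// The concrete values in `raw` are hypothetical, for illustration only.
interface DataDomeLog {
  step: string;
  result: string;
  reason: string;
  details: Record<string, string>;
  company: string;
  line: number;
}

const raw =
  '{"step":"example-step","result":"example-result","reason":"example-reason",' +
  '"details":{"key":"value"},"company":"DataDome","line":123}';

const entry: DataDomeLog = JSON.parse(raw);

// Only treat entries emitted by DataDome as DataDome logs.
const isDataDomeLog = entry.company === "DataDome";
```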
Logpush
You can use Logpush to send logs to a destination supported by Logpush (Datadog, Splunk, S3 Bucket…).
Cloudflare plan
Logpush is available to customers on Cloudflare’s Enterprise plan.
Update the Worker’s script
- Fill the DATADOME_LOGPUSH_CONFIGURATION value with the names of the headers you want to log, as an array of strings.
The possible values are listed on the Enriched headers page.
For example:
DATADOME_LOGPUSH_CONFIGURATION = '["X-DataDome-botname", "X-DataDome-isbot", "x-datadomeresponse"]'
Enable Logpush
- Inside your wrangler.toml file, add:
logpush = true
- Deploy with wrangler.