Integrate via Terraform

Installation with Terraform

Prerequisites

Protect your traffic

  1. Download the latest version of our Cloudflare Worker script.
  2. Create an empty Terraform file, for example datadome_worker.tf, and paste the following code:
terraform {
  required_providers {
    cloudflare = {
      source = "cloudflare/cloudflare"
      version = "~> 4"
    }
  }
}

provider "cloudflare" {
  api_token = "<CLOUDFLARE_API_TOKEN>" # update value
}

variable "datadome_server_side_key" {}
variable "datadome_client_side_key" {}

resource "cloudflare_worker_route" "catch_all_route" {
  zone_id = "<CLOUDFLARE_ZONE_ID>" # update value
  pattern = "<CLOUDFLARE_ROUTE_PATTERN>"# update value https://developers.cloudflare.com/workers/configuration/routing/routes/
  script_name = cloudflare_worker_script.datadome_script.name
}

resource "cloudflare_worker_script" "datadome_worker" {
  account_id = "<CLOUDFLARE_ACCOUNT_ID>" # update value
  name       = "datadome_worker"
  content    = file("<PATH_TO_THE_DATADOME.TS_FILE")  # update value

  	  secret_text_binding {
	    name = "DATADOME_SERVER_SIDE_KEY"
	    text = var.datadome_server_side_key
	  }

  	  secret_text_binding {
	    name = "DATADOME_CLIENT_SIDE_KEY"
	    text = var.datadome_client_side_key
	  }

}
  3. Update it with your personal values.
  4. Create the secret for datadome_server_side_key to hold the value of your DataDome server-side key, which you can find in your DataDome dashboard:
export TF_VAR_datadome_server_side_key=<YOUR_DATADOME_SERVER_SIDE_KEY>
  5. Create the secret for datadome_client_side_key to hold the value of your DataDome client-side key, which you can find in your DataDome dashboard:
export TF_VAR_datadome_client_side_key=<YOUR_DATADOME_CLIENT_SIDE_KEY>
  6. Run
terraform init
  7. Run
terraform plan

Two resources will be created: the Worker script and the Worker route.

  8. Run
terraform apply

Congrats! You can now see your traffic in your DataDome dashboard.

Configuration

The configuration is done inside the script, using constants.

Server-side settings

| Setting name in Worker's code | Setting name in package | Description | Required | Default value | Example |
|---|---|---|---|---|---|
| DATADOME_SERVER_SIDE_KEY | serverSideKey | Your DataDome server-side key, found in your dashboard. | Yes | - | - |
| DATADOME_TIMEOUT | timeout | Request timeout to the DataDome API, in milliseconds. | No | 300 | 350 |
| DATADOME_URL_PATTERN_EXCLUSION | urlPatternExclusion | Regular expression to exclude URLs from the DataDome analysis. | No | List of excluded static assets below | - |
| DATADOME_URL_PATTERN_INCLUSION | urlPatternInclusion | Regular expression to only include URLs in the traffic analysed by DataDome. | No | - | /login*/i |
| DATADOME_IP_EXCLUSION | ipExclusion | List of IPs whose traffic will be excluded from the DataDome analysis. | No | - | ["192.168.0.1", "192.168.0.2"] |
| DATADOME_LOGPUSH_CONFIGURATION | logpushConfiguration | List of enriched header names to log inside Logpush. | No | - | ["X-DataDome-botname", "X-DataDome-captchapassed", "X-DataDome-isbot"] |
| DATADOME_ENABLE_GRAPHQL_SUPPORT | enableGraphQLSupport | Extract the GraphQL operation name and type on requests to a /graphql endpoint to improve protection. | No | false | true |
| DATADOME_ENABLE_DEBUGGING | enableDebugging | Log detailed information about the DataDome process in Workers Logs. | No | false | true |

Default list of excluded static assets:

/\.(avi|flv|mka|mkv|mov|mp4|mpeg|mpg|mp3|flac|ogg|ogm|opus|wav|webm|webp|bmp|gif|ico|jpeg|jpg|png|svg|svgz|swf|eot|otf|ttf|woff|woff2|css|less|js|map)$/i
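
For reference, the server-side constants above could be adjusted directly in datadome.ts along these lines (a minimal sketch: the constant names come from the table, but the values and the exact declaration style are illustrative assumptions, not the shipped defaults):

// Illustrative excerpt: server-side configuration constants in datadome.ts (values are examples only)
const DATADOME_TIMEOUT = 300;                               // timeout for calls to the DataDome API, in milliseconds
const DATADOME_URL_PATTERN_INCLUSION = /login/i;            // only analyse URLs matching this pattern
const DATADOME_IP_EXCLUSION = ["192.168.0.1", "10.0.0.1"];  // client IPs excluded from the analysis
const DATADOME_ENABLE_GRAPHQL_SUPPORT = true;               // extract GraphQL operation name and type on /graphql requests
const DATADOME_ENABLE_DEBUGGING = false;                    // verbose logging in Workers Logs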

Client-side settings

| Setting name in Worker's code | Setting name in package | Description | Required | Default value | Example |
|---|---|---|---|---|---|
| DATADOME_CLIENT_SIDE_KEY | clientSideKey | Your DataDome client-side key, found in your dashboard. | Yes | - | - |
| DATADOME_JS_URL | jsUrl | URL of the DataDome JS Tag that can be changed to include the tag as a first party. | No | https://js.datadome.co/tags.js | https://ddfoo.com/tags.js |
| DATADOME_JS_ENDPOINT | jsEndpoint | Endpoint of the DataDome JS Tag. | No |  |  |
| DATADOME_JS_TAG_OPTIONS | jsTagOptions | JSON object describing DataDome JS Tag options. | No | { "ajaxListenerPath": true } | { "ajaxListenerPath": "example.com", "allowHtmlContentTypeOnCaptcha": true } |
| DATADOME_JS_URL_REGEX_EXCLUSION | jsUrlRegexExclusion | Regular expression to NOT set the DataDome JS Tag on matching URLs. | No | - | - |
| DATADOME_JS_URL_REGEX_INCLUSION | jsUrlRegexInclusion | Regular expression to set the DataDome JS Tag on matching URLs. | No | - | /login*/i |
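
Similarly, the client-side constants might look like this inside datadome.ts (again a sketch: the names follow the table, while the values and types shown are assumptions for illustration):

// Illustrative excerpt: client-side configuration constants in datadome.ts (values are examples only)
const DATADOME_JS_URL = "https://js.datadome.co/tags.js";        // point to your own domain to serve the JS Tag as a first party
const DATADOME_JS_TAG_OPTIONS = '{ "ajaxListenerPath": true }';  // JSON options passed to the JS Tag
const DATADOME_JS_URL_REGEX_INCLUSION = /login/i;                // only inject the JS Tag on matching URLs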

Update with Terraform

  1. Download the latest version of our Cloudflare Worker script.
  2. Paste the content of the new datadome.ts file into the file used as the content of your Worker script.
  3. Run
terraform plan

One resource will be changed: the Worker script.

  4. Run
terraform apply

Uninstallation with Terraform

To delete the DataDome Worker and its script, run the following command from the directory containing your datadome_worker.tf and terraform.tfstate files:

terraform destroy -target cloudflare_worker_script.datadome_worker

Logging

DataDome custom logging

  1. Inside the Cloudflare Dashboard, go to the DataDome Worker's page.
  2. Click on Settings, then go to the Observability section.
  3. Click on the pen icon next to Workers Logs.
  4. Enable logs.
  5. Click on Deploy.
  6. You will see the logs inside the Logs tab.
    By default, DataDome logs errors only (such as errors in the configuration). If you want detailed logs for debugging, set DATADOME_ENABLE_DEBUGGING to true.

DataDome logs format

The DataDome custom logs have the following format:

{
  "step": "string",
  "result": "string", 
  "reason": "string",
  "details": {
    "key": "value"
  },
  "company": "DataDome",
  "line": 123
}
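
If you consume these entries downstream (for example after a Logpush export), their shape can be described roughly with the following TypeScript interface (field types are inferred from the example above and are assumptions, not a published schema):

// Assumed shape of a DataDome custom log entry, inferred from the example above
interface DataDomeWorkerLog {
  step: string;                       // stage of the DataDome process that emitted the entry
  result: string;                     // outcome of that stage
  reason: string;                     // explanation for the result
  details?: Record<string, string>;   // extra key/value context (assumed optional)
  company: "DataDome";                // constant marker, useful to filter DataDome entries
  line: number;                       // line of the Worker script that produced the log
}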

Logpush

You can use Logpush to send logs to a destination supported by Logpush (Datadog, Splunk, S3 Bucket…).

❗️ Cloudflare plan

Logpush is available to customers on Cloudflare’s Enterprise plan.

Update the Worker’s script

  1. Fill the DATADOME_LOGPUSH_CONFIGURATION value with the names of the headers you want to log, as an array of strings.
    The possible values are available on the Enriched headers page.

For example:

DATADOME_LOGPUSH_CONFIGURATION = ["X-DataDome-botname", "X-DataDome-isbot", "x-datadomeresponse"]

Enable Logpush

  1. Inside the Cloudflare Dashboard, go to the DataDome Worker's page.
  2. Click on Settings, then go to the Observability section.
  3. Click on Enable next to Logpush.