Rule Gallery

Explore a curated collection of example configurations, ranging from common to unconventional use cases for the Traffic Policy module.

Deny non-GET requests

This rule denies all inbound traffic that is not a GET request.

# snippet
---
inbound:
  - expressions:
      - "req.method != 'GET'"
    actions:
      - type: "deny"
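
To sanity-check the rule, send requests with different methods and compare the responses. Below is a minimal sketch using Python's requests library; the URL https://example.ngrok.app is a hypothetical endpoint with this policy attached, and the exact status code returned by the deny action may vary with your configuration.

import requests

# Hypothetical endpoint with the deny-non-GET policy attached.
URL = "https://example.ngrok.app/"

# A GET request passes through to the upstream service.
print("GET :", requests.get(URL).status_code)

# Any other method trips the expression and is denied.
print("POST:", requests.post(URL).status_code)
print("PUT :", requests.put(URL).status_code)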

Custom response for unauthorized requests

This rule sends a custom response with status code 401 and the body "Unauthorized" for requests that lack an Authorization header.

# snippet
---
inbound:
  - expressions:
      - "!('authorization' in req.headers)"
    actions:
      - type: "custom-response"
        config:
          status_code: 401
          content: "Unauthorized"
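
You can verify the behavior by sending the same request with and without the header. A minimal sketch, again assuming a hypothetical https://example.ngrok.app endpoint with this policy attached:

import requests

URL = "https://example.ngrok.app/"  # hypothetical endpoint

# Without an Authorization header the rule answers 401 "Unauthorized".
r = requests.get(URL)
print(r.status_code, repr(r.text))

# With the header present the expression is false, so the request
# passes through to the upstream service instead.
r = requests.get(URL, headers={"Authorization": "Bearer test-token"})
print(r.status_code)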

Rate limiting for a specific endpoint

This rule applies rate limiting of 30 requests per minute, keyed by client IP, to the endpoint /api/specific_endpoint.

# snippet
---
inbound:
  - expressions:
      - "req.url.contains('/api/specific_endpoint')"
    actions:
      - type: "rate-limit"
        config:
          name: "Only allow 30 requests per minute"
          algorithm: "sliding_window"
          capacity: 30
          rate: "60s"
          bucket_key:
            - "conn.client_ip"

User agent filtering

We deliver tailored content to Microsoft Edge users by examining the User-Agent header for the case-insensitive string edg/ followed by one or more digits, i.e. the regular expression (?i)edg/\d. To see how this works in practice, explore the following regex101 demonstration.

To decode correctly from YAML or JSON, the \d sequence must be escaped. If your YAML string is unquoted, a single escape suffices: \\d. If the string is quoted, in either YAML or JSON, it must be double-escaped: \\\\d.

# snippet
---
inbound:
  - expressions:
      - "'user-agent' in req.headers"
      - "size(req.headers['user-agent'].filter(x, x.matches('(?i).*Edg/\\\\d+.*'))) > 0"
    actions:
      - type: "custom-response"
        config:
          status_code: 200
          content: "Hello Edge User!"
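
To convince yourself that the escaping works out, you can decode a double-quoted YAML string and apply the resulting pattern yourself. The sketch below uses Python's yaml and re modules; note that CEL's matches() uses RE2 syntax, which agrees with Python's re for this simple pattern.

import re
import yaml

# A double-quoted YAML string: each \\ in the source decodes to one backslash.
doc = r'expr: "(?i).*Edg/\\\\d+.*"'
decoded = yaml.safe_load(doc)["expr"]
print(decoded)  # (?i).*Edg/\\d+.*

# CEL string literals apply one more level of unescaping, so
# matches() ultimately sees \d. Simulated here with a replace:
pattern = decoded.replace("\\\\", "\\")
print(pattern)  # (?i).*Edg/\d+.*

# The pattern matches an Edge User-Agent but not Chrome's.
edge = "Mozilla/5.0 ... Chrome/120.0.0.0 Safari/537.36 Edg/120.0.2210.91"
chrome = "Mozilla/5.0 ... Chrome/120.0.0.0 Safari/537.36"
print(bool(re.fullmatch(pattern, edge)))    # True
print(bool(re.fullmatch(pattern, chrome)))  # False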

Add custom response for robots.txt

This rule returns a custom response for robots.txt that disallows search engine crawlers on all paths. What is robots.txt?

# snippet
---
inbound:
  - expressions:
      - "req.url.contains('/robots.txt')"
    actions:
      - type: "custom-response"
        config:
          status_code: 200
          content: "User-agent: *\r\nDisallow: /"
          headers:
            content-type: "text/plain"
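
As with the rules above, the result is easy to check by requesting the path directly, sketched here against a hypothetical endpoint:

import requests

# Hypothetical endpoint with the robots.txt policy attached.
r = requests.get("https://example.ngrok.app/robots.txt")
print(r.status_code)                  # 200
print(r.headers.get("content-type"))  # text/plain
print(r.text)                         # User-agent: * / Disallow: /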