AdAuth
AdAuth is an advertising-platform crawler operated by Adauth. It fetches landing pages to verify quality, policy compliance, and creative assets for the ads being served on the platform.
Blocking it almost always breaks ad serving on the affected URLs. If you advertise on this network, you need to allow this crawler.
If you are not running ads on this platform, the traffic should be modest. If it is heavy, check whether someone else is running ads pointed at your URLs.
See AdAuth on your own site
Match the User-Agent header on incoming requests against the crawler's product token, as in the sketch below.
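A minimal check, assuming the token "AdAuth" appears verbatim in the User-Agent string; Adauth's published pattern may be broader, so treat this as a sketch rather than the canonical regex:

```python
import re

# Assumed pattern: the product token "AdAuth" somewhere in the UA string.
# Substitute Adauth's documented pattern if it differs.
ADAUTH_UA = re.compile(r"AdAuth", re.IGNORECASE)

def claims_to_be_adauth(user_agent: str) -> bool:
    """True if the request claims to be AdAuth. UA strings can be spoofed."""
    return bool(ADAUTH_UA.search(user_agent or ""))
```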
For higher confidence you would normally also verify the source IP against the operator's published ranges: UA strings can be spoofed, while IP ownership is harder to fake. Adauth has not published ranges yet (see the FAQ below), so for now the UA match is the only signal.
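When ranges do appear, the check itself is short. A sketch using Python's ipaddress module, with the reserved documentation block 192.0.2.0/24 standing in for the real CIDRs, since none are published:

```python
import ipaddress

# Placeholder range only: Adauth publishes no CIDRs today.
# Swap in the real ones if and when they appear.
ADAUTH_RANGES = [ipaddress.ip_network("192.0.2.0/24")]

def ip_in_published_ranges(addr: str) -> bool:
    """True if the source IP falls inside the operator's published ranges."""
    ip = ipaddress.ip_address(addr)
    return any(ip in net for net in ADAUTH_RANGES)
```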
Renders JavaScript: No
IP verification: User-Agent only
Crawl frequency: Variable
Honors robots.txt: Yes
Honors Crawl-delay: Varies
Should I let AdAuth through?
In most cases, yes. It is required for ad serving and brand safety, and blocking it causes fill-rate and revenue drops. If the volume gets noisy, rate-limit it before you block it outright.
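One way to do that rate limiting, sketched as a per-bot token bucket in application code; in practice you would set the equivalent rule at your CDN or edge, and the 2 req/s figure is an arbitrary example, not a recommendation from Adauth:

```python
import time

class TokenBucket:
    """Allow `rate` requests per second, with bursts up to `burst`."""

    def __init__(self, rate: float, burst: float):
        self.rate, self.burst = rate, burst
        self.tokens, self.last = burst, time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at the burst size.
        self.tokens = min(self.burst, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False  # over the limit: answer 429 rather than blocking outright

# Example: cap AdAuth at 2 requests/second with a burst of 10.
adauth_bucket = TokenBucket(rate=2.0, burst=10.0)
```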
Does blocking AdAuth affect my Google rankings?
No. AdAuth is not a search-engine crawler. Your ranking on Google or Bing is unaffected by what you do here.
How do I confirm a request is really from AdAuth?
Look at the User-Agent header in your access logs and match it against the strings listed above. Keep in mind that the User-Agent is easy to fake, so this check tells you "the traffic claims to be AdAuth", not "the traffic is genuinely AdAuth". If you need stronger guarantees, fall back to a reverse-DNS check or wait for Adauth to publish IP ranges.
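A reverse-DNS check would look like the sketch below. Adauth has not documented a hostname suffix, so the one here is a pure placeholder; the forward-confirmation step is what makes the check hard to spoof:

```python
import socket

# Placeholder suffix: Adauth documents none. Replace if one is published.
EXPECTED_SUFFIX = ".adauth.example"

def verified_by_rdns(addr: str) -> bool:
    """Forward-confirmed reverse DNS: PTR matches the suffix AND resolves back."""
    try:
        host = socket.gethostbyaddr(addr)[0]
        if not host.endswith(EXPECTED_SUFFIX):
            return False
        # Forward-confirm: the claimed hostname must resolve back to this IP.
        return addr in {info[4][0] for info in socket.getaddrinfo(host, None)}
    except (socket.herror, socket.gaierror):
        return False
```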
What's the best way to understand what AdAuth is doing on my site?
Look at which URLs it hits, how often, and what time of day. The request pattern usually tells you whether it's building an index, watching for a specific change, or trying to pull data in bulk. The User-Agent name alone rarely tells the full story.
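A quick way to pull that picture out of a standard access log. The log path and the combined log format are assumptions about your setup, and the "AdAuth" token is the same assumption as above:

```python
import re
from collections import Counter

# Matches Apache/Nginx "combined" format: date, hour, request path, and UA.
LINE = re.compile(r'\[(\d+/\w+/\d+):(\d+):.*?"(?:GET|POST|HEAD) (\S+).*"([^"]*)"$')

urls, hours = Counter(), Counter()
with open("access.log") as f:  # hypothetical log path
    for line in f:
        m = LINE.search(line)
        if m and "AdAuth" in m.group(4):
            urls[m.group(3)] += 1   # which URLs it hits
            hours[m.group(2)] += 1  # what time of day

print(urls.most_common(10))   # hottest URLs
print(sorted(hours.items()))  # requests per hour of day
```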
What's the cleanest way to control AdAuth?
Two layers: robots.txt for the polite crawlers that read it, and rules at your CDN or edge for the ones that don't. Rankly's Agent Experience handles both from a single config, so you can allow, block, rate-limit, or serve a stripped-down version per bot. Agent Analytics handles the observation half, so you know which bots are actually worth a rule.
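The robots.txt half, assuming AdAuth matches rules on its product token (Adauth has not confirmed the token) and remembering that its Crawl-delay support varies per the table above:

```
User-agent: AdAuth
Allow: /
Crawl-delay: 5
```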