Snowflake-Cortex-Bot
Snowflake-Cortex-Bot indexes web pages for an AI-powered search product operated by Snowflake. Unlike a pure training crawler, an AI search crawler is designed to drive users back to the original source via citations and links.
The crawl pattern looks similar to a traditional search engine: regular, broad, and bounded by your robots.txt directives. The difference is that ranking is done by an LLM, not a classic ranking algorithm.
Allowing Snowflake-Cortex-Bot is generally how your site stays discoverable inside AI answer engines. The traffic it sends back is small but high-intent: users who clicked a citation usually wanted exactly what you wrote.
See Snowflake-Cortex-Bot on your own site
Match the User-Agent header on incoming requests against the pattern below.
```regex
Snowflake-Cortex-Bot
```
For higher confidence you would normally also verify the source IP against the operator's published ranges, since UA strings can be spoofed and IP ownership is harder to fake. Snowflake has not published IP ranges for this bot, however, so User-Agent matching is currently the only check available.
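As a minimal sketch of the match, assuming the bot's name appears as a plain substring of the User-Agent (confirm the exact token against your own access logs):

```python
import re

# Assumed pattern: the bot's name appears as a substring of the UA.
# Verify the exact token against your own access logs.
CORTEX_BOT_RE = re.compile(r"Snowflake-Cortex-Bot", re.IGNORECASE)

def is_cortex_bot(user_agent: str) -> bool:
    """Return True if the User-Agent claims to be Snowflake-Cortex-Bot."""
    return bool(CORTEX_BOT_RE.search(user_agent))

print(is_cortex_bot("Mozilla/5.0 (compatible; Snowflake-Cortex-Bot/1.0)"))  # True
print(is_cortex_bot("Mozilla/5.0 (Windows NT 10.0)"))                       # False
```

Because the check is a substring match, it catches version-suffixed variants like `Snowflake-Cortex-Bot/1.0` without hard-coding the version.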
Renders JavaScript: No
IP verification: User-Agent only
Crawl frequency: Continuous
Honors robots.txt: Yes
Honors Crawl-delay: Yes
Should I let Snowflake-Cortex-Bot through?
In most cases, yes. AI search crawlers cite and link back, so allowing it is how your content becomes discoverable inside AI answers. If the volume gets noisy, rate-limit it before you block it outright.
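Since the bot honors both robots.txt and Crawl-delay, slowing it down can be as simple as a directive like the one below (the user-agent token is assumed from the bot's name; confirm it in your logs):

```txt
User-agent: Snowflake-Cortex-Bot
Crawl-delay: 10
```

This asks the crawler to wait ten seconds between requests, which tames volume without removing you from citations.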
Does blocking Snowflake-Cortex-Bot affect my Google rankings?
No. Snowflake-Cortex-Bot feeds Snowflake's AI answer engine, which is a separate distribution channel from classical search. Blocking it removes you from citations inside Snowflake's product, but Google and Bing keep ranking you the same.
How do I confirm a request is really from Snowflake-Cortex-Bot?
Look at the User-Agent header in your access logs and match it against the strings listed above. Bear in mind that the User-Agent is easy to fake, so this check tells you "the traffic claims to be Snowflake-Cortex-Bot", not "the traffic is genuinely Snowflake-Cortex-Bot". If you need stronger guarantees, perform a reverse-DNS check or wait for Snowflake to publish IP ranges.
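A quick way to measure how much traffic claims this identity is to scan your access logs for the UA token. A sketch, assuming the common "combined" log format and a UA substring taken from the bot's name:

```python
import re
from collections import Counter

BOT_TOKEN = "Snowflake-Cortex-Bot"  # assumed UA substring; confirm in your logs

def count_claimed_hits(log_lines):
    """Count requests per path whose User-Agent claims to be the bot."""
    hits = Counter()
    # Combined log format: ... "METHOD /path HTTP/x" status size "referer" "user-agent"
    line_re = re.compile(r'"[A-Z]+ (?P<path>\S+) [^"]*" \d+ \S+ "[^"]*" "(?P<ua>[^"]*)"')
    for line in log_lines:
        m = line_re.search(line)
        if m and BOT_TOKEN in m.group("ua"):
            hits[m.group("path")] += 1
    return hits

sample = [
    '1.2.3.4 - - [01/Jan/2025:00:00:00 +0000] "GET /docs HTTP/1.1" 200 512 "-" "Snowflake-Cortex-Bot/1.0"',
    '5.6.7.8 - - [01/Jan/2025:00:00:01 +0000] "GET /docs HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
]
print(count_claimed_hits(sample))  # Counter({'/docs': 1})
```

Per-path counts make it easy to see whether the crawler is hammering a handful of heavy pages, which is the signal that a Crawl-delay or rate limit is worth setting.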
How is Snowflake-Cortex-Bot different from Googlebot?
Both crawl the web, but they feed completely different surfaces. Googlebot powers Google Search, where you compete for ten blue links. Snowflake-Cortex-Bot powers Snowflake's AI answer engine, where you compete for one of a handful of citations in a written-out paragraph. The crawl mechanics are similar, the consumption pattern is not.
What's the cleanest way to control Snowflake-Cortex-Bot?
Two layers. Robots.txt for the polite crawlers that read it, and rules at your CDN or edge for the ones that don't. Rankly's Agent Experience handles both from a single config, so you can allow, block, rate-limit, or serve a stripped-down version per bot. Agent Analytics handles the observation half so you know which bots are actually worth a rule.