Your ads.txt file must be readable programmatically by our exchange partners' crawlers in order to be linked to their demand. Below are common reasons your ads.txt file may not be readable programmatically, along with suggested fixes:
- Security Software is Blocking Bots:
If you use security software to protect your site from bots, it may block crawlers for a variety of reasons. Check your security logs to find the reason(s) for the block, and add an exception for the ads.txt page so that demand crawlers can read it.
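To confirm that bot-style requests can reach the file, you can fetch it the way a crawler would. The sketch below is a minimal check in Python using only the standard library; the domain and the Googlebot-style User-Agent are examples only, and real demand crawlers may identify themselves differently:

    import urllib.request

    # Hypothetical URL; replace with your own domain.
    URL = "https://example.com/ads.txt"
    # A Googlebot-style User-Agent, used here only to simulate a bot request.
    HEADERS = {
        "User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
    }

    request = urllib.request.Request(URL, headers=HEADERS)
    try:
        with urllib.request.urlopen(request, timeout=10) as response:
            print("Status:", response.status)
            print(response.read().decode("utf-8", errors="replace")[:200])
    except Exception as error:
        print("Blocked or unreachable:", error)

If this request fails while a normal browser request succeeds, your security software is most likely blocking bot-style traffic.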
- Cloudflare Firewall Security Settings are too High:
If you are using Cloudflare, your security settings could be blocking programmatic scanning of your ads.txt page. To remedy this, go to your Cloudflare firewall settings and lower the security level for the ads.txt page (for example, to Medium).
Additional information and fixes for Cloudflare security settings can be found here.
That link covers both the previous and new WAF managed firewall rules, as well as Cloudflare bot solutions.
- Robots.txt disallows crawling of your ads.txt page:
Make sure that your robots.txt file does NOT disallow crawling of your ads.txt page AND does NOT disallow the user agent of a crawler:
Example: Crawling disallowed on the ads.txt file
- An ads.txt file is uploaded at domain.com/ads.txt.
- These lines are included in domain.com/robots.txt:
    User-agent: *
    Disallow: /ads
- The ads.txt file will be ignored by crawlers that respect the robots.txt protocol.
- The robots.txt file can be modified to allow crawling of the file (other methods are possible):
- Example 1: Modify the disallowed path.
    User-agent: *
    Disallow: /ads/
- Example 2: Explicitly allow ads.txt; this depends on crawler support for the Allow robots.txt directive.
    User-agent: *
    Allow: /ads.txt
    Disallow: /ads
Example: Crawling disallowed for User Agent(s)
- An ads.txt file is uploaded at domain.com/ads.txt.
- These lines are included in domain.com/robots.txt:
    User-agent: Googlebot
    Disallow: /
- This ads.txt file will be ignored by the Google crawler ("Googlebot").
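If you are not sure whether your current robots.txt rules block a given crawler from reading ads.txt, you can test them with Python's standard-library robots.txt parser. This is a minimal sketch; domain.com is a placeholder and Googlebot is used purely as an example user agent:

    import urllib.robotparser

    # Placeholder domain; replace with your own.
    parser = urllib.robotparser.RobotFileParser("https://domain.com/robots.txt")
    parser.read()  # fetch and parse the live robots.txt

    # can_fetch() returns True when the user agent may crawl the given URL.
    for user_agent in ("*", "Googlebot"):
        allowed = parser.can_fetch(user_agent, "https://domain.com/ads.txt")
        print(f"{user_agent}: {'allowed' if allowed else 'blocked'}")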
- Ads.txt file must be available on the root domain:
If your file is located at www.domain.com/ads.txt, it will only be crawled if the root domain (domain.com/ads.txt) redirects to it.
- Make sure your ads.txt file is available over both HTTP and HTTPS:
Crawlers attempt both protocols, so make sure your file is available in both places.
- HTTP 200 Status Code OK:
A successful request for the ads.txt file should return an HTTP 200 (OK) status code. If it returns any other status code, the response will be ignored or the file will be treated as non-existent.
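To check the root-domain, HTTP/HTTPS, and status-code requirements above in one pass, you can request each URL variant and inspect where it ends up after redirects. A minimal sketch, with domain.com as a placeholder:

    import urllib.request

    # Placeholder domain; replace with your own root domain.
    URLS = [
        "http://domain.com/ads.txt",
        "https://domain.com/ads.txt",
        "http://www.domain.com/ads.txt",
        "https://www.domain.com/ads.txt",
    ]

    for url in URLS:
        try:
            # urlopen() follows redirects; geturl() shows where the request ended up,
            # and the status should be 200 for the file to be accepted.
            with urllib.request.urlopen(url, timeout=10) as response:
                print(f"{url} -> {response.geturl()} ({response.status})")
        except Exception as error:
            print(f"{url} -> error: {error}")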
- Formatting is Incorrect:
If you use a text editor to compile your list, make sure it saves your ads.txt file with Unix (LF, Line Feed) line endings. BBEdit for Mac and current versions of Windows Notepad both support Unix/Linux line endings (LF).
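If you are not sure which line endings your file was saved with, you can count them directly. A minimal sketch that inspects a local copy of the file (saved as ads.txt in the current directory):

    # Count line-ending styles in a local ads.txt file.
    with open("ads.txt", "rb") as handle:
        data = handle.read()

    crlf = data.count(b"\r\n")          # Windows-style (CRLF) line breaks
    lf_only = data.count(b"\n") - crlf  # Unix-style (LF) line breaks
    print(f"CRLF (Windows) line breaks: {crlf}")
    print(f"LF (Unix) line breaks: {lf_only}")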
Additional troubleshooting information for fixing these errors is available from Google AdSense here.