2.1K members online now
For questions related to Google Shopping and Merchant Center. Learn to optimize your Shopping ads

Feed items disapproved due to crawling restrictions

Visitor ✭ ✭ ✭
# 1

Hi all,


We have an issue with our Google Shopping campaigns: the majority of items are disapproved because robots.txt blocks Googlebot. Two other bots are allowed: Mediapartners and AdsBot. The reason Googlebot is disallowed in the robots.txt is that the client is a big retailer with 100k+ pages, depending on what product the end users customise on the website. Is there any way to let Merchant Center access these "customise" URLs in the feed without them being crawled by Googlebot?
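For reference, a robots.txt in roughly the shape described above might look like this (a sketch only; the real file and paths will differ):

```
# Keep the organic crawler off the site entirely
User-agent: Googlebot
Disallow: /

# Allow the AdSense crawler
User-agent: Mediapartners-Google
Allow: /

# Allow the ads landing-page crawler
User-agent: AdsBot-Google
Allow: /
```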



1 Expert reply

Re: Feed items disapproved due to crawling restrictions

Top Contributor
# 2
Hi Daniel
Allowing AdsBot-Google should work just fine.
You can find a list of the user agents here:

Can you share your Robots.txt entry for AdsBot-Google?
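For comparison, an entry that explicitly allows AdsBot-Google while keeping Googlebot blocked would look something like the following. Note that AdsBot-Google ignores rules under the global `User-agent: *` group, so it should be named explicitly:

```
User-agent: AdsBot-Google
Allow: /

User-agent: Googlebot
Disallow: /
```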

On a separate note, you can use Webmaster Tools (Search Console) to tell Google's organic crawler not to crawl URLs with a certain parameter on them (which, from your description, is likely how your site should be handling this).
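Alternatively, a robots.txt wildcard rule can keep Googlebot off just the parameterised URLs while leaving the rest of the site crawlable (the parameter name here is a placeholder for illustration):

```
User-agent: Googlebot
# Block only URLs carrying the customisation parameter
Disallow: /*?customise=
```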

Finally, if all else fails, you don't have to have just one robots.txt file. You can have multiple versions, one per directory: put your feed in a separate directory and allow access to it using a directory-level robots.txt.