robots.txt and google crawl rate
Googlebot is absolutely hammering our site and I don't know how to stop it. I have a robots.txt at off-grid-europe.com/robots.txt — what is wrong with it? How do I stop this?
How do I get Google to index just the URLs in my sitemap, rather than trying to crawl every link it can find, without blocking my merchant feed from being crawled?
As you can see, I allowed Google's bots to index my site:
User-agent: Googlebot-Image
Disallow:

User-agent: Googlebot
Disallow:
I also reduced the crawl rate to one request every 30 seconds in Webmaster Tools, which helped for a while, but now the problem is back. And I list a sitemap in the robots.txt, which is really all I want robots to index, period.
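Here is roughly what I was thinking of trying instead, though I'm not sure it's right — a sketch only. Both paths below are placeholders, since I'd still have to work out which URLs Googlebot is actually getting lost in and where our real feed lives:

User-agent: Googlebot
Disallow: /search          # placeholder for whatever paths spawn the endless URLs
Allow: /merchant-feed.xml  # placeholder for the real feed path, so it stays crawlable

Sitemap: https://off-grid-europe.com/sitemap.xml

As far as I understand, though, the Sitemap line only tells Google where the sitemap is; it doesn't restrict crawling to those URLs, which is why I'm stuck.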
This problem didn't always happen on the site; it began in August. Any idea why?
Re: robots.txt and google crawl rate
I think you'd get a better answer over on the Webmaster Central forum. This is the AdWords forum, so the folks over there will be able to answer your question more quickly and thoroughly: https://productforums.google.com/forum/#!forum/webmasters
Hope that helps.