Googlebot crawl issue
Googlebots are absolutely hammering our site and I don't know how to stop it. My robots.txt is at off-grid-europe.com/robots.txt; what is wrong with it? How do I stop this?
How do I get Google to just crawl the site normally, rather than trying to find every possible link, without blocking my merchant feed from being crawled?
This is the advice given by Google to avoid turning the bots away:
User-agent: Googlebot-Image
Disallow:

User-agent: Googlebot
Disallow:
I also reduced the crawl rate to 30 seconds in Webmaster Tools, which helped a bit, but now the problem is back. And I reference a sitemap in the robots.txt, and that sitemap really is all I want robots to index, period.
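For clarity, here is a minimal sketch of the combined robots.txt I am aiming for. The sitemap URL is a placeholder (my real path may differ), and an empty Disallow means nothing is blocked, so the merchant feed stays crawlable:

```
# Allow Googlebot and Googlebot-Image to crawl everything
# (an empty Disallow blocks nothing, so the merchant feed is not blocked)
User-agent: Googlebot-Image
Disallow:

User-agent: Googlebot
Disallow:

# Tell crawlers where the sitemap lives (placeholder URL)
Sitemap: https://off-grid-europe.com/sitemap.xml
```

As I understand it, though, the Sitemap line only tells crawlers where the sitemap is; it does not limit crawling to only those URLs.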
This problem didn't always happen on the site; it began in August. Any idea why?
Also, I have been going around in circles on this problem with no solution.