If the number of crawled pages and resources is much lower, you may want to use a robots.txt file to manage crawling and make sure Google is only spending time on your important pages.
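As a rough sketch only (the paths and domain below are hypothetical, not recommendations for any specific site), a robots.txt that keeps Googlebot away from low-value sections while leaving the rest of the site crawlable might look like this:

```
# Hypothetical example: block low-value sections so crawling is
# focused on important pages. Adjust the paths to your own site.
User-agent: Googlebot
Disallow: /search/   # internal site-search result pages
Disallow: /tmp/      # temporary or duplicate content

# Apply the same rules to all other crawlers
User-agent: *
Disallow: /search/
Disallow: /tmp/

Sitemap: https://www.example.com/sitemap.xml
```

Keep in mind that a Disallow rule only stops crawling; it does not remove already-indexed URLs from search results, so treat robots.txt as a crawl-management tool rather than a removal tool.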