Google Clarifies 15MB Googlebot Limit

The other day, I covered how Google added a line to its Googlebot documentation stating that Googlebot crawls only the first 15MB of content in an HTML file or supported text-based file; after that, it stops crawling.

Then I was a bit shocked to see a large number of SEOs begin to panic.

For some reason, SEOs felt 15MB of raw HTML per page was not enough. In reality, 15MB is a massive amount of HTML on a URL-by-URL basis.

The limit does not include videos, images and other resources; it applies only to the HTML source code.

Again, it is a huge limit, and none of this was new: it was simply added to the documentation, but it has been in place at Google for a long time.
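To see how generous the limit is in practice, here is a minimal sketch of a size check you could run on your own pages. The helper name and the assumption that the cap is measured in binary megabytes (15 × 1024 × 1024 bytes) are mine, not Google's:

```python
# Hypothetical check: does a page's raw HTML fit under Googlebot's 15 MB cap?
# Assumption: the limit is interpreted as 15 binary megabytes of encoded HTML.
GOOGLEBOT_HTML_LIMIT = 15 * 1024 * 1024

def within_googlebot_limit(html: str) -> bool:
    """Return True if the UTF-8 encoded HTML fits within the 15 MB crawl cap."""
    return len(html.encode("utf-8")) <= GOOGLEBOT_HTML_LIMIT

# A typical ~30 kB page (roughly the median size cited by Google)
# sits about 500x below the cap.
median_page = "<html>" + "x" * (30 * 1024) + "</html>"
print(within_googlebot_limit(median_page))  # True
```

Note this measures only the HTML source itself, matching how the limit works: images, video and other fetched resources are not counted against it.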

So Google's Gary Illyes did his thing to clarify, publishing an aptly titled post on the Google blog named Googlebot and the 15 MB thing.

In short, Gary explains: "You, dear reader, are unlikely to be the owner of one, since the median size of an HTML file is about 500 times smaller: 30 kilobytes (kB)."

He adds: "However, if you are the owner of an HTML page that's over 15 MB, perhaps you could at least move some inline scripts and CSS dust to external files, pretty please."

He digs in more for those who are concerned, so go read it.


Thanks For Reading!
