Page Size Checker

Check if your page is within Google's 2MB crawl limit. Measure raw and compressed HTML size to ensure Googlebot can fully crawl your page.

What Is Google's 2MB Crawl Limit?

Google's Googlebot will only fetch the first 2MB of an HTML page. After reaching that threshold, the fetch stops entirely. The downloaded portion — the first 2MB of bytes — is passed to Google's indexing systems and the Web Rendering Service (WRS) as if it were the complete file. Any bytes beyond the cutoff are never fetched, never rendered, and never indexed.

This 2MB limit applies to the transferred (compressed) size, not the raw uncompressed HTML. Since Googlebot accepts both gzip and brotli encoding, the compressed size of your HTML document is what counts toward the limit. A 6MB raw HTML page that compresses to 1.2MB would still be fully crawled.
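To see why the compressed size is what matters, here is a minimal Python sketch (stdlib only) that builds a deliberately repetitive ~6MB page, the kind of hypothetical catalog markup described above, and measures both sizes:

```python
import gzip

# Hypothetical, highly repetitive catalog markup: ~5.8MB raw
row = b"<li class='product'>Widget #1234 <span>$19.99</span></li>\n"
page = row * 105000

raw_mb = len(page) / (1024 * 1024)
gzip_mb = len(gzip.compress(page, compresslevel=6)) / (1024 * 1024)

# Repetitive HTML like this compresses to a small fraction of its raw
# size, landing far below the 2MB threshold -- so the whole page would
# be crawled despite exceeding 2MB uncompressed.
```

Real-world HTML is less repetitive than this synthetic example, but markup-heavy pages routinely compress to 10-20% of their raw size.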

Referenced resources — CSS, JavaScript, images, fonts — are fetched separately by WRS with their own independent per-URL byte counter. They do not count toward the size of the parent HTML page.

Why Does the 2MB Limit Matter for SEO?

For most websites, 2MB of compressed HTML is enormous — the vast majority of pages are well under this limit. However, certain types of pages can approach or exceed it:

  • Large e-commerce category pages with hundreds of products rendered server-side
  • Pages with massive inline JSON-LD structured data or inline CSS/JavaScript
  • Single-page applications that embed all content in one HTML document
  • Long-form content pages like documentation or multi-chapter guides on a single URL

If critical content — product listings, important text, internal links — falls beyond the 2MB cutoff, Google will never see it. This can silently hurt your indexing and rankings without any obvious error in Google Search Console.

How to Reduce Your HTML Page Size

Enable Server Compression

Configure gzip or brotli compression on your web server. This is the single most impactful change — brotli typically achieves 15-25% better compression than gzip for HTML.
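The effect of the compression level is easy to measure. The sketch below uses only stdlib gzip (brotli is not in the Python standard library; on servers it is usually enabled via a module such as ngx_brotli) and a hypothetical markup sample:

```python
import gzip

# Hypothetical markup-heavy sample; real HTML compresses comparably well
html = b"<div class='card'><h2>Item</h2><p>Product description text.</p></div>\n" * 10000

# gzip level 6 is the common server default; brotli typically shaves
# another 15-25% off these numbers for HTML.
sizes = {level: len(gzip.compress(html, compresslevel=level))
         for level in (1, 6, 9)}
```

Higher levels cost more CPU per request, which is why most servers default to a mid-range setting rather than the maximum.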

Move Inline Resources to External Files

Move large blocks of inline CSS and JavaScript to external files. These are fetched separately and don't count toward the HTML page size limit.
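A quick way to see how much of your HTML budget inline assets consume is to total the bytes inside <style> and <script> blocks. This is an illustrative sketch using a regex (a real audit should use an HTML parser), with a made-up sample document:

```python
import re

def inline_asset_bytes(html: str) -> int:
    """Total bytes inside inline <style> and <script> blocks.
    External scripts (<script src=...>) are skipped: their bytes live
    in separate files with their own per-URL crawl counters."""
    total = 0
    for m in re.finditer(r"<(style|script)(\s[^>]*)?>(.*?)</\1>", html,
                         re.DOTALL | re.IGNORECASE):
        attrs = m.group(2) or ""
        if m.group(1).lower() == "script" and "src=" in attrs.lower():
            continue  # external script: no inline body to count
        total += len(m.group(3).encode("utf-8"))
    return total

# Hypothetical document mixing inline and external assets
doc = ("<html><style>body{margin:0}</style>"
       "<script src='app.js'></script>"
       "<script>var x=1;</script></html>")
```

Here only the 14-byte stylesheet and the 8-byte inline script count; the external app.js does not.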

Paginate Long Content

Break very large category pages or long lists into multiple pages. This distributes content across URLs and keeps each page well within limits.
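The pagination idea can be sketched in a few lines, assuming a fixed number of items per page (the right chunk size depends on how many bytes each item's markup costs):

```python
def paginate(items: list[str], per_page: int = 100) -> list[list[str]]:
    """Split a long product list into fixed-size pages so no single
    URL has to render the entire catalog."""
    return [items[i:i + per_page] for i in range(0, len(items), per_page)]

# Hypothetical 350-item catalog split across 4 URLs instead of 1
products = [f"product-{n}" for n in range(350)]
pages = paginate(products)
```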

Minify HTML

Remove unnecessary whitespace, comments, and redundant attributes from your HTML. While the compression savings are modest, every byte counts for very large pages.
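A rough minifier can be sketched with three regex passes. This is a simplification: a production minifier must be careful around <pre>, <textarea>, and inline scripts, where whitespace is significant.

```python
import re

def minify_html(html: str) -> str:
    """Rough HTML minifier: strips comments and collapses whitespace.
    Not safe for <pre>/<textarea> content; illustrative only."""
    html = re.sub(r"<!--.*?-->", "", html, flags=re.DOTALL)  # drop comments
    html = re.sub(r">\s+<", "><", html)   # whitespace between tags
    html = re.sub(r"\s{2,}", " ", html)   # runs of spaces/newlines
    return html.strip()

src = "<div>\n  <!-- nav -->\n  <p>Hello   world</p>\n</div>"
out = minify_html(src)  # -> "<div><p>Hello world</p></div>"
```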

Trim Inline JSON-LD

Large product catalog schemas embedded as JSON-LD can add significant size. Keep structured data concise and only include required properties.
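To find oversized structured data, you can measure each inline JSON-LD block separately. A minimal sketch, using a hypothetical page with one large ItemList schema and one small Organization schema:

```python
import json
import re

def jsonld_bytes(html: str) -> list[int]:
    """Byte size of each inline JSON-LD block, largest first."""
    pattern = r'<script type="application/ld\+json">(.*?)</script>'
    return sorted((len(m.encode("utf-8"))
                   for m in re.findall(pattern, html, re.DOTALL)),
                  reverse=True)

# Hypothetical schemas: a 500-item product list dwarfs the org schema
big = json.dumps({"@type": "ItemList",
                  "itemListElement": [{"name": f"p{i}"} for i in range(500)]})
small = json.dumps({"@type": "Organization", "name": "Acme"})
page = (f'<script type="application/ld+json">{big}</script>'
        f'<script type="application/ld+json">{small}</script>')
```

Sorting largest-first makes it obvious which schema to trim down to required properties.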

Lazy-Load with JavaScript

Load non-critical content via JavaScript after initial render. Note: Google does execute JavaScript, but the initial HTML payload must still contain your most important content for reliable indexing.

Frequently Asked Questions

What is Google's 2MB crawl limit?

Google's Googlebot will only fetch the first 2MB of an HTML page's transferred (compressed) size. Any content beyond this 2MB threshold is not fetched, rendered, or indexed. This limit includes HTTP headers and applies to the compressed bytes sent over the wire.

Does the 2MB limit apply to compressed or uncompressed HTML?

The 2MB limit applies to the transferred (compressed) size. Since Googlebot accepts gzip and brotli compression, the compressed size of your HTML is what matters. A 5MB raw HTML file that compresses to 1.5MB would still be fully crawled.

Does the 2MB limit include CSS and JavaScript files?

No. CSS, JavaScript, and other referenced resources have their own separate per-URL byte counters. They do not count toward the 2MB limit of the parent HTML page. Each resource is fetched independently with its own size limit.

What happens to content after the 2MB cutoff?

Any bytes beyond the 2MB threshold are entirely ignored by Google. They are not fetched, not rendered, and not indexed. The first 2MB is passed to Google's indexing systems and Web Rendering Service as if it were the complete file.

How can I check if my page exceeds Google's 2MB limit?

Use this Page Size Checker tool to enter any URL and instantly see the raw HTML size, gzip compressed size, and brotli compressed size. The tool shows whether your page is within Google's 2MB crawl limit and what percentage of the budget you've used.
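The checks described above can be approximated with a short stdlib-only Python sketch. This measures gzip only (the tool itself also reports brotli), and the fetch helper is an illustrative assumption, not the tool's actual implementation:

```python
import gzip
import urllib.request

CRAWL_LIMIT = 2 * 1024 * 1024  # 2MB compressed-size budget

def size_report(raw_html: bytes, limit: int = CRAWL_LIMIT) -> dict:
    """Compare a page's estimated gzip-transferred size to the budget."""
    transferred = len(gzip.compress(raw_html, compresslevel=6))
    return {
        "raw_bytes": len(raw_html),
        "gzip_bytes": transferred,
        "within_limit": transferred <= limit,
        "budget_used_pct": round(100 * transferred / limit, 1),
    }

def check_url(url: str) -> dict:
    """Fetch a URL and report its crawl-budget usage (gzip estimate)."""
    with urllib.request.urlopen(url) as resp:  # assumes identity encoding
        return size_report(resp.read())
```

For example, size_report(open("page.html", "rb").read()) gives the same raw/compressed/percentage breakdown without a network fetch.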