
How to Use Chrome to View a Website as Googlebot

Alex Harford

The author's views are entirely their own (excluding the unlikely event of hypnosis) and may not always reflect the views of Moz.


Struggling to ensure Googlebot properly crawls and indexes your website? For technical SEOs, rendering issues—especially on JavaScript-heavy sites—can lead to missed rankings and hidden content.

That’s where using Chrome (or Chrome Canary) to emulate Googlebot comes in. This method uncovers discrepancies between what users and search engines see, ensuring your site performs as expected.

Whether you’re spoofing Googlebot or not, a dedicated testing browser makes technical audits more efficient and accurate.

In this guide, I’ll show you how to set up a Googlebot browser, troubleshoot rendering issues, and improve your SEO audits.

Why should I view a website as Googlebot?

In the past, technical SEO audits were simpler: websites relied on HTML and CSS, with JavaScript limited to minor enhancements like animations. Today, entire websites are built with JavaScript, shifting the workload from servers to browsers. This means that search bots, including Googlebot, must render pages client-side—a process that’s resource-intensive and prone to delays.

Search bots often struggle with JavaScript. Googlebot, for example, processes the raw HTML first and may not fully render JavaScript content until days or weeks later, depending on the website. Some sites use dynamic rendering to bypass these challenges, serving server-side versions for bots and client-side versions for users. 
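To make that concrete, here’s a minimal sketch of the kind of user-agent check a dynamically rendered site performs, assuming a Flask app; the helper functions and bot token list are hypothetical stand-ins, not code from any particular platform.

```python
# Minimal sketch of user-agent based dynamic rendering (hypothetical helpers).
from flask import Flask, request

app = Flask(__name__)

# Substrings a dynamic rendering setup might look for before serving a snapshot.
BOT_UA_TOKENS = ("googlebot", "bingbot", "duckduckbot")


def looks_like_bot(user_agent: str) -> bool:
    ua = user_agent.lower()
    return any(token in ua for token in BOT_UA_TOKENS)


def render_static_snapshot(page: str) -> str:
    # Stand-in for a pre-rendered, server-side HTML snapshot.
    return f"<html><body><h1>{page}</h1><p>Full content in the HTML.</p></body></html>"


def render_client_side_shell(page: str) -> str:
    # Stand-in for an empty app shell that JavaScript fills in client-side.
    return '<html><body><div id="root"></div><script src="/app.js"></script></body></html>'


@app.route("/<path:page>")
def serve(page: str):
    ua = request.headers.get("User-Agent", "")
    # Bots get the server-side snapshot; everyone else gets the JavaScript shell.
    return render_static_snapshot(page) if looks_like_bot(ua) else render_client_side_shell(page)


if __name__ == "__main__":
    app.run()
```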

Mini rant

Generally, this setup overcomplicates websites and creates more technical SEO issues than a server-side rendered or traditional HTML website. Thankfully, dynamically rendered websites are declining in use.

While exceptions exist, I believe client-side rendered websites are a bad idea. Websites should be designed to work on the lowest common denominator of a device, with progressive enhancement (through JavaScript) used to improve the experience for people using devices that can handle extras. 

My anecdotal evidence suggests that client-side rendered websites are generally more difficult for people who rely on accessibility solutions such as screen readers. Various studies back this up, though the studies I’ve seen are by companies and charities invested in accessibility (an example where I think any bias is perhaps justified for the good of all). However, there are instances where technical SEO and usability cross over.

The good news

Viewing a website as Googlebot lets you detect discrepancies between what bots and users see. While these views don’t need to be identical, critical elements like navigation and content must align. This approach helps identify indexing and ranking issues caused by rendering limitations and other search bot-specific quirks.

Can we see what Googlebot sees?

No, not entirely. 

Googlebot renders webpages with a headless version of the Chrome browser, but even with the techniques in this article, it’s impossible to replicate its behavior perfectly. For example, Googlebot’s handling of JavaScript can be unpredictable. 

A notable bug in September 2024 prevented Google from detecting meta noindex tags in client-side rendered code for many React-based websites. Issues like these highlight the limitations of emulating Googlebot, especially for important SEO elements like tags and main content.

The goal, however, is to emulate Googlebot’s mobile-first indexing as closely as possible. For this, I use a combination of tools:

  • A Googlebot browser for direct emulation.

  • Screaming Frog SEO Spider to spoof and render as Googlebot.

  • Google’s tools like the URL Inspection tool in Search Console and Rich Results Test for screenshots and code analysis.

It’s worth noting that Google’s tools, especially after they switched to the “Google-InspectionTool” user-agent in 2023, aren’t entirely accurate representations of what Googlebot sees. However, when used alongside the Googlebot browser and SEO Spider, they’re valuable for identifying potential issues and troubleshooting.

Why use a separate browser to view websites as Googlebot?

Using a dedicated Googlebot browser simplifies technical SEO audits and improves the accuracy of your results. Here's why:

1. Convenience

A dedicated browser saves time and effort by allowing you to quickly emulate Googlebot without relying on multiple tools. Switching user-agents with an extension in your standard browser can be inefficient, especially when auditing sites with inconsistent server responses or dynamic content.

Additionally, some Googlebot-specific Chrome settings don’t persist across tabs or sessions, and specific settings (e.g., disabling JavaScript) can interfere with other tabs you’re working on. You can bypass these challenges and streamline your audit process with a separate browser.

2. Improved accuracy

Browser extensions can unintentionally alter how websites look or behave. A dedicated Googlebot browser minimizes the number of extensions, reducing interference and ensuring a more accurate emulation of Googlebot’s experience.

3. Avoiding mistakes

It’s easy to forget to turn off Googlebot spoofing in a standard browser, which can cause websites to malfunction or block your access. I’ve even been blocked from websites for spoofing Googlebot and had to email them with my IP to remove the block.

4. Flexibility despite challenges

For many years, my Googlebot browser worked without a hitch. However, with the rise of Cloudflare and its stricter security protocols on e-commerce websites, I’ve often had to ask clients to add specific IPs to an allow list so I can test their sites while spoofing Googlebot.

When whitelisting isn’t an option, I switch to alternatives like the Bingbot or DuckDuckBot user-agent. It’s a less reliable approach than mimicking Googlebot, but it can still uncover valuable insights. Another fallback is checking rendered HTML in Google Search Console, which, despite using a different user-agent to Google’s crawler, remains a useful way to approximate Googlebot’s behavior.

If I’m auditing a site that blocks spoofed Googlebots and I can get my IPs allow-listed, the Googlebot browser is still my preferred tool. It’s more than just a user-agent switcher and offers the most comprehensive way to understand what Googlebot sees.

Which SEO audits are useful for a Googlebot browser?

The most common use case for a Googlebot browser is auditing websites that rely on client-side or dynamic rendering. It’s a straightforward way to compare what Googlebot sees to what a general visitor sees, highlighting discrepancies that could impact your site’s performance in search results.

Because I recommend limiting browser extensions to an essential few, a Googlebot browser is also a more accurate test of how real Chrome users experience a website than an extension-loaded browser, especially when using Chrome’s built-in DevTools and Lighthouse for speed audits.

Even for websites that don’t use dynamic rendering, you never know what you might find by spoofing Googlebot. In over eight years of auditing e-commerce websites, I’m still surprised by the unique problems I encounter.

What should you investigate during a Googlebot audit?

  • Navigation differences: Is the main navigation consistent across user and bot views?
  • Content visibility: Is Googlebot able to see the content you want indexed?
  • JavaScript indexing delays: If the site depends on JavaScript rendering, will new content be indexed quickly enough to matter (e.g., for events or product launches)?
  • Server response issues: Are URLs returning correct server responses? For instance, an incorrect URL might show a 200 OK for Googlebot but a 404 Not Found for visitors (see the sketch after this list).
  • Page layout variations: I’ve often seen links display as blue text on a black background when spoofing Googlebot. It’s machine-readable but far from user-friendly. If Googlebot can’t render your site properly, it won’t know what to prioritize. 
  • Geolocation-based redirects: Many websites redirect based on location. Since Googlebot crawls primarily from US IPs, it’s important to verify how your site handles such requests.
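A quick way to sanity-check the server-response point above outside the browser is to request the same URLs with a Googlebot Smartphone user-agent and a standard Chrome user-agent, then compare status codes and redirects. This is a rough sketch using Python’s requests library; the URLs are placeholders, the Chrome version token in both strings goes stale over time, and sites that verify Googlebot by IP may block the spoofed request anyway.

```python
# Compare server responses for a spoofed Googlebot UA vs a regular Chrome UA.
import requests

GOOGLEBOT_SMARTPHONE_UA = (
    "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
    "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/131.0.0.0 Mobile Safari/537.36 "
    "(compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
)
CHROME_UA = (
    "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 "
    "(KHTML, like Gecko) Chrome/131.0.0.0 Safari/537.36"
)

URLS = [  # placeholders: swap in the URLs you're auditing
    "https://www.example.com/",
    "https://www.example.com/some-category/",
    "https://www.example.com/this-should-404/",
]

for url in URLS:
    for label, ua in (("Googlebot", GOOGLEBOT_SMARTPHONE_UA), ("Chrome", CHROME_UA)):
        resp = requests.get(url, headers={"User-Agent": ua},
                            allow_redirects=False, timeout=10)
        location = resp.headers.get("Location", "")
        print(f"{label:<10} {resp.status_code} {url} {location}")
```

Mismatched status codes or redirect targets between the two user-agents are exactly the discrepancies worth digging into in the Googlebot browser.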

How detailed you go depends on the audit, but Chrome offers many built-in tools for technical SEO audits. For example, I often compare Console and Network tab data to identify discrepancies between general visitor views and Googlebot. This process catches files blocked for Googlebot or missing content that could otherwise go unnoticed.


How to set up your Googlebot browser

[Illustration: Googlebot simulator]

Setting up a Googlebot browser takes about 30 minutes and makes it much easier to view webpages as Googlebot. Here’s how to get started:

Step 1: Download and install Chrome or Canary

  • If Chrome isn’t your default browser, you can use it as your Googlebot browser.
  • If Chrome is your default browser, download and install Chrome Canary instead.

Canary is a development version of Chrome where Google tests new features. It runs separately from the default Chrome installation and is easily identified by its yellow icon—a nod to the canaries once used in mines to detect poisonous gases.

[Screenshot of tabs]

While Canary is labeled “unstable,” I haven’t encountered any issues using it as my Googlebot browser. In fact, it offers beta features that are useful for audits. If these features make it to Chrome, you’ll be ahead of the curve and can impress your non-Canary-using colleagues.

Step 2: Install browser extensions

To optimize your Googlebot browser, I recommend installing five extensions and a bookmarklet. These tools emulate Googlebot and improve technical SEO audits, with three especially useful for JavaScript-heavy websites. Here’s the breakdown:

Extensions for emulating Googlebot:

  • User-Agent Switcher: Switches the browser’s user-agent to mimic Googlebot’s behavior.
  • Web Developer: Allows you to turn JavaScript on or off easily, giving insight into how Googlebot might process the site.
  • Windscribe (or your preferred VPN): Simulates Googlebot’s location, typically in the US, ensuring location-based discrepancies are accounted for.

Additional favorites:

  • Link Redirect Trace: Quickly checks server responses and HTTP headers for technical SEO audits.
  • View Rendered Source: Compares raw HTML (what the server delivers) with rendered HTML (what the browser processes).

Bookmarklet:

  • NoJS Side-by-Side: Compares a webpage’s appearance with and without JavaScript enabled, making discrepancies easier to spot.

Before we move on to step 3, I’ll break down the extensions I just mentioned.

User-Agent Switcher extension

User-Agent Switcher does what it says on the tin: switches the browser’s user-agent. While Chrome and Canary include a built-in user-agent setting, it only applies to the active tab and resets when you close the browser. Using this extension ensures consistency across sessions.

I take the Googlebot user-agent string from Chrome’s DevTools, which, at the time of writing, references the latest version of Chrome (note that below, I’m taking the user-agent from Chrome and not Canary).

Setting up the User-Agent Switcher:

1. Get the Googlebot user-agent string:

  • Open Chrome DevTools by pressing F12 or going to More tools > Developer tools.
  • Navigate to the Network tab.
  • From the hamburger menu in the top-right of DevTools, select More tools > Network conditions.
  • In the Network conditions tab:
    • Untick "Use browser default."
    • Choose "Googlebot Smartphone" from the list.
    • Copy and paste the user-agent from the field below the list into the User-Agent Switcher extension list (another screenshot below). Remember to switch Chrome to its default user-agent if it's your main browser.
  • An additional tip for Chrome users:
    • While you’re here, if Chrome will be your Googlebot browser, tick "Disable cache" in DevTools for more accurate results during testing.
       
[Screenshot: DevTools step by step]

2. Add the user-agent to the extension:

  • Right-click the User-Agent Switcher icon in the browser toolbar and click Options (see screenshot below).
  •  “Indicator Flag” is the text in the browser toolbar that shows which user-agent you’ve selected. Paste the Googlebot user-agent string into the list and give it a label (e.g., "GS" for Googlebot Smartphone).
  • Optionally, add other user-agents like Googlebot Desktop, Bingbot, or DuckDuckBot for broader testing.
[Screenshot: Google DevTools menu options]

Why spoof Googlebot’s user-agent?

Web servers identify browsers through their user-agent strings. For example, the user-agent for a Windows 10 device using Chrome might look like this:

Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/131.0.0.0 Safari/537.36

If you’re curious about the history of user-agent strings and why other browsers appear in Chrome’s user-agent, you might find resources like the History of the user-agent string an interesting read.
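If you want to confirm that your User-Agent Switcher setup is actually sending the spoofed string, one low-tech check is to point the Googlebot browser at a tiny local server that echoes the User-Agent header back. Here’s a sketch using only Python’s standard library; the port is arbitrary.

```python
# Tiny local echo server: visit http://localhost:8000 in your Googlebot browser
# to see exactly which User-Agent string it is sending.
from http.server import BaseHTTPRequestHandler, HTTPServer


class EchoUserAgent(BaseHTTPRequestHandler):
    def do_GET(self):
        ua = self.headers.get("User-Agent", "(no User-Agent header)")
        body = f"<html><body><p>Your user-agent:</p><pre>{ua}</pre></body></html>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.end_headers()
        self.wfile.write(body.encode("utf-8"))


if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8000), EchoUserAgent).serve_forever()
```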

Web Developer extension

The Web Developer extension is an essential tool for technical SEOs, especially when auditing JavaScript-heavy websites. In my Googlebot browser, I regularly switch JavaScript on and off to mimic how Googlebot processes a webpage.

Why disable JavaScript?

Googlebot doesn’t execute all JavaScript on its initial crawl of a URL. To understand what it sees before rendering JavaScript, disable it. This reveals the raw HTML content and helps identify critical issues, such as missing navigation or content that relies on JavaScript to display.

By toggling JavaScript with this extension, you gain insights into how your site performs for search engines during the crucial first crawl.
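The same comparison can be scripted if you want to spot-check many URLs: fetch the raw HTML with a plain HTTP request, then fetch the JavaScript-rendered DOM with a headless browser, both sent with a Googlebot Smartphone user-agent. Below is a rough sketch assuming the requests and Playwright packages are installed (pip install requests playwright, then playwright install chromium); it approximates the idea but won’t replicate Google’s actual rendering service.

```python
# Compare raw (pre-JavaScript) HTML with the rendered DOM for one URL.
import requests
from playwright.sync_api import sync_playwright

# Chrome version token changes over time; copy the current one from DevTools.
GOOGLEBOT_SMARTPHONE_UA = (
    "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
    "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/131.0.0.0 Mobile Safari/537.36 "
    "(compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
)
URL = "https://www.example.com/"  # placeholder

# 1. Raw HTML: roughly what is available before any JavaScript executes.
raw_html = requests.get(URL, headers={"User-Agent": GOOGLEBOT_SMARTPHONE_UA},
                        timeout=10).text

# 2. Rendered DOM: what headless Chromium produces after running the page's JavaScript.
with sync_playwright() as p:
    browser = p.chromium.launch()
    page = browser.new_page(user_agent=GOOGLEBOT_SMARTPHONE_UA)
    page.goto(URL, wait_until="networkidle")
    rendered_html = page.content()
    browser.close()

print(f"Raw HTML: {len(raw_html):,} bytes | Rendered DOM: {len(rendered_html):,} bytes")
# A large gap, or content that only appears in rendered_html, signals heavy
# reliance on client-side rendering, which is exactly what this audit looks for.
```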

Windscribe (or another VPN)

Windscribe, or any reliable VPN, is invaluable for emulating Googlebot’s typical US-based location. While I use a Windscribe Pro account, their free plan includes up to 2GB of monthly data and offers several US locations.

[Screenshot: Windscribe]

Tips for using a VPN with your Googlebot browser:

  • Location doesn’t matter much: Googlebot mostly crawls from the US, so any US location works. For fun, I imagine Gotham as real (and villain-free).
  • Disable unnecessary settings: Windscribe’s browser extension blocks ads by default, which can interfere with how webpages render. Make sure the two icons in the top-right corner show a zero.
  • Use a browser extension over an app: A VPN extension ties the location spoofing to your Googlebot browser, ensuring your standard browsing isn’t affected.

These tools, paired with the User-Agent Switcher, enhance your ability to emulate Googlebot, revealing content discrepancies and potential indexing issues.

Why spoof Googlebot’s location?

Googlebot primarily crawls websites from US IPs, and there are several reasons to mimic this behavior when conducting audits:

  • Geolocation-based blocking: Some websites block access to US IPs, which means Googlebot can’t crawl or index them. Spoofing a US location ensures that you’re seeing the site as Googlebot would.
  • Location-specific redirects: Many websites serve different content based on location. For instance, a business might have separate sites for Asia and the US, with US visitors automatically redirected to the US site. In such cases, Googlebot might never encounter the Asian version, leaving it unindexed (see the sketch below).
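One way to check the redirect side of this without leaving the command line is to request a URL with redirects disabled and inspect the status code and Location header. The sketch below uses the requests library; the URL is a placeholder, and to see the US-specific behaviour Googlebot gets you would still need to route the request through a US IP (the proxy address shown is hypothetical).

```python
# Inspect a location-based redirect without following it.
import requests

url = "https://www.example.com/"  # placeholder
resp = requests.get(url, allow_redirects=False, timeout=10)

print(resp.status_code)                               # e.g. 301, 302, or 307 for a redirect
print(resp.headers.get("Location", "(no redirect)"))  # where the server sends you

# To test from a US exit point (as Googlebot would appear), route via a US proxy:
# resp = requests.get(url, allow_redirects=False, timeout=10,
#                     proxies={"https": "http://us-proxy.example:8080"})  # hypothetical proxy
```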

 

Other Chrome extensions useful for auditing JavaScript websites


Beyond the essentials like User-Agent Switcher and a VPN, here are a few more tools I rely on for technical audits:
 

  • Link Redirect Trace: Shows server responses and HTTP headers, helping troubleshoot technical issues.
  • View Rendered Source: Compares raw HTML (delivered by the server) to rendered HTML (processed by the browser), helping you spot discrepancies in what users and Googlebot see.
  • NoJS Side-by-Side bookmarklet: Allows you to compare a webpage with and without JavaScript enabled, displayed side by side in the same browser window.
     

Alright, back to step 3

Step 3: Configure browser settings to emulate Googlebot

Next, we’ll configure the Googlebot browser settings to reflect what Googlebot doesn’t support when crawling a website.

What Googlebot doesn’t support:

  • Service workers: Since users clicking through search results may not have visited the page before, Googlebot doesn’t cache data for later visits.
  • Permission requests: Googlebot does not process push notifications, webcam access, geolocation requests, and similar features. Therefore, any content relying on these permissions will not be visible to it.
  • Statefulness: Googlebot is stateless, meaning it doesn’t retain data like cookies, session storage, local storage, or IndexedDB. While these mechanisms can temporarily store data, they are cleared before Googlebot crawls the next URL.

These bullet points are summarized from an interview by Eric Enge with Google’s Martin Splitt.
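If you ever want to approximate these constraints in a script rather than in the browser, a headless-browser sketch like the one below (using Playwright, which is not what Googlebot actually runs) gets close: service workers are blocked, no permissions are granted, and a fresh context per URL keeps things stateless. Treat it as an approximation under those assumptions, not a faithful copy of Googlebot; the URLs are placeholders.

```python
# Approximate Googlebot's constraints: no service workers, no granted
# permissions, and a throwaway (stateless) context for every URL.
from playwright.sync_api import sync_playwright

URLS = ["https://www.example.com/", "https://www.example.com/category/"]  # placeholders

with sync_playwright() as p:
    browser = p.chromium.launch()
    for url in URLS:
        context = browser.new_context(service_workers="block")  # no service worker caching
        # No grant_permissions() call, so geolocation, camera, microphone and
        # notification prompts are all denied, mirroring the list above.
        page = context.new_page()
        page.goto(url, wait_until="networkidle")
        print(url, "->", page.title())
        context.close()  # discard cookies/localStorage before the next URL
    browser.close()
```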

Step 3a: DevTools settings

You’ll need to adjust some settings in Developer Tools (DevTools) to configure your Googlebot browser for accurate emulation.

How to open DevTools:

  • Press F12, or open the hamburger menu in the top-right corner of Chrome or Canary and go to More tools > Developer tools.
  • The DevTools window is docked within the browser by default, but you can change this. Use the second hamburger menu in DevTools to switch the Dock side or open it in a separate window.
[Screenshot: Developer tools]

Key configurations in DevTools:

  • Disable cache:
    • You may have already done this if you’re using Chrome as your Googlebot browser.
    • Otherwise, in DevTools, open the hamburger menu, go to More tools > Network conditions, and tick the “Disable cache” option.
[Screenshot: Network conditions]
  • Block service workers:
    • Navigate to the Application tab in DevTools.
    • Under Service Workers, tick the “Bypass for network” option.

Step 3b: General browser settings

Adjust the general browser settings to reflect Googlebot’s behavior.

  • Block all cookies:
    • Go to Settings > Privacy and security > Cookies, or enter chrome://settings/cookies in the address bar.
    • Select “Block all cookies (not recommended)”—sometimes it’s fun to go against the grain!
[Screenshot: Privacy and security]
  • Adjust site permissions:
    • In Privacy and Security, navigate to Site settings or enter chrome://settings/content.
    • Under Permissions, individually block Location, Camera, Microphone, and Notifications.
    • In the Additional Permissions section, disable Background sync.
[Screenshot: Privacy and security permissions]

Step 4: Emulate a mobile device

Since Googlebot primarily uses mobile-first crawling, it’s important to emulate a mobile device in your Googlebot browser.

How to emulate a mobile device:

  • Open DevTools and click the device toolbar toggle in the top-left corner.
  • Choose a device to emulate from the dropdown menu or add a custom device for more specific testing.

Key considerations:

  • Googlebot doesn’t scroll on web pages. Instead, it renders using a window with a long vertical height (see the sketch below).
  • While mobile emulation is essential, I also recommend testing in desktop view and, if possible, on actual mobile devices to cross-check your results.
[Screenshot: Choose a device to emulate]
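For scripted checks, the same idea can be approximated in a headless browser by combining a mobile user-agent with an unusually tall viewport. The sketch below uses Playwright; the exact dimensions Googlebot uses aren’t public, so the 412 × 12,000 viewport here is purely illustrative, and the URL is a placeholder.

```python
# Mobile emulation with a tall viewport (Googlebot renders in a long window
# rather than scrolling); dimensions below are illustrative, not Google's.
from playwright.sync_api import sync_playwright

GOOGLEBOT_SMARTPHONE_UA = (
    "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
    "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/131.0.0.0 Mobile Safari/537.36 "
    "(compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
)

with sync_playwright() as p:
    browser = p.chromium.launch()
    context = browser.new_context(
        user_agent=GOOGLEBOT_SMARTPHONE_UA,
        viewport={"width": 412, "height": 12000},  # long window instead of scrolling
        is_mobile=True,
        device_scale_factor=2,
    )
    page = context.new_page()
    page.goto("https://www.example.com/", wait_until="networkidle")  # placeholder URL
    page.screenshot(path="tall-mobile-render.png", full_page=True)
    browser.close()
```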

How about viewing a website as a Bingbot?

To create a Bingbot browser, use a recent version of Microsoft Edge and configure it with the Bingbot user-agent.

Why consider Bingbot?

As mentioned earlier, some websites block spoofed Googlebot requests outright. In those cases, a Bingbot (or DuckDuckBot) user-agent is a useful fallback that can still surface rendering and content discrepancies, and because Bingbot renders pages with a recent version of Microsoft Edge, an Edge-based setup mirrors it reasonably well.

Summary and closing notes

Now, you have your own Googlebot emulator. Setting up a browser to mimic Googlebot is one of the easiest and quickest ways to view webpages as the crawler does. Best of all, it’s free if you already have a desktop device capable of installing Chrome or Canary.

While other tools like Google’s Vision API (for images) and Natural Language API offer valuable insights, a Googlebot browser simplifies technical website audits, especially for sites that rely on client-side rendering.

For a deeper dive into auditing JavaScript sites and understanding the nuances between standard HTML and JavaScript-rendered websites, I recommend exploring articles and presentations from experts like Jamie Indigo, Joe Hall, and Jess Peck. They offer excellent insights into JavaScript SEO and its challenges.

Feel free to reach out if you have questions or think I’ve missed something. Tweet me @AlexHarfordSEO, connect on Bluesky, or find me on LinkedIn. Thanks for reading.

Alex Harford

Alex is a Technical SEO Manager at Salience, formerly a web developer, designer, and an in-house SEO.

