
Improving SEO with Google’s Core Web Vitals: The Role of Nostra.ai

June 17, 2024


Site speed matters. Nearly 70% of consumers admit that page speed impacts their willingness to buy from an online store. That's why understanding and improving your store's Core Web Vitals is crucial for your business's overall success. So, what are Core Web Vitals? Simply put, Core Web Vitals are a subset of factors that are part of Google’s “page experience” score, a.k.a. Google’s way of judging your page’s overall user experience (UX). The Core Web Vitals are a set of three user experience metrics:

  • Largest Contentful Paint (LCP) indicates when the largest element on a given page (usually a photo or video) becomes visible to your customers. LCP measures the time it takes that element to render, starting from when a user requests the page via its URL. A good LCP is 2.5 seconds or less.
  • Interaction to Next Paint (INP) determines a page's level of responsiveness to human interactions based on how long it takes to react to actions like a click or scroll. Google says the “final INP value is the longest interaction observed, ignoring outliers.” A good INP is 200 milliseconds or less.
  • Cumulative Layout Shift (CLS) refers to how much a page’s layout or content moves during a customer’s visit. A score of 0 means no shifting is present, so the closer to 0 your score is, the better your user experience will be. A good CLS score is 0.1 or less.
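
For teams that want to see these numbers for their own pages, all three metrics can be captured from real user sessions with Google's open-source web-vitals JavaScript library. Here is a minimal sketch (the /analytics endpoint is a hypothetical placeholder):

```typescript
// Capture the three Core Web Vitals from real user sessions using
// Google's `web-vitals` library (npm install web-vitals).
import { onCLS, onINP, onLCP } from "web-vitals";

// Each callback fires as its metric settles; beaconing the value to an
// analytics endpoint (hypothetical URL below) turns it into field data.
function report(metric: { name: string; value: number }) {
  navigator.sendBeacon("/analytics", JSON.stringify(metric));
}

onLCP(report); // good: 2.5 seconds or less
onINP(report); // good: 200 milliseconds or less
onCLS(report); // good: 0.1 or less
```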

Websites must meet specific thresholds to secure an SEO benefit, and sites that fall short risk a decline in search engine rankings. That is why we are here today. In this article, we’ll discuss Google’s Core Web Vitals, their importance in SEO, how you can find them for your store, and what Nostra can do for your store to improve them.

Let’s dive in.

How Do Core Web Vitals Affect SEO? 

Google created Core Web Vitals to standardize key aspects of user experience across the web, focusing on speed, responsiveness, and visual stability. These user-centric metrics serve as quantifiable indicators for site performance, encouraging developers to optimize their sites for a better user experience and higher SEO rankings. 

So, how does Google evaluate all of the websites out there? Google does this in a multitude of ways, but one of the many signals it uses is bot experience: how your site performs when Google sends out search engine bots, also known as Googlebots, to crawl each of its pages.

These bots are used to increase searchability, engage website visitors, capture analytics data, and more. Search engine bots are particularly noteworthy in the eCommerce context, as they examine sites on a regular basis to help compile search results, meaning your store could be mid-crawl at this very moment.

Bots provide many important services across the web, and they don’t always understand or efficiently process JavaScript. To combat such issues and ensure your website provides a top-notch experience for these bots, Nostra.ai's Crawler Optimization product was created. Our product intelligently streamlines the process of website crawling by search engine bots (like Googlebots), providing crawlers with a simplified, statically rendered version of the website. This makes the crawl process more efficient and conserves resources, especially when it comes to JavaScript. Eliminating the need to process complex elements like JavaScript results in significantly faster page load times for the crawler. This process signals to search engines that the website is both fast and efficient, giving your website an A+ in terms of site performance. 
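
To make the idea concrete, here is a generic sketch of the underlying pattern: known crawlers receive a pre-rendered HTML snapshot while regular visitors get the full JavaScript experience. This illustrates the general technique only, not Nostra's actual implementation; the bot list and snapshot paths are hypothetical.

```typescript
// A generic sketch of crawler-aware static rendering using Express.
// Illustrative only; bot patterns and snapshot paths are hypothetical.
import express from "express";

const BOT_UA = /googlebot|bingbot|duckduckbot|yandexbot/i;
const app = express();

app.use((req, res, next) => {
  const userAgent = req.headers["user-agent"] ?? "";
  if (BOT_UA.test(userAgent)) {
    // Crawlers get a flat, JavaScript-free snapshot generated ahead of
    // time, so they never spend crawl budget executing scripts.
    const page = req.path === "/" ? "index" : req.path.replace(/^\//, "");
    return res.sendFile(`snapshots/${page}.html`, { root: process.cwd() });
  }
  next(); // Human visitors continue to the normal client-rendered store.
});

app.listen(3000);
```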

How does our Crawler Optimization enhance SEO?

  • Faster Indexing: Improved loading speed and crawl efficiency mean that new or updated content can be indexed by search engines more quickly, increasing its visibility.
  • Deeper Crawling: Optimizing crawl budget helps search engines discover more pages on the site and understand the site structure better.

We’ve helped numerous eCom stores up their game in site performance and crawlability, but what happens to the eCom stores that don’t improve their site’s performance? They face the following consequences:

  • Poor user experience 
  • Missed conversion opportunities
  • Loss of visibility in search engines
  • Lower website traffic

Neglecting your site’s performance and crawler readability can significantly impact your business's bottom line. Therefore, let's explore how you can identify your Core Web Vitals and discuss strategies to enhance them.

Where To Find Your Core Web Vitals Score

If you're new to Core Web Vitals and unsure where to begin, don't worry: we're here to help. Before diving in, it's crucial to understand the difference between lab data and field data, as this determines where your Core Web Vitals metrics come from and should be kept in mind whenever you review speed scores.

[Image: examples of lab vs. field data]
  • Lab data was created for developers. As such, it is best used to test changes, debug issues, and determine good practices. Since lab data is gathered from a single version of a page, results and recommendations may or may not match the field data.
  • Field data is user data that reflects the real-world experiences of your customer base. It is the most accurate way to measure what problems users may be facing and what improvements need to be made, but it may not always identify why that problem exists or how to fix it. 

Both lab and field data can positively contribute to your site’s performance if you understand their differences and how best to apply them to your goals. However, Google's Chrome team recommends that speed score evaluators prioritize field data (specifically, the Core Web Vitals) over lab data because it captures real-world conditions, including personalized content such as ads and A/B tests, which lab data might miss. Lab data typically focuses on initial load experiences, while field data also captures cached experiences from repeat visits, affecting metrics like Cumulative Layout Shift (CLS). For detailed comparisons of how these data types affect different metrics, Google Chrome’s documentation offers further insights.

So, now that you have a good understanding of lab vs. field data, it's time to go over the different options for finding your Core Web Vitals. Here are two options:

CrUX: Chrome User Experience Report

[Image: Core Web Vitals in the CrUX Dashboard]

The Chrome User Experience Report (CrUX) is a public dataset that powers Google’s key performance tools and serves as the official dataset of the Web Vitals program. It compiles field data from a subset of Google Chrome users across millions of websites and is updated monthly. Websites featured in the dataset can access their data via the CrUX Dashboard.

Benefits of CrUX:

  • Tracks a site's field data over time.
  • Provides insights into real user experiences for comparison.

Limitations of CrUX:

  • Only certain Chrome users (Desktop or Android) are included.
  • Not all websites are tracked unless they are public and have significant traffic.
  • Only monitors public data, excluding any logged-in user experiences.
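
If your site does appear in the dataset, you can also pull its field data programmatically through the CrUX API. A minimal sketch, assuming you have created a Google API key (the key and origin below are placeholders):

```typescript
// Query the CrUX API for an origin's field data and print each metric's
// 75th-percentile value, the threshold Google uses to assess Core Web Vitals.
const API_KEY = "YOUR_API_KEY"; // placeholder: create one in Google Cloud

async function fetchCrux(origin: string) {
  const res = await fetch(
    `https://chromeuxreport.googleapis.com/v1/records:queryRecord?key=${API_KEY}`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ origin }),
    }
  );
  if (!res.ok) throw new Error(`CrUX API returned ${res.status}`);
  const { record } = await res.json();
  for (const [metric, data] of Object.entries(record.metrics)) {
    console.log(metric, "p75:", (data as any).percentiles?.p75);
  }
}

fetchCrux("https://example.com");
```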

Lighthouse

[Image: Lighthouse dashboard example]

Google Chrome’s Lighthouse is an open-source tool that allows developers to audit the performance, accessibility, SEO, and more of any webpage, whether public or private. It measures lab data, primarily reflecting a first-load experience in a controlled environment. The tool is especially valuable for assessing factors such as site speed, JavaScript execution, and more, but it must be run either in Chrome DevTools, from the command line, or as a Node module.
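
If you would rather script audits than click through DevTools, the Node module route looks roughly like this, based on Lighthouse's documented programmatic usage (install the lighthouse and chrome-launcher packages from npm, and run the file as an ES module):

```typescript
// Run a Lighthouse performance audit against headless Chrome and print
// the overall performance score (reported on a 0 to 1 scale).
import lighthouse from "lighthouse";
import * as chromeLauncher from "chrome-launcher";

const chrome = await chromeLauncher.launch({ chromeFlags: ["--headless"] });
const result = await lighthouse("https://example.com", {
  port: chrome.port,               // talk to the Chrome we just launched
  onlyCategories: ["performance"], // skip accessibility/SEO for speed
});

if (result) {
  console.log("Performance score:", result.lhr.categories.performance.score);
}
await chrome.kill();
```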

Benefits of Lighthouse:

  • Assists with the diagnosis of performance issues, especially on slower networks and connections
  • Returns data from any page—public or private
  • Tests “performance and best practices” during a page’s first load

Limitations of Lighthouse:

  • Captures a snapshot of data that may or may not be reflective of the general user experience
  • Lab data may not test personalized content like cookies and advertisements at all
  • Traditionally struggles to capture:
    • Pages loaded with a warm cache—meaning after the initial page load
    • User interactions

Lighthouse and CrUX serve different but complementary roles in web performance optimization. Here are the distinctions:

  • Data type: Lighthouse collects lab data in a controlled test environment, while CrUX compiles field data from real Chrome users.
  • Coverage: Lighthouse can audit any page, public or private, while CrUX only covers public sites with significant traffic.
  • What’s measured: Lighthouse primarily reflects a first-load experience, while CrUX reflects real-world visits, including repeat views, and is updated monthly.

This comparison is crucial for developers as they use both tools for optimizing web performance. While these common speed assessment tools vary in data collection and usage, they have a few factors in common:

  • Speed scores are complex and cannot be measured through one metric.
  • Regardless of the data type, there is a high degree of score variability due to personalized page logic, efficacy of local networks, server and/or browser delays, third-party applications, and more. Therefore, there is no single representative user experience or data collection environment.
  • Every speed score calculator will have pros and cons, whether it’s a specific user sample or a rolling average.

Actionable Tips to Improve Your Core Web Vitals

If you're ready to tackle improvements in-house to boost your eCommerce store's performance, we’ve got you covered. Below we outline effective steps to optimize each of the Core Web Vitals, helping to enhance site speed, responsiveness, and stability.

1. Optimize Largest Contentful Paint (LCP)

  • Optimize Images: Compress images to reduce file sizes without sacrificing quality, and consider converting them to more efficient formats like WebP. To learn how, check out our article on serving images in Next-Gen Format (and see the sketch after this list).
  • Efficient Asset Loading: Utilize lazy loading for images and non-critical resources to prioritize loading the most important content first.
  • Improve Server Times: One effective strategy for enhancing server response times is using a Content Delivery Network (CDN). While traditional CDNs mainly handle static content and might operate from a limited number of network locations, there are advanced solutions that overcome these limitations. One such option is our Edge Delivery Engine. Our proxy service is designed to minimize latency and boost server performance through sophisticated caching techniques and advanced performance optimizations. Unlike typical CDNs, Nostra supports both static and dynamic content with its hybrid server-side rendering capabilities. It operates from over 310 edge locations, strategically placed within 50 milliseconds of 97% of the global population. This extensive coverage ensures superior performance and expedited content delivery, making it a robust alternative to traditional CDN solutions.
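
To illustrate the first two tips, here is a browser-side sketch (the selectors and image paths are hypothetical; in practice you would usually set these attributes directly in your HTML or theme):

```typescript
// 1. Preload the likely LCP element (the hero image) so the browser
//    fetches it as early as possible.
const preload = document.createElement("link");
preload.rel = "preload";
preload.as = "image";
preload.href = "/images/hero.webp"; // next-gen format keeps the file small
document.head.appendChild(preload);

// 2. Lazy-load below-the-fold images so they don't compete with the
//    hero image for bandwidth during the initial load.
document.querySelectorAll<HTMLImageElement>("img[data-below-fold]").forEach((img) => {
  img.loading = "lazy";
});
```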

2. Enhance Interaction to Next Paint (INP)

  • Break Down Long Tasks: Split up large, blocking JavaScript tasks into smaller, asynchronous chunks to avoid delaying input responsiveness (see the sketch after this list).
  • Optimize JavaScript Loading and Execution: Minimize or defer JavaScript until needed, and prioritize accessing and processing critical scripts first.
  • Efficient Handling of User Inputs: Use efficient event listeners and manage input handlers properly to ensure a responsive interaction experience.
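
Breaking up long tasks is often the highest-impact INP fix. A minimal sketch, where items and processItem are hypothetical stand-ins for your own work:

```typescript
// Split one long, blocking loop into small chunks that yield back to the
// main thread, so pending clicks and scrolls can be handled in between.
async function processInChunks<T>(items: T[], processItem: (item: T) => void) {
  for (const item of items) {
    processItem(item); // one small unit of work
    // Yield to the event loop so user input is not kept waiting.
    await new Promise((resolve) => setTimeout(resolve, 0));
  }
}
```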

3. Minimize Cumulative Layout Shift (CLS)

  • Specify Dimensions for Media Elements: Assign width and height attributes to images, videos, and other media to prevent unexpected layout shifts (see the sketch after this list).
  • Stable Advertisement Elements: Ensure ad sizes are declared beforehand so content does not reflow unexpectedly when ads load.
  • Dynamic Content Management: Introduce new UI elements below the fold or in ways that do not disrupt the user’s current interaction.
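
Here is a short sketch of the first two tips (the selectors, sizes, and ad container ID are hypothetical; dimensions are normally declared directly in your HTML or theme markup):

```typescript
// Give images explicit dimensions so the browser reserves space before
// the files arrive, instead of shifting content when they render.
document.querySelectorAll<HTMLImageElement>("img.product-photo").forEach((img) => {
  img.width = 600;  // intrinsic width in CSS pixels
  img.height = 600; // intrinsic height; the browser derives the aspect ratio
});

// Reserve a fixed-size box for an ad slot so late-loading creative
// cannot push the surrounding content around.
const adSlot = document.getElementById("ad-banner");
if (adSlot) adSlot.style.minHeight = "250px"; // declared ad height
```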

4. Monitor and Measure

  • Field Tools for Real-World Data: Utilize tools such as Google's PageSpeed Insights, Lighthouse, and the Chrome User Experience Report to monitor these vitals.
  • Conduct Regular Performance Reviews: Regularly review performance data to identify and mitigate issues impacting your Core Web Vitals scores.

What the Google API Document Leak Says About Site Performance

Over the years, Google has provided limited transparency about the impact of site performance and user experience on search engine rankings. While many in the search industry have long speculated that Google uses engagement signals like click-through rate and time on site to inform search rankings, Google has never publicly confirmed these factors. However, after a recent GitHub leak involving thousands of documents from Google's internal Content API Warehouse, many of these ranking signals appear to be confirmed as part of Google’s search algorithm.

Here are the main takeaways: 

  • User engagement "Twiddlers" adjust rankings: Google relies on re-ranking functions called "Twiddlers" to increase or decrease rankings based on quality and engagement data. "NavBoost," for example, adjusts rankings based on user navigation patterns (primarily click signals), which are directly influenced by your website's overall performance. 
  • Not all clicks are the same: Click data can tell Google a lot about the user experience, which is granularized into metrics like "goodClicks" and "badClicks." The metric "lastLongestClicks," in particular, measures engagement time on your website following a click, something a slow loading speed directly dampens.
  • Google tracks engagement data from their Chrome browser: The leak revealed a specific module named "chromeInTotal," which indicates that the search engine utilizes data from Google's own browser to determine rankings. When in doubt, use CrUX to ensure your website's Core Web Vitals are in top shape. 

There's a lot more to unpack from the Google leak, especially if you look beyond the scope of site performance. Many of the revelations confirm notions the SEO community already believed to be true but Google denied, from the existence of site-wide authority scoring to the use of Chrome browser data. But there are also some noteworthy revelations that everyone needs to know, like:

  • Complete version history: Google keeps a complete record of every version of your website that was ever indexed. 
  • 'hostAge' search feature: New websites are "sandboxed" into low-tier rankings, which Google monitors for spammy activities. 
  • Font weight matters: Bold-face text is tracked and used by Google to determine relevance. 
  • 'Registration Info' to moderate used domains: Google also tracks your website's domain registration information, including when it was originally created and its expiration dates. 
  • Location-based demotions: If your website is optimized for a target location, its rankings are lowered for other location-specific keywords. 
  • 'smallPersonalSite' search feature: Google determines whether a website is small or owned by an individual, which can supposedly impact rankings depending on the search query.

While the leak has been confirmed as authentic, none of the documents specify how the leaked ranking features and functions are weighted. We also don’t know how much Google’s internal systems have changed since these documents were created, and Google has declined to comment publicly on the leak. However, it's clear that content quality, brand authority, and user experience remain at the forefront of Google's ranking algorithm.

Nostra’s Effect on Core Web Vitals and Overall Site Performance

Improving Google's Core Web Vitals can be a resource-intensive task that takes considerable time and effort. At Nostra, we offer a streamlined solution that requires no work on your end, allowing you to fast-track these improvements. Our clients on Shopify have experienced general improvements in their Core Web Vitals by 20-30%, while those on Salesforce Commerce Cloud have seen increases of over 40%.

Additionally, we focus on enhancing Time-To-First-Byte (TTFB), a critical measure of web server responsiveness. Our Edge Delivery Engine significantly reduces TTFB, typically by 5-10x, which improves Core Web Vitals and overall user experience.
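
You can spot-check your own TTFB with the standard Navigation Timing API; a quick sketch to paste into the browser's DevTools console:

```typescript
// Read TTFB for the current page from the Navigation Timing API.
const [nav] = performance.getEntriesByType("navigation") as PerformanceNavigationTiming[];
if (nav) {
  // responseStart = milliseconds from the start of navigation to the
  // first byte of the server's response.
  console.log(`TTFB: ${Math.round(nav.responseStart)} ms`);
}
```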

The secret to the effectiveness of our product suite is its sophisticated use of smart caching and advanced performance optimizations. These tools are expertly designed to manage your dynamic eCommerce platform efficiently, ensuring significant enhancements in performance. This strategy doesn't just speed up load times through our cutting-edge infrastructure and intelligent optimizations; it also boosts your SEO rankings and enhances the overall customer experience by aligning with Google’s performance metrics.

Ready to see how Nostra can transform your business with lightning-fast load times and a seamless user experience? Schedule a demo with us today and discover how we can improve your Core Web Vitals and overall bottom line, just like we helped the pioneering hair removal brand Nood achieve a 24% increase in conversions. Check out our case study with them for more details.

