Core Web Vitals - Unmasked, No Hype Just Data

This past week SEO Fight Club, a weekly YouTube show, published its 125th episode. What remains remarkable is how often this show delivers genuinely cutting-edge SEO analysis. Afterwards, my mind was on fire about how much the SEO community needs this kind of work, while also acknowledging that the math and the jargon – words that make you feel like you walked into the wrong class – are what keep it from reaching the audience it deserves.

Core Web Vitals Ranking Factor – Yes or No?

The topic: can we determine if, or when, Core Web Vitals is having an impact on rankings? Short version: at this time, the data does not support the claim that Core Web Vitals is impacting rankings.

So how does one get that kind of data? We’ll get to that.

I think it is important to amplify this information because, judging by the number of articles and the general lamenting on social media, a large number of businesses, SEO tools and agencies are spending an enormous amount of time, energy and money on making their sites score better in the Core Web Vitals section of Google's Search Console.

Earlier in the summer, on a June 2021 episode of my Confessions of an SEO podcast, I shared what I thought in general about Core Web Vitals – basically that spending so much time on CWV is, in my opinion, the equivalent of chasing the green lights in Yoast: the belief that once you have all green lights, you're done.

What is being pushed heavily is the idea that IF your pages score better on Core Web Vitals in Search Console, THEN your pages will rank better.

What problems are Core Web Vitals trying to solve?

Basically, this is a component of page speed, but measured from the client side. How well does YOUR website perform on the device used to access your page? In short, it depends on a ton of things outside of your direct control.

So you're tuning the technology that your site is built on so that it performs within a set of pre-determined ranges. That's how technology works: there are specifications, and things are supposed to perform within a set of tolerances. I experienced this in optical media production – there was the Red Book standard for audio CDs and other Rainbow Book standards for later disc formats.

You can imagine a desktop computer will probably render your content, over its connection and hardware, pretty well. Likewise, someone on an old cell phone standing at a bus stop might have some issues – and not just on YOUR site. Everything rendered on that old phone, no matter how fast it should be, is just not going to perform the same way. CWV tries to anticipate those issues and, once there is enough data, gives feedback that says, basically, "you could do better."

I get it – if there is some ranking-factor power to CWV, then yeah, put some effort into it.

But what if there is no there there, at least, right now in Core Web Vitals? 

How would we know either way?

That's the gist of the SEO Fight Club episode. I'm going to put the URL here so you can watch it yourself.

Now, if you're like me, when words like "correlation coefficients" get tossed around, I need a scorecard so I know what's going on and can keep up with the significance. So here is a synopsis of the show, which you can and should watch.

Ted Kubaitis is the host of SEO Fight Club.

In this episode, the data was gathered and analyzed by Lee Witcher, arguably one of the first independent SEO researchers. He has spent the last three years building a trove of SEO tests designed to tease out of the Google algorithm (aka the rules) some clues as to whether what is typically believed in SEO to be a ranking factor is indeed a ranking factor.

Google Measures Website Performance Based on Three Values

This type of analysis would be difficult, if not impossible, to do without an essential SEO tool developed by Ted Kubaitis: Cora SEO software. A pioneer of competitive intelligence analysis, Ted believes that scientific, data-driven SEO is the current and future path to success in SEO, and his own success is hard to argue against.

With Cora, one can run a keyword search and pull back the top 100 results along with various on-page and off-page factors – some require APIs, but even without them, what you get is impressive.

So Lee Witcher took his "box of rocks" search phrases, which he has been tracking for the past three years: 40 keyword searches across 40 niches that cover a wide range from commercial to informational, highly competitive to low-value.

[Screenshot: video slide listing all the search phrases used in this study]

So the idea is that if Core Web Vitals is live, then pages with better scores "should" rank toward the top of the search results. Conversely, pages with lower scores should rank lower on the search results page.

He ran all 40 searches to collect the top 100 results for each (yes, that's 4,000 data points), then collected the Core Web Vitals of each of those 4,000 URLs, and then ran correlation formulas comparing their positions in the search results (1–100) against their CWV scores.
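To make the shape of that dataset concrete, here is a minimal Python sketch. This is not Lee's or Cora's actual code; `fetch_top_100` is a hypothetical stand-in for whatever tool pulls the SERP.

```python
# Sketch of the study's data-collection step, assuming a hypothetical
# fetch_top_100(keyword) that returns up to 100 URLs, best rank first.
def build_dataset(keywords, fetch_top_100):
    """Return rows of (keyword, rank_position, url) for each search."""
    rows = []
    for kw in keywords:
        results = fetch_top_100(kw)  # list of URLs, position 1 first
        for position, url in enumerate(results, start=1):
            rows.append((kw, position, url))
    return rows

# With 40 keywords and 100 results each, this yields the 4,000 rows
# described above, each carrying the rank position to correlate against.
```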

Simple right?

Did I mention that Lee is somewhat of a data expert?

He can squeeze information out of a spreadsheet full of data so fast that most of us, even seeing it done with our own eyes, can only believe what he did just had to be magical.

I cry “Uncle” on your behalf.

He then found an API that could run overnight, look up each of those 4,000 URLs, and return Google's own PageSpeed Insights scores. There are four scores in play: the three Core Web Vitals components, plus a composite score combining all three.
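The overnight lookup was most likely against Google's public PageSpeed Insights API (v5). As a hedged sketch, here is how one could build the per-URL request; the endpoint is the documented v5 `runPagespeed` endpoint, while the function name and defaults are my own illustration.

```python
from urllib.parse import urlencode

# Documented PageSpeed Insights v5 endpoint
PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_request_url(page_url, api_key, strategy="mobile"):
    """Build the request URL to fetch PageSpeed Insights data for one page."""
    params = urlencode({"url": page_url, "key": api_key, "strategy": strategy})
    return f"{PSI_ENDPOINT}?{params}"
```

In practice you would loop over the 4,000 URLs, fetch each request URL (e.g. with `urllib.request.urlopen`), and pull the metric values out of the JSON response – check the API docs for the exact response fields and mind the daily quota, which is why a run like this happens overnight.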

[Screenshot: the LCP, FID and CLS score ranges]

So Lee compared, on a per-URL basis, each of these component scores against the rank position of the URL.

He did that four times – once for each of the three components, and once for the composite score.

So now each URL has a ranking position from 1 to 100 – 1 being the URL in the #1 spot and 100 being the LAST URL in the SERP – and he ran a correlation formula comparing each of the scores against that URL's ranking position.


Basically, a correlation formula is a mathematical way of measuring how strongly related two things are to each other – or whether they are related at all.

Lucky for us this kind of formula is built into spreadsheet programs. No math degree is required.

So the question Lee is trying to answer: if this particular URL is in the #1 spot, and this same URL has these CWV scores, is the correlation between score and rank position strong enough to suggest the score is influencing the page's rank (another way of saying statistically significant), OR is the relationship random?

These formulas have names – Pearson's and Spearman's – and they are applicable across any number of cases where something can be measured and then evaluated. I include that so you know they aren't just pulling something out of the air. These are established, standard ways to determine whether a relationship is random or not.
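For readers who want to see what those two formulas actually compute, here is a minimal Python sketch (not the show's code – spreadsheet functions like CORREL do the same thing). Pearson measures linear association; Spearman is just Pearson applied to the ranks of the values.

```python
from math import sqrt

def pearson(x, y):
    """Pearson correlation: strength of the linear relationship, -1 to +1."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def spearman(x, y):
    """Spearman correlation: Pearson on the ranks, so it detects any
    monotonic relationship, not just a linear one."""
    def ranks(values):
        order = sorted(range(len(values)), key=values.__getitem__)
        r = [0.0] * len(values)
        for rank, idx in enumerate(order, start=1):
            r[idx] = float(rank)
        return r
    return pearson(ranks(x), ranks(y))
```

(This simple version ignores ties; real statistical libraries average tied ranks.) If CWV scores drove rank, feeding rank positions 1–100 and the matching scores into either function should return something meaningfully far from zero.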

First up is First Input Delay – a scatter plot where each result of that calculation is mapped, moving from left to right: the data from the highest-ranking pages on the left, the data from the lowest-ranking page last on the right.

[Chart: First Input Delay vs. rank position]

Do you see a pattern? 

[Charts: Cumulative Layout Shift, Largest Contentful Paint, and Performance Score vs. rank position]

OK, so even without a math or science degree, we can see with our own eyes that each of these appears random – specifically, none rises to the level of statistical significance, which is the red line in these charts. That red line is another data-analysis standard: the threshold between randomness and a significant relationship.
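As an aside, that threshold can be reconstructed. Assuming the red line marks the usual p = 0.05 (two-tailed) significance cutoff for a correlation over roughly 100 results per search – my assumption, not something stated in the episode – a rough sketch:

```python
from math import sqrt

def critical_r(n, t_crit=1.984):
    """Approximate |r| threshold for significance at p = 0.05, two-tailed.
    t_crit is the Student t critical value for df = n - 2; 1.984 is the
    tabulated value for df around 98, i.e. ~100 results per SERP."""
    df = n - 2
    return t_crit / sqrt(t_crit ** 2 + df)

# For 100 results per search, any |r| below roughly 0.197 is
# indistinguishable from noise at the 5% level.
```

In other words, the correlations in these scatter plots would need to clear roughly ±0.2 before anyone could claim a real relationship – and they don't.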

In fact, looking at Lee's data, I almost fell out of my chair when Ted noted that a lot of the better scores sit lower down in the SERPs – the data could make one think CWV is a negative ranking factor! LOL!

Then the conversation turned to how do we know Core Web Vitals is even rolled out? 

Here's what John Mueller said about CWV just this past week (Sept 27th), how it might impact a site, and the timeline of when the Core Web Vitals update was pushed out:

"The reason I don't think that would be related is because we started rolling this update out I think in July and it was finished at the end of August."


If it's been live since the end of August, we're either looking at a super-weak ranking factor, one that may build up over time, or Core Web Vitals isn't at all what Google has told us it is. The truth is we don't know, based on comparing what Google is saying to what we can actually observe in the SERPs.

So this begs the question: why are so many SEOs and software tools investing in CWV improvements for their pages?

What I appreciate about this group of SEOs is that they aren't saying unhelpful things just to be disagreeable or sensational – they really don't care whether Core Web Vitals works or not. They just want to see data showing that it does impact rankings, and when it does, THAT is the time to invest.

And this data lines up with John Mueller's explanation that Core Web Vitals is not going to be responsible for sudden and deep drops in ranking. That makes sense.

Lee Witcher is going to rerun this analysis in about six months, because IF and WHEN Core Web Vitals starts to make a difference in rankings, this is how we're going to see it and know it's a thing.

I don’t know about you but that’s pretty exciting to this SEO.