How to Determine if a Page Is Low Quality in Google’s System


What are the factors Google considers when weighing whether a page is high or low quality, and how can you identify those pages yourself? There’s a laundry list of things to examine to determine which pages make the grade and which don’t, from searcher behavior to page load times to spelling mistakes. Rand covers it all in this episode of Whiteboard Friday.

Video Transcription

Howdy, e-tailwebstores fans, and welcome to another edition of Whiteboard Friday. This week we’re going to chat about how to figure out if Google thinks a page on a website is potentially low quality and if that could lead us to some optimization options.

So we’ve talked about this previously here on Whiteboard Friday, and I’m sure many of you have been following along with the experiments that Britney Muller from e-tailwebstores has been conducting on removing low-quality pages. You also saw Roy Hinkis from SimilarWeb talk about how they removed low-quality pages from their site and saw an increase in rankings on a bunch of stuff. So many people have been trying this tactic. The challenge is figuring out which pages are actually low quality. What does that constitute?

How can SEOs & marketers filter pages on sites to ID high vs. low quality?

As a marketer, as an SEO, there’s a process we can use. We don’t have access to every single one of the components that Google can measure, but we can look at some signals that will help us decide: this is high quality, this is low quality, and maybe I should try deleting this page from my site, or recreating it, if it is low quality.

In general, I’m going to urge you NOT to use things like:

A. Time on site, raw time on site

B. Organic visits

C. Assisted conversions

D. Raw bounce rate

Why not? Because by themselves, all of these can be misleading signals.

So a long time on site could be because somebody is deeply engaged with your content. It could also be because someone is immensely frustrated and can’t find what they need, so they’ll go back to the search results and click something different that answers their question quickly and accessibly. Maybe you have lots of pop-ups, they have to click to close them, the X button is hard to find, and they have to scroll down far in your content. So they’re very unhappy with your result.

Bounce rate works similarly. A high bounce rate could be a fine thing if you’re answering a very simple query, if the next step is to go somewhere else, or if there is no next step. If I’m just trying to learn, “Hey, I need some pressure washing tips for this type of treated wood, and I need to know whether I’ll remove the treatment if I pressure wash the wood at this level of pressure,” and it turns out no, I’m good. Great. Thank you. I’m all done. I don’t need to visit your site any longer. My bounce rate was high. Maybe you have a bounce rate in the 80s or 90s percent, but you’ve answered the searcher’s query. You’ve done what Google wants. So bounce rate by itself is a bad metric.

Same with organic visits. You could have a page that is relatively low quality but gets a decent amount of organic traffic for some reason, and that could be because it’s still ranking for something or because it ranks for a bunch of long-tail stuff, yet it is disappointing searchers. This one is a little bit better over the longer term. If you look at it over weeks or months rather than days, you can generally get a better sense, but still, by itself, I don’t love it.

Assisted conversions are a great example. A page might not convert anyone directly. It might be a chance to drop cookies. It might be a chance to remarket or retarget to someone, or to get them to sign up for an email list, but it may not convert directly into whatever goal conversions you have. That doesn’t mean it’s low-quality content.

What constitutes “quality” for Google?

So Google has some ideas about what’s high quality versus low quality, and a few of those are pretty obvious and familiar to us, and some of them may be more intriguing. So…

  • Google wants unique content.
  • They want to make sure that the value to searchers from that content is actually unique, not that it’s just different words and phrases on the page, but the value provided is actually different. You can check out the Whiteboard Friday on unique value if you have more questions on that.
  • They like to see lots of external sources linking editorially to a page. That tells them that the page is probably high quality because it’s reference-worthy.
  • They also like to see high-quality pages linking to this one, not just high-quality sources or domains. Those can be internal or external links, so it tends to be the case that if the high-quality pages on your own website link to another page on your site, Google often interprets that page as higher quality too.
  • The page successfully answers the searcher’s query.

This is an intriguing one. So if someone performs a search, let’s say here I type in a search on Google for “pressure washing.” I’ll just write “pressure wash.” This page comes up. Someone clicks on that page, and they stay here and maybe they do go back to Google, but then they perform a completely different search, or they go to a different task, they visit a different website, they go back to their email, whatever it is. That tells Google, great, this page solved the query.

If instead someone searches for this and they go, they perform the search, they click on a link, and they get a low-quality mumbo-jumbo page and they click back and they choose a different result instead, that tells Google that page did not successfully answer that searcher’s query. If this happens a lot, Google calls this activity pogo-sticking, where you visit this one, it didn’t answer your query, so you go visit another one that does. It’s very likely that this result will be moved down in the rankings and perceived as low quality by Google.

  • The page has got to load fast on any connection.
  • They want to see high-quality accessibility with intuitive user experience and design on any device, so mobile, desktop, tablet, laptop.
  • They want to see actually grammatically correct and well-spelled content. I know this may come as a surprise, but we’ve actually done some tests and seen that by having poor spelling or bad grammar, we can get featured snippets removed from Google. So you can have a featured snippet, it’s doing great in the SERPs, you change something in there, you mess it up, and Google says, “Wait, no, that no longer qualifies. You are no longer a high-quality answer.” So that tells us that they are analyzing pages for that type of information.
  • Non-text content needs to have text alternatives. This is why Google encourages use of the alt attribute (there’s a quick sketch for spot-checking missing alt text just below). This is why on videos they like transcripts. Here on Whiteboard Friday, as I’m speaking, there’s a transcript down below this video that you can read and get all the content without having to listen to me if you don’t want to, or if you aren’t able to for whatever technical or accessibility reasons.
  • They also like to see content that is well-organized and easy to consume and understand. They interpret that through a bunch of different things, but some of their machine learning systems can certainly pick that up.
  • Then they like to see content that points to additional sources for more information or for follow-up on tasks or to cite sources. So links externally from a page will do that.

This is not an exhaustive list. But these are some of the things that can tell Google high quality versus low quality and start to get them filtering things.
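Some of these factors are easy to spot-check yourself. As one small example, here’s a minimal sketch (in Python, using the requests and beautifulsoup4 libraries) of the text-alternatives check mentioned above: it flags images on a page that are missing alt text. The URL is a placeholder, and this illustrates just one accessibility check, not how Google itself evaluates pages.

```python
# Sketch: flag images missing alt text on a single page, one of the
# text-alternative checks described above. Requires the requests and
# beautifulsoup4 packages; the URL is a placeholder.
import requests
from bs4 import BeautifulSoup

html = requests.get("https://www.example.com/some-page/", timeout=10).text
soup = BeautifulSoup(html, "html.parser")

# Collect the src of every <img> with an empty or absent alt attribute.
missing_alt = [img.get("src") for img in soup.find_all("img")
               if not img.get("alt", "").strip()]
for src in missing_alt:
    print("Image missing alt text:", src)
```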

 

THESE can be a good start:

So what I’m going to urge you to do is think of these as a combination of metrics. Any time you’re analyzing for low versus high quality, have a combination of metrics approach that you’re applying.

1. That could be a combination of engagement metrics, pulled together along the lines of the sketch after this list. I’m going to look at…

  • Total visits
  • External and internal visits
  • I’m going to look at the pages per visit after landing. So if someone gets to the page and then they browse through other pages on the site, that is a good sign. If they browse through very few, not as good a sign, but not to be taken by itself. It needs to be combined with things like time on site and bounce rate and total visits and external visits.
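As a hedged illustration of that combined-metrics approach, here’s a minimal Python sketch that flags a page for review only when several engagement signals are weak at once. It assumes a CSV export from your analytics package; the file name, column names, and thresholds are all hypothetical and should be tuned to your own site.

```python
# Sketch: flag candidate low-quality pages only when several engagement
# signals are weak at once, since any single metric can mislead (a high
# bounce rate may just mean the page answered the query). File name,
# column names, and thresholds are illustrative, not from any product.
import pandas as pd

df = pd.read_csv("engagement_export.csv")
# Expected columns: url, total_visits, pages_per_visit,
# avg_time_on_site (seconds), bounce_rate (0-1)

weak_signals = (
    (df["total_visits"] < df["total_visits"].quantile(0.25)).astype(int)
    + (df["pages_per_visit"] < 1.5).astype(int)
    + (df["avg_time_on_site"] < 15).astype(int)
    + (df["bounce_rate"] > 0.90).astype(int)
)

# Require at least three weak signals before a page becomes a candidate.
candidates = df[weak_signals >= 3].sort_values("total_visits")
print(candidates[["url", "total_visits", "bounce_rate"]].head(20))
```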

2. You can combine some offsite metrics with those (there’s a sketch continuing the one above after this list). So things like…

  • External links
  • Number of linking root domains
  • PA (Page Authority) and your social shares, like Facebook, Twitter, and LinkedIn share counts, can also be applicable here. If you see something that’s getting social shares, maybe it doesn’t match up with searchers’ needs, but it could still be high-quality content.
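Continuing the engagement sketch above, you might join these offsite metrics onto the candidate list so that a page with genuine external endorsement never gets flagged on weak engagement alone. Again, the file names, column names, and thresholds are assumptions for illustration.

```python
# Continuing the engagement sketch: join offsite metrics (illustrative
# file and column names) so pages with real external endorsement stay
# off the low-quality candidate list.
links = pd.read_csv("link_metrics_export.csv")    # url, external_links, linking_root_domains, page_authority
shares = pd.read_csv("social_shares_export.csv")  # url, fb_shares, tw_shares, li_shares

offsite = links.merge(shares, on="url", how="outer").fillna(0)
offsite["total_shares"] = offsite[["fb_shares", "tw_shares", "li_shares"]].sum(axis=1)

# Links or shares suggest the content may be high quality even if
# engagement looks weak, so drop endorsed pages from the candidates.
endorsed = (offsite["linking_root_domains"] >= 2) | (offsite["total_shares"] >= 10)
review_list = candidates[~candidates["url"].isin(offsite.loc[endorsed, "url"])]
print(review_list[["url"]].head(20))
```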

3. Search engine metrics (a sketch for pulling these in bulk follows this list). You can look at…

  • Indexation by typing a URL directly into the search bar or the browser bar and seeing whether the page is indexed.
  • You can also check whether pages rank for their own titles.
  • You can look in Google Search Console and see click-through rates.
  • You can look at unique versus duplicate content. So if I type in a URL here and I see multiple pages come back from my site, or if I type in the title of a page that I’ve created and I see multiple URLs come back from my own website, I know that there’s some uniqueness problems there.
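If you’d rather pull click-through rates in bulk than check pages one at a time, the Google Search Console Search Analytics API exposes the same data you see in the Search Console interface. Here’s a sketch assuming you’ve already created OAuth credentials for a verified property; the dates, thresholds, and property URL are placeholders.

```python
# Sketch: pull per-page impressions and CTR from the Google Search
# Console Search Analytics API. Assumes the google-api-python-client
# and google-auth packages, plus an existing token.json holding
# authorized credentials for the property (setup steps omitted).
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

creds = Credentials.from_authorized_user_file(
    "token.json", ["https://www.googleapis.com/auth/webmasters.readonly"]
)
service = build("searchconsole", "v1", credentials=creds)

response = service.searchanalytics().query(
    siteUrl="https://www.example.com/",  # placeholder property URL
    body={
        "startDate": "2024-01-01",
        "endDate": "2024-03-31",
        "dimensions": ["page"],
        "rowLimit": 5000,
    },
).execute()

# Pages with plenty of impressions but a very low CTR rank for
# something yet rarely win the click: good candidates for hand review.
for row in response.get("rows", []):
    if row["impressions"] > 100 and row["ctr"] < 0.01:
        print(row["keys"][0], row["impressions"], f"{row['ctr']:.2%}")
```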

4. You are almost definitely going to want to do an actual hand review of a handful of pages.

  • Pull pages from subsections, subfolders, or subdomains, if you have them, and ask, “Oh, hang on. Does this actually help searchers? Is this content current and up to date? Is it meeting our organization’s standards?”
