What do you think of SearchMetrics' claim that there are no longer universal ranking factors?
-
I agree that Google's machine learning/AI means that Google is using a more dynamic set of factors to match searcher intent to content, but this claim feels like an overstatement:
"Let's be quite clear: Except for important technical standards, there are no longer any specific factors or benchmark values that are universally valid for all online marketers and SEOs. Instead, there are different ranking factors for every single industry, or even every single search query. And these now change continuously."
Keyword-relevant content, backlinks, etc. still seem to be ranking factors across pretty much all queries/industries. For example, I can't think of a single industry where it would be a good idea to try to rank for [keyword] without including [keyword] in the visible text of the page. Also, websites that rank without any backlinks are incredibly rare (unheard of for competitive terms).
Doubtless some factors change (eg Google may favor webpages with images for a query like "best hairstyle for men" but not for another query), but other factors still seem to apply to all queries (or at least 95%+).
Thoughts?
-
Were they referencing Rank Brain in their article? The statement sounds similar to an explanation given on what Rank Brain is and how it impacts search. It does seem like a bit of hyperbole but I see their point and I agree with it to a certain extent. I believe the purpose of a machine learner is to continuously innovate without human intervention so that improvements are made while you sleep. It's my understanding that Rank Brain does this based on feedback from users. It's the perfect solution to handling the complexity of search, and would result in a continuously changing algorithm.
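To make that last point concrete, here's a toy sketch (illustrative only; RankBrain's internals are not public, and every name and number below is hypothetical) of what "a continuously changing algorithm" driven by user feedback might look like: factor weights that get nudged by engagement signals and renormalized, so there is never one fixed set of weights.

```python
# Hypothetical factor weights -- not Google's actual factors or values.
weights = {"keyword_match": 0.5, "backlinks": 0.3, "freshness": 0.2}

def update(weights, feedback, lr=0.1):
    """Nudge each factor weight toward observed engagement, then renormalize.

    feedback maps factor name -> signed engagement signal (positive means
    users responded well when that factor drove the result).
    """
    adjusted = {f: w + lr * feedback.get(f, 0.0) for f, w in weights.items()}
    total = sum(adjusted.values())
    return {f: w / total for f, w in adjusted.items()}

# One (made-up) batch of feedback: users engaged more with fresh results,
# slightly less with results that ranked mainly on backlinks.
weights = update(weights, {"freshness": 0.4, "backlinks": -0.1})
print(weights)  # the weights have drifted; each new batch drifts them again
```

Run this in a loop over incoming feedback and the "algorithm" is a moving target, which is roughly the behavior the comment describes.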
I do see a lot of websites ranking without backlinks. Try any local home services query - they're mostly propped up by citations, which is a little different from your standard backlink.
-
Agreed, I also see their point to some extent. I think Google's ranking factors are much more dynamic than they used to be. Google's rankings are also becoming far more intuitive and less metrics-driven (eg keyword density). SEO studies are increasingly having trouble explaining Google's algorithm. For example, we all know that social shares and engagement metrics correlate strongly with Google rankings, but nobody is quite sure what the mechanism for that is.
"Likewise, if you're a local plumber and the top results have 1 or 2 referring domains but great content, ranking is going to take more focus on quality onsite than the car hire example."
Or, maybe they are ranking in spite of not having links, and if you get great content + 5 links you'll be #1...hard to say!
"what it takes to rank in each one will require different strengths and weaknesses"
Agreed, because Google is getting close to actually measuring what the searcher wants. i.e. Google has some way of knowing (through user interaction data, maybe?) that a person searching for "hair styles 2016" wants a photo-heavy article, but a person searching for "barack obama policies" wants a long form text article. Yet, IMO, keyword in text and backlinks will be important factors in both cases.
-
I wouldn't say that I strictly agree with it but I do see their point.
The way I look at it is quite similar, though from a slightly different angle. For any given vertical, where you rank is entirely relative to the other sites presented in that query.
For example, if you're in the car hire industry and all of your competitors have incredible link profiles and passable onsite factors, then for your industry, your link profile is going to be an important ranking factor for you.
Likewise, if you're a local plumber and the top results have 1 or 2 referring domains but great content, ranking is going to take more focus on quality onsite than the car hire example.
Now, obviously the approach to outranking another site shouldn't be to just copy what they do; you should be exploiting their weak points. But no amount of great content is going to push your car hire company above a competitor with 2,000 legitimately quality referring domains!
What this all means is that while Google may not be directly measuring each vertical differently, what it takes to rank in each one will require different strengths and weaknesses. This is conjecture so take from it what you will; it's mostly just my 2c and viewpoint on the whole thing.
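That idea can be sketched with a toy model (all weights and figures here are made up for illustration, not Google's): even a fixed, universal scoring formula makes different factors "matter most" in different verticals, simply because ranking is a comparison against that vertical's competitors.

```python
def score(page):
    # Hypothetical universal weights: links and content count in every query.
    return 0.6 * page["referring_domains_norm"] + 0.4 * page["content_quality"]

# Car hire: competitors have huge link profiles, passable content.
car_hire_competitor = {"referring_domains_norm": 0.9, "content_quality": 0.5}
# Local plumber: competitors have 1-2 referring domains but great content.
plumber_competitor = {"referring_domains_norm": 0.1, "content_quality": 0.8}

# Your page, identical in both markets:
you = {"referring_domains_norm": 0.3, "content_quality": 0.7}

print(score(you) > score(car_hire_competitor))  # False: links are the gap to close
print(score(you) > score(plumber_competitor))   # True: content carries you
```

Same formula both times, yet in one vertical links look like "the" ranking factor and in the other content does, which is the strengths-and-weaknesses point above.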
-
It's marketing hyperbole.