I need help compiling solid documentation and data (if possible) that having tons of orphaned pages is bad for SEO - Can you help?
-
I spent an hour this afternoon trying to convince my CEO that having thousands of orphaned pages is bad for SEO. His argument was "If they aren't indexed, then I don't see how it can be a problem."
Despite my best efforts to convince him that thousands of them ARE indexed, he simply said "Unless you can prove it's bad and prove what benefit the site would get out of cleaning them up, I don't see it as a priority."
So, I am turning to all you brilliant folks here in Q & A and asking for help... and some words of encouragement would be nice today, too.
Dana
-
Agreed on all counts Jason, not to mention the improved customer experience because we won't have people landing on those God-awful ugly and useless pages!
From a server perspective, could deleting 8,000 files (pages, images, PDFs) result in our site speed improving too? Or would it likely have no impact?
-
So you have roughly 8,500 pages (per Screaming Frog) that are part of your customer experience, that you want customers to be able to navigate to from your site, and that you presumably would like customers to find on Google.
But only 7,500 pages are in Google's index. So best case, roughly 1,000 of your good pages (almost 12% of all the pages on your site) don't exist in organic search. Worst case, some of those 7,500 pages in Google's index are deprecated pages that aren't part of your active site, making the percentage of live pages in Google even worse.
It's very possible that a portion of your Google crawl budget is being consumed by pages that don't help you. If you get those pages out of the index, you stand a better chance of getting your 1,000 good pages into it.
-
Hi Jason,
Ok, here is what I saw in Screaming Frog:
27,616 total spidered URLs, of which:
- 8,494 are HTML pages
- 45 are CSS files
- 14,687 are images
- 4,287 are PDFs
Google says we have only 7,540 URLs indexed (of all types) - I know for a fact that at least 500 orphaned pages are indexed in Google. It seems to me, then, that Google is indexing content that isn't important to us, and perhaps not indexing other content that is important to us because it's having trouble telling what's important and what's not.
Any insights on that Jason? What do you make of it?
-
Hi Jason,
I'm just following up as I get my ducks in a row on this one. Above in your comment you said "Google Count of Pages - Screaming Frog count of Pages = # of Orphaned Pages" - to be perfectly accurate, this would only give me the number of orphaned pages that are indexed. There could be many additional orphaned pages that are not in Google's index.
My follow-up question is, should I be concerned about those too? Or are orphaned pages that aren't indexed not worth cleaning up? I think I already know the answer (Yes! Clean those up too because they can interfere with crawl rate and site speed...) but I want to know your take on it, please. Thanks so much!
Dana
-
Tempting! Very tempting. :-)
-
I would not do this if I were an employee... but... I would ask him to bet me an amount equivalent to about "one month's pay" on the results.
He is a chicken so he wouldn't accept that bet. And if he did accept I would want it in writing.
-
Thanks EGOL. You made me chuckle, because all of these things crossed my mind. I did go home mad yesterday, and I don't get mad very easily or very often. I usually welcome the idea of explaining SEO strategies and tactics to newbies and laypeople (as is evidenced by my many posts here in Q & A).
Let's just say - my feelers are out looking at other possibilities.
-
In my opinion, the links are still evaporating PageRank.
If some of these pages are still in the index, they could be counting as thin/duplicate content.
-
What would your response be to that?
* thinks for a while *
I would be mad about this. This is why I prefer to be self-employed.
I don't know the temperament or personality of this person.
I might not be working there much longer.
It seems to me that the effort required to cut links into these pages is tiny and the potential for gain is pretty high.
Downside risk is zero. Upside opportunity is good. He is a chicken and a fool.
-
EGOL, I thought I would just follow up on these thin content "Reviews/Ratings" pages. They are blocked from Google crawling them via the robots.txt file. Is this enough? Or are they still diluting the product page's authority just by being there?
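For context, the block in our robots.txt looks roughly like this (the path is illustrative, not our actual one):

```
# Hypothetical rule - the path is an example, not our real one
User-agent: *
Disallow: /reviews/
```

My understanding is that a Disallow stops Googlebot from crawling those URLs, but it doesn't remove ones that are already indexed - is that right?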
Thanks!
Dana
-
Thanks EGOL,
And yes, they are.
The comment I received when trying to explain that those links were draining authority off the product pages was "No they aren't. Whatever PageRank the product page has, it has, regardless of whether the links are there or not."
What would your response be to that? I tried to explain it several different ways, but he just looked at me like I was full of malarkey... He is a visual person. Perhaps I should try a diagram?
It's difficult going into a situation like this when the opening premise in the other person's mind is that he knows more about SEO than I do, because, in his mind, SEO is just a bunch of guesswork.
Sorry, morale's a bit low in my heart at the moment. I work too hard and study too hard at what I do to have someone who maybe reads a blog about SEO occasionally come in and treat me like I have no idea what I'm talking about.
Thanks very much for responding. I appreciate it mucho!
Dana
-
Thanks Jason,
These are great suggestions and are exactly the kinds of things that will give me the proof I need to convince him that removing these is a worthwhile endeavor. I'm off to do them now and will come back here and post my discoveries.
Dana
-
Are these those thin content, duplicate content, review and email pages?
There are links into those pages that are evaporating PageRank.
Two links on each of your product pages are being wasted.
If they are getting indexed, then they are dead weight on your site and make your site look like a skimpy, spammy publisher.
-
By "orphaned" do you mean pages that are no longer linked to your site navigation taxonomy?
If you know the subject matter and/or URLs, you can easily show your boss that they are indexed: Google "site:oursite.com orphaned topic" and show him all the pages in Google's index.
If you can't find the pages, then do a complete crawl of your site with Screaming Frog and see how many pages it finds. Now compare that number with how many pages Google has in its index in Google Webmaster Tools (under Health -> Index Status). Google Count of Pages - Screaming Frog count of Pages = # of Orphaned Pages.
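To make that subtraction more reliable, you can diff the actual URL lists instead of the raw counts (raw counts can hide offsetting errors, e.g. unindexed good pages masking indexed orphans). A quick Python sketch, with placeholder URLs standing in for your real exports:

```python
# Set-difference version of the count formula: compare the actual URL
# lists. URLs that Google has indexed but that a Screaming Frog crawl
# never reached are your orphaned-but-indexed pages.
# All URLs below are placeholders, not real data.

def find_indexed_orphans(crawled_urls, indexed_urls):
    """Return indexed URLs that internal links don't reach."""
    return set(indexed_urls) - set(crawled_urls)

crawled = {
    "https://www.example.com/",
    "https://www.example.com/products",
}
indexed = {
    "https://www.example.com/",
    "https://www.example.com/old-orphaned-page",
}

print(sorted(find_indexed_orphans(crawled, indexed)))
# ['https://www.example.com/old-orphaned-page']
```

The same function run in the other direction (`crawled - indexed`) gives you the good pages that are missing from the index.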
Now to see if those pages are hurting you, run them through Open Site Explorer to see if any of them have backlinks. If so, they are diluting your SEO efforts. Even if not, look at your crawl stats in Google Webmaster Tools under Health and see how many pages are getting crawled per day. If that's only a fraction of your total pages, then getting rid of the orphaned pages could get your important pages crawled more regularly.
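To make that crawl-stats check concrete, here's the coverage arithmetic as a sketch. The total URL count is the Screaming Frog figure from this thread; the daily crawl rate is a hypothetical number - substitute the real one from Webmaster Tools:

```python
# Rough crawl-coverage arithmetic. total_urls is from the Screaming
# Frog crawl in this thread; crawled_per_day is a hypothetical figure,
# to be replaced with the real rate from GWT's crawl stats.

total_urls = 27_616      # all URLs Screaming Frog spidered
crawled_per_day = 900    # hypothetical "pages crawled per day"

daily_coverage = crawled_per_day / total_urls
days_per_full_crawl = total_urls / crawled_per_day

print(f"Daily crawl coverage: {daily_coverage:.1%}")
print(f"Days for a full recrawl: {days_per_full_crawl:.0f}")
```

If a large share of that daily budget is going to orphaned junk, the recrawl interval for the pages you actually care about stretches out accordingly.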
I hope that helps.
Jason "Retailgeek" Goldberg