Google has indexed some of our old posts. What took so long and will we lose rank for their brevity?
-
Hi,
We just had a few of our old blog posts indexed by Google. These are short-form posts, and I want to make sure we're not going to get dinged by Google for their length. Can you advise?
https://www.policygenius.com/blog/guaranteed-issue
-
No problem. Screaming Frog (or any crawler) won't pick it up, because it's not being linked to within the website (it's an "orphaned" page).
Google could still index them because they are in the sitemap, but it took so long because they are not actually linked to from the website.
So... if it's not supposed to be indexed at all in the first place, you can add a meta "noindex" tag to the page and remove it from the sitemap. Then you'll be all set.
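For reference, the noindex tag described above is a single line in the page's `<head>` (a sketch; the `follow` directive is a common choice, not a requirement):

```html
<!-- In the <head> of the page that should not be indexed -->
<meta name="robots" content="noindex, follow">
```

Pair this with removing the URL's `<url>` entry from the sitemap, so Google isn't told to index a page that then asks not to be indexed.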
-
Thanks, Dan!
Is there a reason why Screaming Frog wouldn't pick it up, even though it was in the sitemap to be crawled?
Google has actually indexed two URLs for these pages (trailing slash and non-trailing slash). We've got our 301 redirects and canonical tags set up correctly, though.
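For context, the canonical setup mentioned above typically looks like this (a sketch; the URL is illustrative):

```html
<!-- Served on BOTH the trailing-slash and non-trailing-slash variants,
     pointing at the single preferred version of the page -->
<link rel="canonical" href="https://www.policygenius.com/blog/guaranteed-issue">
```

Server-side, the non-preferred variant should also return a 301 to the preferred one, so that link signals consolidate onto a single URL.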
To add another layer to this puzzle: it looks like this URL shouldn't be indexed at all, since it isn't the home of the content, which is instead referenced in other places, like here: https://www.policygenius.com/blog/glossary/life-insurance.
Appreciate the help!
-
It was probably indexed so late because Google couldn't find it.
I just crawled the whole site with Screaming Frog and that URL wasn't picked up in the crawl --> http://screencast.com/t/xzunkNR3K
But it's in your sitemap --> https://www.policygenius.com/blog/post-sitemap.xml - so it makes total sense that it took Google so long to find it.
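The check described above (URLs present in the sitemap but never reached by a crawler, i.e. orphaned pages) can be sketched in a few lines of Python. The sitemap and crawled-URL set here are illustrative stand-ins for the real post-sitemap.xml and a Screaming Frog URL export:

```python
from xml.etree import ElementTree

# Sitemap files use this XML namespace on every element
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def orphaned_urls(sitemap_xml: str, crawled_urls: set) -> set:
    """Return sitemap URLs that never appeared in the crawl."""
    root = ElementTree.fromstring(sitemap_xml)
    sitemap_urls = {loc.text.strip() for loc in root.iter(f"{SITEMAP_NS}loc")}
    return sitemap_urls - crawled_urls

sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.policygenius.com/blog/guaranteed-issue</loc></url>
  <url><loc>https://www.policygenius.com/blog/other-post</loc></url>
</urlset>"""

crawled = {"https://www.policygenius.com/blog/other-post"}
print(orphaned_urls(sitemap, crawled))
# -> {'https://www.policygenius.com/blog/guaranteed-issue'}
```

Anything this returns is a page Google can only discover via the sitemap, which is consistent with the slow indexing seen here.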
-
It depends on how accessible they are to search engines. If you've recently updated your sitemap, and those posts are on the new one, but weren't on the old one, that could cause it. New internal/external links pointing to those pages could have helped as well.
-
Thanks for the responses! Is there a reason why some of these pages were indexed so late (we published them several months ago)? We've had some newer posts index quicker than this one. I know there's not an exact reason, but trying to get some insights.
Thanks!
-
Hi Francois,
As Logan said, the problem comes when the whole site has a thin-content issue.
If I may ask: if that post is ranking well, why don't you edit it and create some more valuable content? Google loves it when you re-edit and improve your content.
Hope it helps.
GR. -
Hi,
As long as that's not the bulk of your content, I don't see it being a problem. Thin content penalties are more common when short-form content is the majority of a site. I've mostly seen this with ecommerce sites where product detail pages make up about 95% of the page count and the product descriptions are thin or non-existent. It's hard to be viewed as authoritative or trustworthy when only 5% of your pages have a decent amount of content.
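The "bulk of your content" test above can be sketched as a back-of-envelope calculation. Here `pages` is a hypothetical mapping of URL to body word count, and the 300-word threshold is an arbitrary illustration, not a rule Google publishes:

```python
def thin_content_share(word_counts: dict, threshold: int = 300) -> float:
    """Fraction of pages whose body copy falls under the threshold."""
    if not word_counts:
        return 0.0
    thin = sum(1 for words in word_counts.values() if words < threshold)
    return thin / len(word_counts)

pages = {
    "/blog/guaranteed-issue": 180,        # short-form post
    "/blog/life-insurance-guide": 2400,
    "/blog/term-vs-whole": 1500,
    "/blog/glossary/life-insurance": 90,  # glossary stub
}
print(f"{thin_content_share(pages):.0%} of pages are thin")
# -> 50% of pages are thin
```

A site like the ecommerce example above, with 95% of its pages under the threshold, is in a very different position from a blog where a handful of short posts sit among long-form content.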
The 13th of June seemed to have the biggest drop (UK Panda update???) When we copy and paste individual +20 word sentences from within top level content Google does bring up exact results, the content is indexed but the clients site nearly always appears at the bottom of SERP's. Even very new or small, 3-4 page domains that have clearly all copied all of their content are out ranking the original content on the clients site. As I'm sure know, this is very annoying for the client! And this even happens when Google’s cache date (that appears next to the results) for the clients content is clearly older then the other results! The only major activity was the client utilising Google optimiser which redirects traffic to various test pages. These tests finished in June. Details about the clients website: Domain has been around for 4+ years The website doesn't have a huge amount of content, around 40 pages. I would consider 50% original, 20% thin and 30% duplicate (working on fixing this) There haven’t been any signicant sitewide or page changes. Webmaster tools show nothing abnormal or any errors messages (some duplicate meta/title tags that are being fixed) All the pages of the site are indexed by Google Domain/page authority is above average for the niche (around 45 in for the domain in OSE) There are no ads of any kind on the site There are no special scripts or anything fancy that could cause problems I can't seem to figure it out, I know the site can be improved but such a severe drop where even very weak domains are out ranking suggests a penalty of some sort? Can anyone help me out here? hxuSn.jpg0