Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Should pages of old news articles be indexed?
-
My website publishes about three news articles a day and is set up so that old news articles can be accessed through a "back" button: as new articles push them down, older ones move to page 2, then page 3, then page 4, and so on. Each archive page includes a link to the article and a short snippet.
I was thinking I would want Google to index the first three pages of articles, but beyond that the pages are not worthwhile. Could these pages harm me, and should they be noindexed and/or given a canonical URL pointing to the main news page? Or is leaving them as-is fine because they are so deep in the site that Google won't see them, but I also won't be penalized for having weak content?
Thanks for the help!
-
Ah, I'm sorry, I misinterpreted you - so it's essentially about pagination? rel="next"/rel="prev" is probably the best way to go: the first page will be given the equity, and the pages won't have to compete with each other for ranking. Google have a pretty comprehensive guide: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1663744
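As a rough sketch, assuming a hypothetical example.com URL structure (none of these paths come from the original question), the annotations sit in the <head> of each paginated archive page and point at its neighbours:

<!-- In the <head> of page 2 of the news archive -->
<link rel="prev" href="http://www.example.com/news/" />
<link rel="next" href="http://www.example.com/news/page/3/" />

Page 1 carries only rel="next" and the final page only rel="prev". (Worth noting for anyone reading this archive later: Google has since said it no longer uses rel="next"/rel="prev" as an indexing signal, though the markup remains valid HTML.)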
-
Thanks Alice, but my question is about the pages the articles are linked from, not the actual articles themselves (which are 100% staying indexed).
-
Hi Sara,
If the articles are time-sensitive but high quality, I wouldn't noindex them. They could still have value in the future (for example, if a related story comes up, you can link back to the old article). You might also find ways to refresh or recycle them, such as adding a follow-up, updating the information, or promoting a really great post "From Our Archives". They could also be a good long-tail source of traffic for people looking for information on past news/events.
Google will still index old and outdated articles, but it's smart enough to recognize that these posts are dated, and it therefore won't assign big chunks of PageRank to them.
However, if the articles are low quality, I would take action to improve the ratio of good content to poor content. The ideal solution would be to improve the articles themselves, but that might not be feasible if you've been publishing three per day for an extended period of time. I would conduct a thorough audit to see what content can be saved or improved and what content should be deleted. I wouldn't bother with noindex or canonicals - if it's good content, leave it up and let it be indexed; if it's bad content that can't be saved, remove it.
Finally, if you are redirecting old articles, I would be careful about where they redirect to. Ideally you'd redirect from a low-quality article to a high-quality article on the same subject. A big increase in URLs redirecting to the main news page could raise a red flag, and it forces readers to hunt for the information they were originally after.
Good luck!
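For anyone weighing the canonical option raised in the original question, a minimal sketch of what that tag would look like, again using hypothetical example.com URLs:

<!-- In the <head> of a deep archive page such as http://www.example.com/news/page/7/ -->
<link rel="canonical" href="http://www.example.com/news/" />

A canonical is only a hint, though, and Google tends to ignore it when the pages aren't true duplicates - each paginated archive page lists different articles - which is one more reason not to bother with it here.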
-
The news articles themselves are not thin content, but the paginated archive pages are relatively thin because they consist only of a link plus a snippet for each article.
-
Are they all thin content? If not, then I don't think it's necessary to noindex them. If you think some of them don't have any real value, you could noindex those specific pages (rather than all of them together). Google will crawl those pages no matter how deep they are, as long as they are accessible.
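If you do decide to noindex specific archive pages, a minimal sketch of the tag, placed in the <head> of each page you want excluded (the "follow" part is the default behaviour, but spelling it out makes the intent explicit: drop the page from the index while still letting crawlers follow the article links on it):

<meta name="robots" content="noindex, follow" />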
Related Questions
-
Google Indexing Of Pages As HTTPS vs HTTP
We recently updated our site to be mobile-optimized. As part of the update, we had also planned on adding SSL security to the site. However, we use an iframe from a third-party vendor on a lot of our site pages for real estate listings, and that iframe was not SSL-friendly; the vendor does not have a solution for that yet. So those iframes weren't displaying their content. As a result, we had to shift gears and go back to plain http instead of the https we were hoping for. However, Google seems to have indexed a lot of our pages as https and gives a security error to any visitors. The new site was launched about a week ago, and there was code in the .htaccess file that was redirecting to www and https. I have fixed the .htaccess file so it no longer forces https. My question is: will Google "reindex" the site once it recognizes the new .htaccess rules in the next couple of weeks?
Intermediate & Advanced SEO | vikasnwu1
-
What are the best page titles for sub-domain pages?
Hi Moz community, Let's say a website has multiple sub-domains with hundreds or thousands of pages. Generally we will mention the "primary keyword" and "brand name" on every page of the website. Can we do the same on all pages of the sub-domains to increase the authority of the website for this primary keyword in Google? Or is it going to have a negative impact if Google considers it duplicate content, with the same keyword and brand name mentioned on every page of both the website and all the sub-domains? Thanks
Intermediate & Advanced SEO | vtmoz0
-
Pages are Indexed but not Cached by Google. Why?
Here's an example: I get a 404 error for this: http://webcache.googleusercontent.com/search?q=cache:http://www.qjamba.com/restaurants-coupons/ferguson/mo/all But a search for "qjamba restaurant coupons" gives a clear result, as does this: site:http://www.qjamba.com/restaurants-coupons/ferguson/mo/all What is going on? How can this page be indexed but not in the Google cache? I should make clear that the page is not showing up with any kind of error in Webmaster Tools, and Google has been crawling pages just fine. This particular page was fetched by Google yesterday with no problems, and was even crawled again twice today by Google. Yet, no cache.
Intermediate & Advanced SEO | friendoffood2
-
No-index pages with duplicate content?
Hello, I have an e-commerce website selling about 20,000 different products. For the most-used of those products, I created unique, high-quality content, written by a professional player, that describes how and why they are useful - which is of huge interest to buyers. It would cost too much to write that high-quality content for 20,000 different products, but we still have to sell them. Therefore, our idea was to noindex the products that only have the same copy-paste descriptions every other website has. Do you think it's better to do that, or to just let everything be indexed normally, since we might get search traffic from those pages? Thanks a lot for your help!
Intermediate & Advanced SEO | EndeR-0
-
Best way to get pages indexed fast?
Any suggestions on the best ways to get a new site's pages indexed? I was thinking of buying high-PR inbound links on Fiverr, but that's always a little risky, right? Thanks for your opinions.
Intermediate & Advanced SEO | mweidner27820
-
What do you do with outdated news and articles?
What do you guys do with your old content/news/articles? Do you just leave them on your site forever for historical reasons? It goes without saying that you wouldn't delete an article that has links pointing to it. But if there aren't any links, it doesn't rank and it doesn't receive traffic… do you just scrap it? How say you? Update: I would also like to throw in that I have a client who in 2006/2007 used content from another site. What would you do with that content after this amount of time? Bother with it?
Intermediate & Advanced SEO | BeTheBoss0
-
Tool to calculate the number of pages in Google's index?
When working with a very large site, are there any tools that will help you calculate the number of pages in the Google index? I know you can use site:www.domain.com to see all the pages indexed for a particular URL. But what if you want to see the number of pages indexed for 100 different subdirectories (i.e. www.domain.com/a, www.domain.com/b)? Is there a tool to help automate the process of finding the number of pages from each subdirectory in Google's index?
Intermediate & Advanced SEO | nicole.healthline0
-
Can an XML sitemap index point to other sitemap indexes?
We have a massive site that is having some issues being fully crawled due to parts of our site architecture and linking. Is it possible to have an XML sitemap index point to other sitemap indexes rather than to standalone XML sitemaps? Has anyone done this successfully? Based upon the description here - http://sitemaps.org/protocol.php#index - it seems like it should be possible. Thanks in advance for your help!
Intermediate & Advanced SEO | CareerBliss0