Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
What should be done with old news articles?
-
Hello,
We have a portal website that provides information about the industry we work in. The site includes various articles, tips, info, reviews and more about the industry. We also have a news section that was previously indexed in Google News, but has not been for the past few months.

The site was hit by Panda over a year ago, and one of the things we have been thinking of doing is removing pages that are irrelevant or do not provide added value to the site. Some of these pages are old news articles posted over 3-4 years ago that have had hardly any traffic.

All the news articles on the site are under an /archive/ folder sorted by month and year, so for example the URL for a news item from April 2010 would be /archive/042010/article-name

My question is: do you think removing such news articles would benefit the site and help it get out of Panda (many other things have been done on the site as well)? If not, what is the best suggested way to keep these articles on the site in a way that Google indexes them and treats them well?

Thanks
-
Basically I don't see a reason to remove old news articles from a site, as it makes sense to still have an archive present. The only reason I could think of to remove them is if they are duplicate versions of texts that were originally published somewhere else, or if the quality is really poor...
-
If the articles are good, then there just might be value to the user. Depending on the niche/industry, those old articles could be very important.
Google doesn't like such pages when you have a lot of impressions but no clicks (so essentially no traffic), or when the "score" is bad - not the Google Analytics bounce rate, but Google's own bounce rate, i.e. users bouncing back to the SERPs.
Since you got hit by Panda, in my opinion there are two options:
1. Noindex those old pages. Users can still get to them via navigation, site search etc., but Google won't index them. Google is fine with a site having content (old, poor, thin etc.) as long as it's not in the index. I work with a site that has several million pages, 80% of which are noindexed - everything is fine now (they were also hit by Panda).
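For reference, the standard way to do option 1 is a robots meta tag in each archive page's head (or the equivalent X-Robots-Tag HTTP header for non-HTML files) - a minimal sketch:

```html
<!-- In the <head> of each old archive page: keeps it out of the index
     while still letting crawlers follow its links -->
<meta name="robots" content="noindex, follow">
```

Note that the pages must remain crawlable for Google to see the tag and drop them from the index.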
2. Merge those pages into rich, cool, fresh topic pages (see the New York Times topic pages as an example - search for it; I believe there is also an SEOmoz Whiteboard Friday post about it). This is a good approach, and if you manage to merge those old pages with some new content you will be fine. Topic pages are a great anti-Panda tool!
If you merge the pages into topic pages, follow a simple flow:
1. Identify a group of pages that cover the same topic.
2. Identify the page with the highest authority of the group.
3. Turn that page into the topic page - keep the URL.
4. Merge the others into this page (based on your new topic page structure and flow).
5. 301 redirect the others to this one.
6. Build a separate XML sitemap with all of those pages, upload it to WMT, and monitor it.
7. Build some links to some of those landing pages and get at least a few social signals to them (depending on the number). Build an index-type page (a user-friendly one, or a few) listing those topic pages and use it as a target for link building to send the 'love'.
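Just to illustrate step 6, the separate XML sitemap for the merged topic pages can be generated with a few lines of Python. This is a minimal sketch: the domain and topic URLs are hypothetical placeholders, not from the question.

```python
# Rough sketch of step 6: build a standalone XML sitemap for the merged
# topic pages so their indexing can be monitored separately in WMT.
# The topic-page URLs below are hypothetical placeholders.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Return a sitemap XML string listing the given absolute URLs."""
    urlset = ET.Element("urlset",
                        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(urlset, encoding="unicode")

topic_pages = [
    "https://www.example.com/topics/industry-regulation",
    "https://www.example.com/topics/market-trends",
]
print(build_sitemap(topic_pages))
```

Save the output as something like topic-pages.xml, submit it in WMT, and watch the submitted-vs-indexed counts for that sitemap over time.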
Hope it helps - just some ideas.
-
I do think that any site should remove pages that are not valuable to users.
I would look for the articles that have external links pointing at them and 301 those to something relevant. The rest you could simply remove and let them return a 404 status. Just make sure all internal links pointing at them are removed - you don't want to lead people to a 404 page.
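On an Apache server, for instance, those 301s could be one-liners in .htaccess - a sketch using the /archive/ URL pattern from the question (the destination path here is a hypothetical placeholder):

```apache
# 301 an old archived article that still has external links
# pointing at it to the most relevant current page
Redirect 301 /archive/042010/article-name /industry-news/relevant-topic
```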
You could consider putting /archive/ in your robots.txt file if you think the pages have some value to users but not to the engines, or putting a noindex tag on each page in that section.
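A minimal robots.txt sketch for that first option - with the caveat that Disallow only blocks crawling, so pages already in the index may linger there, which is why the noindex tag is often the safer choice here:

```
User-agent: *
Disallow: /archive/
```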
If you want to keep the articles on the site, available to both Google and users, you have to make sure they meet these basic criteria:
- Mostly unique content
- Moderate length
- Good content-to-ad ratio
- Content is the focus of the page (top/center)
Related Questions
-
Fresh page versus old page climbing up the rankings.
Hello, I have noticed that if I publish a webpage that Google has never seen, it ranks right away, usually in a decent position to start with (not great, but decent) - usually top 30 to 50 - and then over the months it slowly climbs up the rankings. However, if my page has existed for, let's say, 3 years and I make changes to it, it takes much longer to climb up the rankings. Has anyone else noticed that too, and why is that?
Intermediate & Advanced SEO | seoanalytics
-
How old is 404 data from Google Search Console?
I was wondering how old the 404 data in Google Search Console actually is. Does anyone know over what kind of timespan a site's 404 data is compiled? How long do 404s tend to take to disappear from Google Search Console once they are fixed?
Intermediate & Advanced SEO | McTaggart
-
Magento: Should we disable old URL's or delete the page altogether
Our developer tells us that we have a lot of 404 pages being included in our sitemap, and the reason for this is that we have put 301 redirects on the old pages to new pages. We're using Magento, and our current process is to simply disable the old page, which then makes it a 404. We then redirect this page using a 301 redirect to a new, relevant page. The reason for redirecting these pages is that the old pages are still being indexed in Google. I understand 404 pages will eventually drop out of Google's index, but I was wondering if we were somehow preventing them from dropping out of the index by redirecting the URLs, causing the 404 pages to be added to the sitemap. My questions are: 1. Could we simply delete the entire unwanted page, so that it returns a 404 and drops out of Google's index altogether? 2. Because the 404 pages are in the sitemap, does this mean they will continue to be indexed by Google?
Intermediate & Advanced SEO | andyheath
-
Moving to a new site while keeping old site live
For reasons I won't get into here, I need to move most of my site to a new domain (DOMAIN B) while keeping every single current detail on the old domain (DOMAIN A) as it is. Meaning, there will be 2 live websites that have mostly the same content, but I want the content to appear to search engines as though it now belongs to DOMAIN B. Weird situation. I know. I've run around in circles trying to figure out the best course of action. What do you think is the best way of going about this? Do I simply point DOMAIN A's canonical tags to the copied content on DOMAIN B and call it good? Should I ask sites that link to DOMAIN A to change their links to DOMAIN B, or start fresh and cut my losses? Should I still file a change of address with GWT, even though I'm not going to 301 redirect anything?
Intermediate & Advanced SEO | kdaniels
-
Redirect old .net domain to new .com domain
I have a quick question that I think I know the answer to but I wanted to get some feedback to make sure or see if there's additional feedback. The long and short of it is that I'm working with a site that currently has a .net domain that they've been running for 6 years. They've recently bought a .com of the same name as well. So the question is: I think it's obviously preferable to keep the .net and just direct the .com to it. However, if they would prefer to have the .com domain, is 301'ing the .net to the .com going to lose a lot of the equity they've built up in the site over the past years? And are there any steps that would make such a move easier? Also, if you have any tips or insight just into a general transition of this nature it would be much appreciated. Thanks!
Intermediate & Advanced SEO | BrandLabs
-
Should pages of old news articles be indexed?
My website publishes about 3 news articles a day and is set up so that old news articles can be accessed through a "back" button, with articles going to page 2, then page 3, then page 4, etc., as new articles push them down. The pages include a link to each article and a short snippet. I was thinking I would want Google to index the first 3 pages of articles, but after that the pages are not worthwhile. Could these pages harm me, and should they be noindexed and/or given a canonical URL pointing to the main news page? Or is leaving them as-is fine because they are so deep in the site that Google won't see them - but then would I be penalized for having weak content? Thanks for the help!
Intermediate & Advanced SEO | theLotter
-
Old deleted sitemap still shown in webmaster tools
Hello, I have redesigned a website with a new URL structure in the CMS. The old sitemap was not set to 404 but was replaced with new sitemap files; the new sitemap was also named differently from the old one. All redirections were done properly. Still, 3 months later, Google shows me duplicate titles and metas comparing old and new URLs. I am at a loss as to what to do now to eliminate the error shown. How can Google show URLs that are no longer in the sitemap? Looking forward to any help. Michelles
Intermediate & Advanced SEO | Tit
-
Xml sitemap advice for website with over 100,000 articles
Hi, I have read numerous articles that support submitting multiple XML sitemaps for websites that have thousands of articles... in our case we have over 100,000. So I was thinking I should submit one sitemap for each news category. My question is: how many page levels should each sitemap instruct the spiders to go? Would it not be enough to just submit the top-level URL for each category and then let the spiders follow the rest of the links organically? If I have 12 categories, would the total number of URLs be just 12? If so, how do you suggest handling our home page, where the latest articles are displayed regardless of their category - i.e. the spiders will find links to a given article both on the home page and in the category it belongs to? We are using canonical tags. Thanks, Jarrett
Intermediate & Advanced SEO | jarrett.mackay