Sitemap Question - Should I exclude or make a separate sitemap for old URLs?
-
So basically, my website is very old... 1995 old. Extremely old content still shows up when people search for things that are outdated by 10-15+ years, and I decided not to put redirects on some of the irrelevant pages. People still hit those pages, but bounce...
I have about 400 pages that I don't want to delete or redirect. Many of them have old backlinks and hold some value, but they do interfere with my new, relevant content.
If I dropped these pages into a sitemap and set their priority to zero, would that possibly help? No redirects, the content is still valid for people looking for it, but maybe these old pages would stop showing up above my new content?
Currently the old stuff is excluded from all sitemaps. I don't want to make one and have it make the problem worse. Any advice is appreciated.
Thx
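For reference, a minimal sketch of what one of these priority-zero entries would look like in sitemap XML, built here with Python's standard library (the URL is hypothetical; per the sitemap protocol, priority ranges from 0.0 to 1.0 with a default of 0.5):

```python
# Build a single sitemap <url> entry with priority 0.0 for a hypothetical
# old page. This only illustrates the XML shape being discussed.
from xml.etree import ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)  # serialize without a namespace prefix

urlset = ET.Element(f"{{{NS}}}urlset")
url = ET.SubElement(urlset, f"{{{NS}}}url")
ET.SubElement(url, f"{{{NS}}}loc").text = "https://www.example.com/archive/1995/page.html"
ET.SubElement(url, f"{{{NS}}}priority").text = "0.0"

xml = ET.tostring(urlset, encoding="unicode")
print(xml)
```

Note that priority is only a hint to crawlers about relative importance within your own site; it doesn't demote a page in search results.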
-
Sending you a PM
-
You are welcome!
Still, make the most of that traffic during the move.
It's free traffic, so try to get the most out of it. Find the best way to point visitors in the direction you need them to go, always keeping an eye on being as friendly and natural as possible.
-
Good plan, actually; I appreciate it. I dev'd my own sitemap script, but I agree xml-sitemaps works great. I suggest it to friends and clients needing an easy solution.
Given the analytics... I didn't want to update roughly 400 pages. However, you handed me my resolution... I'll wrap the old pages with my up-to-date header/footer and just make some banners that direct traffic to the updated website.
Note: To make a basketball/shoe analogy... just assume I'm selling Nike shoes and traffic lands on my 1995, 1996, 1997, etc. Charles Barkley pages. I don't sell those shoes, and my query reports and analytics show people aren't searching for Barkley, but because of the age and trust of my pages, engines still point them there.
Anyway, I appreciate it a lot. I overcomplicated things this time!
-
I don't think messing with your sitemap will work. Google serves what it thinks is best for the user, even if that is old content.
You have several options here:
- Generate a full sitemap automatically, with priority assigned automatically, like the one produced by xml-sitemaps.com (incredible software, in my personal opinion, and well worth the money).
- Update the content on the pages you say are outdated. I think Google prefers serving pages that have real value over pages that are merely "new", so updating the content of those pages may decrease your bounce rate.
- On the old pages, link to the new posts that contain the updated info. You can even put something like "This content is outdated; for the up-to-date version, click here" and link to the most appropriate new page. You keep the page, avoid 301s, and pass some juice to the new page.
I think the best approach would be to use the 1st and 2nd options together, or the 1st and 3rd if the "old" pages contain something that updating them would cause to lose its value.
In any case, I wouldn't leave pages out of the sitemap. The software I mentioned assigns priority automatically based on how deep the page sits in your site (the number of links it had to follow to reach the page; older pages will surely take more clicks to reach).
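The "priority from crawl depth" idea can be sketched roughly like this. The exact formula xml-sitemaps.com uses isn't public, so the decay rate here is an assumption purely for illustration, and URL path depth is used as a stand-in for click depth:

```python
# Sketch: map a page's depth in the site to a sitemap priority.
# Assumption: path depth approximates click depth, and each level
# down loses 0.2 priority, floored at 0.1. Not the real formula.
from urllib.parse import urlparse

def priority_from_depth(url: str) -> float:
    """Return a sitemap priority that decreases with URL path depth."""
    path = urlparse(url).path
    depth = len([seg for seg in path.split("/") if seg])
    return round(max(1.0 - 0.2 * depth, 0.1), 1)

print(priority_from_depth("https://www.example.com/"))                   # homepage: 1.0
print(priority_from_depth("https://www.example.com/archive/1995/page"))  # three levels deep: 0.4
```

Deeply buried archive pages naturally bottom out at the floor value, which matches the intuition that old pages end up with the lowest priority without being excluded from the sitemap altogether.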
Hope that helps.