Sitemap Question - Should I exclude or make a separate sitemap for old URLs?
-
So basically, my website is very old... 1995 old. Extremely old content still shows up when people search for things that are outdated by 10-15+ years, and I decided not to put redirects on some of those irrelevant pages. People still hit the pages, but bounce...
I have about 400 pages that I don't want to delete or redirect. Many of them have old backlinks and hold some value, but they do interfere with my new, relevant content.
If I dropped these pages into a sitemap and set their priority to zero, would that possibly help? No redirects, the content is still valid for people looking for it, but maybe these old pages would stop showing up above my new content?
Currently the old stuff is excluded from all sitemaps... I don't want to make one and have it make the problem worse. Any advice is appreciated.
Thx
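For reference, a sitemap entry with the priority set to zero, as described above, would look something like this (the URL and dates are placeholders). Worth noting: `<priority>` is only a hint to crawlers, not a ranking directive, so this alone is unlikely to push old pages below new ones.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Hypothetical old page: kept indexable, flagged with the lowest priority hint -->
  <url>
    <loc>https://www.example.com/archive/1995-page.html</loc>
    <lastmod>1997-06-01</lastmod>
    <priority>0.0</priority>
  </url>
</urlset>
```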
-
Sending you a PM
-
You are welcome!
You still get that traffic in the meantime. It's free traffic, so try to make the most of it. Find the best way to point visitors in the direction you need them to go, always keeping an eye on being as friendly and natural as possible.
-
Good plan actually, I appreciate it. I dev'd my own sitemap script, but I agree xml-sitemaps works great; I suggest it to friends & clients needing an easy solution.
Given the analytics... I didn't want to update roughly 400 pages. However, you handed me my resolution... I'll wrap the old pages with my up-to-date header/footer & just make some banners that direct traffic to the updated site.
Note: To make a basketball/shoe analogy... just assume I'm selling Nike shoes & traffic lands on my 1995, 1996, 1997, etc. Charles Barkley pages. I don't sell those shoes anymore, and my query reports & analytics show people aren't searching for Barkley, but because of the age and trust of my pages, engines still point them there.
Anyway, I appreciate it a lot. I overcomplicated things this time!
-
I don't think messing with your sitemap will work. Google serves what it thinks is best for the user, even if that is old content.
You have several options here to go for:
- Generate a full sitemap automatically with a tool that also assigns priority automatically, like the one provided by xml-sitemaps.com (incredible software in my personal opinion, and well worth the money).
- Update the content on those pages you say are outdated. I think Google prefers serving pages that have real value over pages that are merely "new"; therefore, updating the content of those pages may decrease your bounce rate.
- On the old pages, link to the new posts that contain the updated info. You can even put something like "This content is outdated; for the up-to-date version, click here" and link to the most appropriate new page. You keep the page, avoid 301s, and pass some juice to the new page.
I think the best would be to use the 1st and 2nd options in conjunction, or the 1st and 3rd if updating the "old" pages would cause them to lose their value.
In any case, I wouldn't leave pages out of the sitemap. The software I mentioned automatically assigns priority according to how deep the page sits in your site (the number of links it had to follow to reach that page; older pages will surely need more clicks to reach them).
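A minimal sketch of that depth-based idea, assuming a simple halving rule (priority 1.0 for the homepage, halved per click of depth, floored at 0.1). The rule and the helper names here are illustrative assumptions; xml-sitemaps.com's actual formula isn't published.

```python
# Sketch: derive a sitemap <priority> hint from click depth.
# The halving rule is illustrative only; real generators use their own formulas.

def priority_for_depth(depth: int) -> float:
    """Depth 0 = homepage. Each extra click halves the priority, floored at 0.1."""
    return round(max(1.0 / (2 ** depth), 0.1), 2)

def sitemap_entry(url: str, depth: int) -> str:
    """Render one <url> element for the sitemap, using the depth-based priority."""
    return (
        "  <url>\n"
        f"    <loc>{url}</loc>\n"
        f"    <priority>{priority_for_depth(depth)}</priority>\n"
        "  </url>"
    )

if __name__ == "__main__":
    # Homepage gets the highest hint; an old, deeply buried page gets the floor.
    print(sitemap_entry("https://www.example.com/", 0))
    print(sitemap_entry("https://www.example.com/archive/1995-page.html", 4))
```

With this rule an old page four clicks deep bottoms out at 0.1, which matches the intuition that buried legacy pages get the lowest hint without being dropped from the sitemap.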
Hope that helps.