Better to redirect to a completely unrelated page, or 404?
-
We have about 1,000 pages we need to eliminate from our site (of about 18,000 URLs). These URLs don't see a ton of traffic, but may have some valuable links.
Would we be better off 404ing these or redirecting them to our homepage? Could redirecting to our homepage hurt us?
-
Why do you think it would be better to space it out?
-
Personally, I would 404 these pages.
But first, I would spend a good amount of time looking for incoming links. Perhaps take a random 100 of your 1,000 pages and run a backlink check on all of them. If nobody is linking to them, they're not generating traffic, and you don't have a new page to replace them, then just 404 them.
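That sampling step is easy to script. Here's a minimal sketch of pulling a random 100 URLs from the removal list to feed into a backlink checker; the `urls_to_remove` list and the example.com URLs are placeholders for whatever export you have from your CMS.

```python
import random

# Hypothetical list of the ~1,000 URLs slated for removal,
# e.g. loaded from a CSV export of your CMS or crawler.
urls_to_remove = [f"https://example.com/page-{i}" for i in range(1000)]

# Draw a random sample of 100 to run through your backlink tool.
random.seed(42)  # fixed seed so the audit sample is reproducible
sample = random.sample(urls_to_remove, 100)

for url in sample:
    print(url)
```

Paste the printed URLs into whatever backlink tool you use; if the sample comes back clean, it's reasonable evidence the rest of the list is clean too.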
However, if you find out that, magically, these pages have lots of links, then you'll want to put on your thinking cap. Might it be worth creating a new page to point those links to? Or are you better off just reaching out to those folks and asking them to link to a different page on your site?
Also, on a semi-related note, I wouldn't 404 all 1,000 pages at the same time. Do a few hundred at a time and see if anything happens. If there's no huge rush, it's better to space it out.
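Splitting the rollout into batches can be sketched like this; the list, the batch size of 250, and the one-batch-per-week cadence are just illustrative assumptions, not a prescribed schedule.

```python
# Hypothetical full list of ~1,000 URLs slated for removal.
urls_to_remove = [f"https://example.com/page-{i}" for i in range(1000)]

BATCH_SIZE = 250  # "a few hundred at a time"

# Slice the list into consecutive batches.
batches = [
    urls_to_remove[i:i + BATCH_SIZE]
    for i in range(0, len(urls_to_remove), BATCH_SIZE)
]

# 404 one batch, then wait and watch traffic/rankings
# before moving on to the next.
for week, batch in enumerate(batches, start=1):
    print(f"Week {week}: remove {len(batch)} URLs")
```

If rankings or traffic dip after a batch, you can pause and investigate before the remaining pages are touched.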