Redirecting thin-content city pages to the state page: 404s or 301s?
-
I have a large number of thin-content city-level pages (possibly 20,000+) that I recently removed from a site. Currently, I have it set up to send a 404 header when any of these removed city-level pages are accessed. But I'm not sending the visitor (or search engine) to a site-wide 404 page. Instead, I'm using PHP to redirect the visitor to the corresponding state-level page for that removed city-level page.
Something like:
if ($cityPageRemoved) { // placeholder for however a removed city page is detected
    header("HTTP/1.0 404 Not Found");
    header("Location: http://example.com/state-level-page");
    exit();
}

Is it problematic to send a 404 header and still redirect to a category-level page like this? By doing this, I'm sending any visitors to removed pages to the next most relevant page. Does it make more sense to 301 all the removed city-level pages to the state-level page?
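For comparison, I assume the 301 version would look something like this (same placeholder condition; in a real implementation the target URL would be looked up as the matching state page for each removed city, not hard-coded):

if ($cityPageRemoved) {
    header("HTTP/1.0 301 Moved Permanently");
    header("Location: http://example.com/state-level-page"); // matching state page for the removed city
    exit();
}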
Also, these removed city-level pages collectively have few to no inbound links from other sites. I suspect that any inbound links to these removed pages are from low-quality scraper-type sites anyway.
Thanks in advance!
-
Hello BarrelRoll42,
You should easily be able to find out if Google is indexing them by doing a site:yourdomain.com search on Google. But to answer your question, it sounds like you should probably delete them and let them 404. If Google HAS indexed them, you may also need to use the URL Removal Tool in Google Webmaster Tools.
One last thing. Please do start a thread for your own question next time, as we try to keep it to one question per thread.
Thanks!
-
I'm dealing with a similar situation: thousands of low-content city pages. There is almost zero traffic to these pages and almost no links pointing at them, and no human would ever navigate to them. In this case, would it be best to just delete them? Do they need to return a 404? I'm not sure if Google is even indexing them.
-
Hi Daniel,
I am very happy I could be of help to you.
Sincerely,
Thomas
-
Thanks, I've removed the redirects. I appreciate the advice!
-
Hi Daniel,
When setting up a 404 page, you should have it return a 404 status code, never a 200, and make sure nothing else happens on that page, such as redirecting the visitor somewhere else.
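Something along these lines should do it (just a sketch, reusing the placeholder condition from your question; "404.php" stands in for whatever your actual error template is):

if ($cityPageRemoved) {
    // Return a plain 404 with no Location header, so the visitor is never redirected
    header("HTTP/1.0 404 Not Found");
    include "404.php"; // hypothetical error template - substitute your own
    exit();
}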
To answer your question directly, I would eliminate the redirect. I hope this has been of help,
Thomas