301s Creating Soft 404s in GWT
-
Hi,
We re-did a section of a site and got rid of hundreds of pages of no-longer-relevant content. We 301'd the URLs to the category homepage. Now GWT is flagging these as soft 404s.
a) Should we have done something differently instead of 301ing?
b) Are these hundreds of soft 404 errors a big problem or threat to how Google sees us for SEO?
c) Should we correct this in some way?
Thanks... Darcy
-
Can you provide a few example URLs showing this error? I'll take a closer look and give you a better response.
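(Follow-up note) Google typically reports a soft 404 when many removed URLs all 301 to a single generic page such as a category homepage, because the redirect target isn't an equivalent of the old page. The two usual remedies can be sketched as hypothetical Apache .htaccess rules; all paths below are made-up placeholders, not URLs from this site:

```apache
# Option 1: redirect each removed page to its closest genuine
# equivalent rather than a generic category homepage.
# (Both paths are hypothetical examples.)
Redirect 301 /old-guides/blue-widget-guide /guides/blue-widget

# Option 2: if no equivalent page exists, say so honestly with
# 410 Gone so Google drops the URL from its index.
Redirect gone /old-guides/retired-topic
```

Either response is acceptable to Google; the soft-404 flag mainly signals that the current redirect target looks irrelevant to the old URL.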
Related Questions
-
Creating Individual Posts w/ Excerpts from Expert Columns On Your Website
Hey Moz Fam! I have a question about adding excerpts from guest posts/expert columns to your website. The reason this came up is that we want a way to track engagement on ALL our social media posts. When we share links to guest posts, we obviously cannot track when someone clicks, since the click doesn't go to our own website. The idea was to create an individual page with an excerpt from the guest post plus a "read full post" button linking back to it; we could then share that excerpt page from our website to social media so we can track it, and the user could go on to the full guest article from there if they wish. We wanted to create a page titled "Media" and have all those excerpt posts categorized under it, then canonicalize all of the individual posts to the top-level Media page. My question: we've spent a lot of time and money creating pages with unique, quality content and beefing up any thin content. These new excerpt pages will be thin content; is that going to negatively impact our SEO? We're not trying to rank these excerpt pages or anything. I just want to make sure that adding these excerpt posts of only 100-200 words each is not going to hurt our overall SEO strategy.
Intermediate & Advanced SEO | LindsayE
-
How to Handle a Soft 404 error to an admin page in WordPress
I'm seeing this error in Google Webmaster Console for the URL http://www.awlwildlife.com/wp-admin/admin-ajax.php. Error details: Last crawled 11/15/16; first detected 11/15/16. "The target URL doesn't exist, but your server is not returning a 404 (file not found) error. Your server returns a code other than 404 or 410 for a non-existent page (or redirects users to another page, such as the homepage, instead of returning a 404). This creates a poor experience for searchers and search engines." Any ideas what I should do about it? Thanks!
Intermediate & Advanced SEO | aj613
-
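(Follow-up note) For context: admin-ajax.php answers unrecognized GET requests with an HTTP 200 and a body of just `0`, which is exactly the pattern Google labels a soft 404, so this report is usually safe to mark as fixed and ignore. Note that WordPress's own default virtual robots.txt deliberately keeps this one file crawlable while blocking the rest of /wp-admin/, since front-end AJAX depends on it; for reference, that default looks like:

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
```

Blocking admin-ajax.php outright can break Google's rendering of pages whose themes call it, so leaving the default in place is generally the safer choice.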
GWT does not play nice with 410 status code approach to expire content? Use 301s?
We have been diligently managing our index size in Google for our sites, returning a 410 status code for pages that we no longer consider "up-to-date" (but that users can still access) so that Google removes them from our index and keeps it lean. However, we have been receiving GWT warnings across sites because of the 410 status codes Google encounters, which makes us nervous that Google could read this approach as a sign of low site quality. Does anyone have a view on whether 410 is the right approach for this case, or whether we should consider simply using 301s or another status code to keep our GWT errors clean? Further notes: there is hardly ever any link juice pointed at those pages, so we are not losing much there, and the pages for which we return 410 are also marked noindex and nofollow.
Intermediate & Advanced SEO | petersocapro
-
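(Follow-up note) For what it's worth, 410 is a legitimate, Google-supported way to say a page is intentionally gone; GWT lists 410s in the crawl-error report simply because it surfaces every non-200 response it encounters, and Google has repeatedly said that 404/410s for deliberately removed content don't harm the rest of a site. A minimal .htaccess sketch of the approach, assuming Apache with mod_alias (the paths are made-up placeholders):

```apache
# Hypothetical sketch: retire a whole section with 410 Gone.
# The /expired-content/ path is an illustrative placeholder.
RedirectMatch gone ^/expired-content/.*$

# Individual pages can also be retired one at a time:
Redirect gone /articles/2012-holiday-roundup
```

Switching to 301s would only move the noise elsewhere: mass redirects of expired pages to an unrelated target tend to come back as soft-404 warnings instead.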
Random post plugin creates 302 redirects. What should I do?
Just started work on a great MMA news site. In their footer, they have a plugin for random posts, which creates URL strings with '?random=1' on the end and then 302-redirects to a random article on the site. I know the SEO-friendly protocol for redirects is to use 301 rather than any other 3xx. However, I don't really see the need for 301s here, because the targets are random! That said, I also don't want to leave thousands of errors that can hinder the 'crawlability' (don't judge me, that's a word :)) of my client's site. My thought right now is to noindex the URLs with '?random=1' in the string, so the spider doesn't worry about crawling those links. Not sure if that is the proper approach, but it seems quick and effective. Is there a better way to attack this? If you know, please share with me! WP publishers who use random post plugins: have you experienced this? How did you fix it within the friendly confines of WordPress?
Intermediate & Advanced SEO | Netrepid
-
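(Follow-up note) If it helps, the noindex idea can be implemented at the server level without touching the plugin, by attaching an X-Robots-Tag header to any response whose query string contains random=1. A hedged sketch, assuming Apache 2.4+ with mod_headers enabled (the expression syntax below requires 2.4):

```apache
# Hypothetical sketch: mark ?random=1 URLs noindex/nofollow via a
# response header, so crawlers don't index the redirecting URLs.
# Requires Apache 2.4+ and mod_headers.
<If "%{QUERY_STRING} =~ /(^|&)random=1/">
    Header set X-Robots-Tag "noindex, nofollow"
</If>
```

A simpler alternative is a robots.txt rule such as `Disallow: /*?random=1` (wildcard support is a Google/Bing convention rather than part of the original standard), which stops the URLs being crawled at all rather than crawled-then-dropped.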
Manual action penalty revoked, rankings still low, if we create a new site can we use the old content?
Scenario: A website that we manage was hit with a manual action penalty for unnatural incoming links (site-wide). The penalty was revoked in early March, and we're still not seeing any of our main keywords rank high in Google (we are found on page 10 and beyond). Our traffic metrics from March 2014 (after the penalty was revoked) through July 2014 were very similar to those from November 2013 through March 2014. Question: Since the website was hit with a manual action penalty for unnatural links, is the content affected as well? If we were to move the current website to a new domain name (without 301-redirecting the old pages), would Google see it as a brand-new website? We think it would be best to use brand-new content, but the financial cost is a large factor in the decision. We would prefer to reuse the old content, but has it already been tarnished?
Intermediate & Advanced SEO | peteboyd
-
Targeting local areas without creating landing pages for each town
I have a large ecommerce website structured very much for SEO as it existed a few years ago, with a landing page for every product/town nationwide (it's a lot of pages). Then along came Panda... I began shrinking the site in Feb last year in an effort to tackle duplicate content. We had initially used a template, changing only the product/town name. My first change was to cut the page count in half by merging the top two categories, as they are semantically similar enough not to need their own pages. This worked a treat: traffic didn't drop at all, and the remaining pages are bringing in the desired search terms for both products. Next I rewrote the content for every product to make each as individual as possible. However, with 46 products, each generating a product/area page, we still have a heap of duplicate content. Now I want to reduce the town pages too. I have already started writing content for my most important areas, again to make these pages as individual as possible. The problem is that nobody can write enough unique content to target every town in the UK via an individual page (times 46 products), so I want to reduce these as well. QUESTION: If I have a single page for "Croydon", will mentioning other local surrounding areas on this page, such as Mitcham, be enough to rank this page for both towns? I have approx 25 Google local place/map listings and growing, and am working from these areas outwards. I want to bring the site right down to about 150 main area pages to tackle all the duplicate content, but obviously don't want to lose my traffic for so many areas at once. Any examples of big sites that have reduced in size since Panda would be great. I have a headache... Thanks community.
Intermediate & Advanced SEO | Silkstream
How do I create a strategy to get rid of dupe content pages but still keep the SEO juice?
We have about 30,000 pages that are variations of "[product-type] prices / [type-of-thing] / [city] [state]". These pages are bringing us lots of free conversions, because when somebody searches this exact phrase for their city/state, they are pretty low-funnel. The problem we are running into is that the pages are showing up as duplicate content. One solution we were discussing is to 301-redirect or canonical all the city-state pages back to just the "[type-of-thing]" level, and then create really solid unique content for the few hundred pages we would have at that point. My concern is this: I still want to rank for the city-state, because as I look through our best-converting search terms, they nearly always include the city-state, so the search is some variation of "[product-type] [type-of-thing] [city] [state]". One thing we thought about doing is dynamically changing the meta data and headers to add the city-state info there. Are there other potential solutions to this?
Intermediate & Advanced SEO | editabletext
-
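(Follow-up note) One mechanism for the canonical option that doesn't require editing 30,000 templates: send rel="canonical" as an HTTP Link header from the server. A hedged Apache sketch assuming mod_headers; the URL pattern and target are invented placeholders, since the real structure wasn't shared:

```apache
# Hypothetical sketch: point every city-state variant under one
# type-of-thing section at that section's main page.
# Pattern and target are illustrative placeholders only.
<LocationMatch "^/widget-prices/installation/.+">
    Header set Link "</widget-prices/installation/>; rel=\"canonical\""
</LocationMatch>
```

A canonical keeps the city-state URLs resolvable for visitors while consolidating ranking signals, but note the trade-off the question is already weighing: canonicalized pages generally stop ranking for their own city-state terms.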
Sites with dynamic content - GWT redirects and deletions
We have a site with extremely dynamic content. Every day they publish around 15 news flashes, each set up as a distinct page with around 500 words. The file structure is bluewidget.com/news/long-news-article-name, with no timestamp in the URL. After a year, that's a lot of news flashes. The database was getting inefficient (it's managed by a ColdFusion CMS), so we started automatically deleting old news flashes from the database, which sped things up. The problem is that Google Webmaster Tools is detecting the freshly deleted pages and reporting large numbers of 404s. There are so many that it's hard to see the non-news 404s, and I understand that many missing pages could be a negative quality signal to Google. We were toying with setting up redirects, but the volume would be so large that loading a huge htaccess file for each request would slow the site down again. Because there isn't a datestamp in the URL, we couldn't create a mask in the htaccess file automatically redirecting all bluewidget.com/news/yymm* to bluewidget.com/news. These long-tail pages do send traffic, but for speed we only want to keep the last month of news flashes at most. What would you do to avoid Google thinking it's a poorly maintained site?
Intermediate & Advanced SEO | ozgeekmum
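(Follow-up note) Since the URLs carry no datestamp, a per-URL redirect map is indeed the slow path, but mod_rewrite can cover the whole section with a single pattern rule that returns 410 Gone instead of 404, signalling intentional removal. A hedged sketch; the file-existence test only works where articles exist as files on disk, so for a CMS-routed ColdFusion site the equivalent "does this article still exist" check would need to live in the application instead:

```apache
# Hypothetical sketch: answer any deleted news flash under /news/
# with 410 Gone ([G] flag) rather than a plain 404.
# RewriteCond limits the rule to requests with no matching file;
# on a CMS-routed site, do this check in the application layer.
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^news/.+$ - [G]
```

Google has said that large numbers of 404/410s for genuinely removed pages are not treated as a site-quality problem, so the cheapest option may simply be letting them stand and filtering the GWT report, while keeping the last month of flashes live as planned.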