Do I need to disallow the dynamic pages in robots.txt?
-
Do I need to disallow the dynamic pages that show when people use our site's search box? Some of these pages are ranking well in SERPs. Thanks!
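For context, the kind of rule being considered would look something like this in robots.txt (the /search path and the q parameter are assumptions; substitute whatever URL pattern your site's search results actually use):

```
User-agent: *
# Hypothetical patterns -- match them to your real search-result URLs
Disallow: /search
Disallow: /*?q=
```

Note that Google supports the `*` wildcard and `$` end-anchor in robots.txt, but a Disallow only stops crawling; pages already indexed can linger in the SERPs for a while.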
-
The pages producing the soft 404 errors don't show any products at all, because visitors searched for products that aren't available.
-
Yes, done that.
-
Just had a quick look at what Google says about them:
Here’s a list of steps to correct soft 404s to help both Google and your users:
- Check whether you have soft 404s listed in Webmaster Tools
- For the soft 404s, determine whether the URL:
- Contains the correct content and properly returns a 200 response (not actually a soft 404)
- Should 301 redirect to a more accurate URL
- Doesn’t exist and should return a 404 or 410 response
- Confirm that you’ve configured the proper HTTP Response by using Fetch as Googlebot in Webmaster Tools
- If you now return 404s, you may want to customize your 404 page to aid your users. Our custom 404 widget can help.
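The decision those steps describe can be sketched roughly like this (the function name and inputs are mine, not from Google's docs; you'd feed it whatever you know about each flagged URL):

```python
def correct_response(has_content: bool, has_better_url: bool) -> int:
    """Return the HTTP status a flagged URL should serve, per the steps above."""
    if has_content:
        # Real content: a 200 is correct, and it's not actually a soft 404.
        return 200
    if has_better_url:
        # A more accurate URL exists: 301 redirect to it.
        return 301
    # Nothing to show: return a hard 404 (or 410 if permanently gone).
    return 404
```

Then confirm the live response with Fetch as Googlebot, as the last step says.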
Have you followed these steps?
Andy
-
These soft 404s return a 200 status code. We've already improved these pages for when someone searches for a product that isn't on our list, but the dynamic pages are still flagged as soft 404s in Google Webmaster Tools.
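That 200-on-empty-results behavior is exactly what Google classifies as a soft 404. A framework-agnostic sketch of the fix (function names are hypothetical; the point is that the status code, not just the page copy, changes when there are no results):

```python
def render_search(results):
    """Return (status_line, body) for a search results page."""
    if not results:
        # Still serve a friendly "no products" page, but with a real 404
        # status so crawlers stop flagging the URL as a soft 404.
        return "404 Not Found", "<h1>No matching products</h1>"
    items = "".join(f"<li>{name}</li>" for name in results)
    return "200 OK", f"<ul>{items}</ul>"
```

Pages that do return matching products keep their 200 and stay eligible to rank.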
-
Well, I would try to fix why they are returning soft 404s, as it would be a shame to block all results. Is this something you can do? Or is there a reason why just blocking is preferred?
-
Yeah, some of them produce soft 404s since there's no content at all, but some of the dynamic pages that rank well do show content.
Thanks,
JC
-
OK, so when you search, you get back dynamic pages that are producing soft 404s, but you still see those pages in the SERPs?
Just want to make sure I have this right.
-
I agree with Andy. Many of our search result pages rank well (and actually convert quite well). I don't think you need to disallow them unless the content doesn't exist, and even then you may want to keep them up because you may offer complementary products, etc.
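If you want a middle ground, you could keep the URLs crawlable but emit a robots meta tag only on zero-result pages (a sketch, assuming you can detect empty result sets server-side; the tag goes in the page head):

```
<!-- emitted only on zero-result search pages -->
<meta name="robots" content="noindex, follow">
```

That way the ranking, converting result pages stay indexed while the empty ones drop out, without a blanket robots.txt block.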
-
The reason we want to block those pages is that they produce soft 404 errors. What should we do? Thanks, Andy.
-
If they are ranking well, what is the reason for wanting to block them?
Andy