Search Pages outranking Product Pages
-
A lot of the results we see in search engines for our site are our own internal search results pages, e.g. "Widgets | Search Results".
This has happened over time and wasn't intentional, but in many cases our search results pages appear above our actual product pages in search, which isn't ideal.
Simply blocking indexing of these pages via robots.txt wouldn't be ideal, at least not all at once, as there would be a period where those search results pages were gone from the index while our product pages were still far down the rankings.
Any ideas on a strategy to replace these search results pages with the actual products in a way that won't hurt us too badly during the transition? Or a way to make the actual product pages rank above the search results? Currently, it is often the opposite.
Thanks!
Craig
-
Thanks again for the answers!
Yeah, totally getting you on the search-within-search issue. Wish we had known about that a couple of years ago. Did an analytics check, and most of our non-home-page traffic is coming from search results pages in the SERPs. According to an inurl: query, we have about 200,000 indexed SearchResult pages, and based on some data I pulled, they are our highest-traffic pages after the home page, but also the lowest converting.
I think 301 redirects on these would be rather tricky. I mean, if someone does a search on our site, they should get the search results page showing them several options, not be shot directly to a single product which might not be the one they need. It would be rather confusing for our regular customers as well.
But I agree we need to do something here, because conversely, our product pages, while getting the least traffic, are the highest converters.
My only thought is that we would need to:
1. Find a list of all of the indexed search result pages, or at least the ones that have been hit over the last year or so. What would be the best way to do that? Screaming Frog? Analytics?
2. Create a script that analyzes these for the keywords used in them and finds a suitable item to redirect to based on the extracted keyword.
3. 301 redirect them.
4. Change our current search results URLs to include something that distinguishes them from the old pages being redirected, so that current searchers don't get redirected as well.
5. Set the search results pages to noindex. Is that the best way to handle it? If we used robots.txt, we would be breaking the link flow of the site, wouldn't we? Don't we need the bots to crawl the search pages to reach the product pages, or is the sitemap all that is needed?
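Steps 1–3 above could be sketched roughly like this. To be clear, this is a hedged example, not your actual setup: the `Text=` parameter is borrowed from the example URLs in this thread, and the `PRODUCTS` lookup table stands in for whatever product-catalog query you would really use.

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical lookup table: search keyword -> canonical product URL.
# In practice this mapping would come from your product database.
PRODUCTS = {
    "monkeys ate soul": "/ProductPage.html?Title=TheMonkeysAteMySoul",
}

def extract_keyword(search_url):
    """Pull the search phrase out of an old SearchResult URL."""
    query = parse_qs(urlparse(search_url).query)
    # parse_qs already decodes '+' into spaces
    return query.get("Text", [""])[0].strip().lower()

def build_redirect_map(indexed_urls):
    """Map each indexed search URL to a product page (step 2),
    ready to be emitted as 301 rules (step 3)."""
    redirects = {}
    for url in indexed_urls:
        target = PRODUCTS.get(extract_keyword(url))
        if target:  # only redirect when a suitable product exists
            redirects[url] = target
    return redirects
```

URLs with no good product match simply stay out of the map, so they could be handled separately (e.g. noindexed) rather than redirected somewhere misleading.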
Thanks for the time and answers!
Craig
-
Hello Craig,
I've dealt with this issue on several client sites and typically opt for noindexing the search pages (sometimes even blocking them in robots.txt), as recommended by others here, especially if you can't make any of them static.
In terms of the product pages, if a visitor searches for "Specific Product A", it could be helpful to just go ahead and land them on the "Specific Product A" page, either via a 301 redirect from the search results page or by serving up the product page in the first place. This would take care of usability as well as your issue with search engines.
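A minimal sketch of that idea, with the caveat that `EXACT_MATCHES` is a hypothetical exact-match table you'd build from your catalog, and the product URL here is invented for illustration:

```python
# If a site search matches one product exactly, send the visitor
# straight there with a 301; otherwise render the results page.
EXACT_MATCHES = {
    "specific product a": "/products/specific-product-a",
}

def handle_search(query):
    """Return (status, location_or_template) for a site-search query."""
    target = EXACT_MATCHES.get(query.strip().lower())
    if target:
        return ("301 Moved Permanently", target)
    return ("200 OK", "search_results_template")
```

How you wire this into your stack (redirect vs. serving the product page directly) is an implementation choice; the point is that an unambiguous query never lands on an intermediate results page.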
I would not gradually implement something here, as that could be even more confusing to search engines. Do you want the search pages indexed or not?
What I have seen is a temporary blip in traffic (a few weeks at most) followed by improvement in traffic due to an improvement in product page rankings as a result. Every situation is different though, and this assumes good implementation.
Looking at this from Google's perspective, understand that they ARE the search engine, so why would they want to send the user to yet another set of search results? Google should know which page on your site to send visitors to. They don't need an intermediary, which is why their guidelines say this:
"Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines."
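A robots.txt rule like that can be sanity-checked before it goes live with Python's standard-library parser. A minimal sketch, assuming the `/SearchResult.html` path pattern from the example URLs in this thread (adjust it to your actual URLs):

```python
import urllib.robotparser

# Hypothetical robots.txt rule blocking the search results pages
# while leaving product pages crawlable.
rules = """\
User-agent: *
Disallow: /SearchResult.html
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Search pages should be blocked; product pages should remain fetchable.
search_blocked = not rp.can_fetch("*", "http://oursite.com/SearchResult.html?Text=Monkeys+Ate+Soul")
product_allowed = rp.can_fetch("*", "http://oursite.com/ProductPage.html?Title=TheMonkeysAteMySoul")
```

Checking a handful of representative URLs this way is cheap insurance against a Disallow pattern that accidentally blocks the product pages too.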
Good luck and let us know how it turns out!
-
Oh, and also, just to clarify... are you saying what we should do is 301:
http://oursite.com/SearchResult.html?Text=Monkeys+Ate+Soul
to, let's say
http://oursite.com/ProductPage.html?Title=TheMonkeysAteMySoul
That would be ok?
Thanks!
-
Thanks Jesse. Sounds like a big undertaking, but something we need to move on. Question... how accurate is "site:yoururl.com inurl:search"? I just did a test, and the number of results that came back is way lower than it should be based on how our sitemaps are shown as indexed in Webmaster Tools.
Thanks for taking the time to answer!
Craig
-
I would be careful about allowing search pages to continue to be indexed. You will most likely end up with hundreds if not thousands of low-value pages that may cause you to fall into a Panda algorithm penalty. Simply do a site:yoururl.com inurl:search (or whatever parameter you use) to see how many search results pages you have indexed.
You could find the search pages that are outranking your product pages and 301 them if the traffic is substantial. Otherwise, I would say that by noindexing the search pages, you should reduce the competition for those product pages, and they should start to rank and hopefully convert better.
I've had to do the same for several sites because of a Panda penalty, so I can't speculate on traffic levels.
-
Thanks Zora! Yeah, these are all going to be dynamic, unfortunately, and there are a lot of them, in the hundreds of thousands. So we would need some type of transition strategy. I would be concerned that a one-time noindex all at once would be quite problematic.
Just curious if anyone else had to transition in this way and was able to do so successfully.
Thanks for the feedback!!
Craig
-
We had the same problem, but decided to embrace it.
I started optimizing and adding content to a few of the search results pages (and made them static, not dynamic) and now they rank fairly well.
However, for dynamic search pages I suggest you noindex them.
Google recommends it, and it's best to follow their recommendations.
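For anyone implementing the noindex, the two standard mechanisms are a robots meta tag in the page head or an `X-Robots-Tag` response header (the header form is often easier for dynamic pages). A minimal sketch, where the `/SearchResult` URL prefix is an assumption based on the example URLs in this thread:

```python
# Meta-tag form, for templates you can edit directly:
NOINDEX_META = '<meta name="robots" content="noindex, follow">'

def response_headers(path):
    """Attach X-Robots-Tag: noindex to dynamic search results responses,
    leaving all other pages indexable."""
    headers = [("Content-Type", "text/html")]
    if path.startswith("/SearchResult"):
        headers.append(("X-Robots-Tag", "noindex, follow"))
    return headers
```

Using "noindex, follow" rather than plain "noindex" keeps the crawl path through the search pages to the products intact, which addresses the link-flow concern raised earlier in the thread.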