Help! Optimizing dynamic internal search results pages...
-
Hi guys,
I have always been against this, and opted to noindex internal search results pages to stop the waste of link equity, duplicate content, and crawl loops. However, I'm in a discussion with somebody who feels there may be a solution, and that the pages could actually be optimized to rank (for different keywords than the landing pages, of course).
Has anybody come across such a thing before?
My only solution would still be to noindex and then build static pages around the most popular searches, but that won't suffice in this case.
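For reference, the usual way to keep internal search results out of the index while still letting crawlers follow the links on them (so link equity isn't wasted) is a robots meta tag with "noindex, follow". A minimal sketch of the decision logic, assuming a hypothetical `/search` URL pattern — substitute whatever pattern your site actually uses:

```python
from urllib.parse import urlparse

def robots_directive(url: str) -> str:
    """Decide which robots meta directive an internal URL should carry.

    "/search" is a placeholder pattern, not a universal convention.
    """
    path = urlparse(url).path
    if path.startswith("/search"):
        # noindex keeps the page out of the index; follow still lets
        # crawlers pass link equity through to the linked product pages.
        return "noindex, follow"
    return "index, follow"

print(robots_directive("https://example.com/search?q=red+shoes"))
# -> noindex, follow
```

The directive would then be emitted as `<meta name="robots" content="...">` in the page head.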
Any recommendations would be much appreciated.
Thanks,
Steve
-
Hi Steve,
Sorry for responding so late.
Whenever we have a situation where many dynamic pages need to be optimized, I always come back to the idea of faceted navigation. This way you give search engines one path to crawl to each product (but be aware that you will need to invest time and energy in a well-planned, structured navigation path).
The positives of faceted navigation:
- it resolves the duplicate content flags on your products by giving search engines a single, clearly specified path to crawl
- it improves the user experience, since visitors can get to the product they want quickly and easily
The negatives:
- Are there any?
- Ok, seriously speaking: you will need to sit down and think carefully about which paths you choose and how to implement them efficiently, and the web dev team will need a few days or weeks of work (depending on their other tasks, those tasks' priority, etc.)
What it will bring to your site (relative to what you have now): currently you have noindex on the "category" page (basically a search page with a filter on it, right?). With faceted navigation, this structure will be complemented by a "tree" structure leading to your final products along the specific paths you have chosen.
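The "one path to each product" idea above can be sketched as a URL-normalization rule: no matter what order a visitor applies filters in, every facet combination collapses to a single canonical URL in the tree. This is a hypothetical illustration — the facet order (category, then brand, then color) and the `/shop` base path are assumptions, not a recommendation from this thread:

```python
# Fixed, site-wide facet order: every combination of filters maps to
# exactly one URL, so search engines see a single crawl path per product.
FACET_ORDER = ["category", "brand", "color"]

def canonical_facet_url(base: str, facets: dict) -> str:
    """Build the one canonical URL for a set of applied facets."""
    parts = [facets[name] for name in FACET_ORDER if name in facets]
    return base.rstrip("/") + "/" + "/".join(parts)

# Two different click paths, one canonical URL:
a = canonical_facet_url("https://example.com/shop", {"color": "black", "category": "phones"})
b = canonical_facet_url("https://example.com/shop", {"category": "phones", "color": "black"})
# a == b == "https://example.com/shop/phones/black"
```

Any filter combination outside the planned tree would then be noindexed or canonicalized to its nearest planned URL.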
I hope this is clear and helps!
Gr.
Istvan
-
Thanks Istvan,
What solutions are you thinking of if you were to go against best practice?
-
Hey Steve,
Yes, there are possibilities for optimizing the dynamic pages, but I am on the same side as you: I would simply noindex the search result pages and focus my energy on other opportunities on the site.
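As a practical aside, the same noindex rule can also be applied at the HTTP layer with an `X-Robots-Tag` response header instead of a meta tag, which is handy when search results are rendered by a backend you can't easily edit templates for. A framework-agnostic sketch (the helper below is hypothetical; any web stack that lets you set response headers can do this):

```python
def search_page_headers(is_search_page: bool) -> dict:
    """Return response headers, adding X-Robots-Tag for search pages."""
    headers = {"Content-Type": "text/html; charset=utf-8"}
    if is_search_page:
        # Equivalent to <meta name="robots" content="noindex, follow">
        headers["X-Robots-Tag"] = "noindex, follow"
    return headers

print(search_page_headers(True)["X-Robots-Tag"])  # noindex, follow
```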
Gr.,
Istvan