Wrapping my head around an e-commerce anchor filter issue, need help
-
I'm having a hard time understanding how Google will handle this scenario, and I'd love to hear what you all think or suggest.
OK, a regular category page on the site in question looks like this: http://makeupaddict.me/6-skin-care (all fine and well). But a paginated or filtered category page looks like one of these: http://makeupaddict.me/6-skin-care#/page-2 or http://makeupaddict.me/6-skin-care#/price-391-1217
From my understanding, Google does not index a URL fragment unless it uses a hashbang (#!), but that doesn't mean they don't still crawl those links, correct? That's where the issue comes in: since the fragment is dropped from the URL and not indexed, when Google crawls a filtered or paginated page it gets different results. To the best of my understanding (and someone can correct me if I'm wrong), a fragment is not passed to the server the way a query string is. So if I'm using PHP and land on http://makeupaddict.me/6-skin-care or http://makeupaddict.me/6-skin-care#/price-391-1217 and use something like $_SERVER['REQUEST_URI'] to get the URL, both pages will return /6-skin-care, since the fragment is handled client-side.
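To make that concrete, here's a quick sketch (in Python rather than PHP, purely for illustration) of how a URL fragment is split off on the client side and never becomes part of the HTTP request the server sees:

```python
from urllib.parse import urlsplit

url = "http://makeupaddict.me/6-skin-care#/price-391-1217"
parts = urlsplit(url)

# The fragment lives only in the browser; it is never sent to the server.
print(parts.path)      # /6-skin-care
print(parts.fragment)  # /price-391-1217

# The request target a browser actually sends is just path (+ query if any):
request_target = parts.path + ("?" + parts.query if parts.query else "")
print(request_target)  # /6-skin-care
```

That's why both the filtered and unfiltered URLs look identical from the server's point of view.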
That being the case, does Google follow that same standard, or is it thought that they have custom logic that grabs the whole URL, fragment and all? And if they are crawling these pages with the fragment in the link but seeing them fragment-less, how are they handling the changing content?
-
If you are worried about Google following the filter links, you can add rel="nofollow" to those links and include a rel="canonical" tag on the filtered pages. See this article on faceted navigation: http://googlewebmastercentral.blogspot.ca/2014/02/faceted-navigation-best-and-5-of-worst.html
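For example, using the URLs from the question above, the markup could look something like this (a sketch, not your actual templates):

```html
<!-- In the <head> of the category page and any filtered/paginated variant: -->
<link rel="canonical" href="http://makeupaddict.me/6-skin-care" />

<!-- On the filter links themselves, in the sidebar navigation: -->
<a href="/6-skin-care#/price-391-1217" rel="nofollow">$391 &ndash; $1217</a>
```

Since the fragment variants all serve the same HTML document anyway, the canonical tag simply confirms to Google which URL to index.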
My understanding is that http://makeupaddict.me/6-skin-care#/price-391-1217 will be seen and interpreted as http://makeupaddict.me/6-skin-care. Filtered pages should be seen and interpreted as their unfiltered counterparts.
That being said, I would compare how both pages look in Webmaster Tools using the Fetch as Googlebot tool. That will show you how Googlebot sees the filtered page.
Ben