Best practice for product detail pages when all products are on one page
-
Hi there,
I have a page using Isotope with multiple products, each showing a small text excerpt; when you click an item it opens a detailed view without requiring a new page load. I've read some of the one-page posts but can't get my head around what's best SEO-wise when dealing with possible duplicate content. I guess one method could be to have the product list with small excerpts and all the details hidden in some inline JSON; when the user clicks an item, it opens the product and fills in the details from that JSON. The click action overrides the default a-tag behavior (e.g. with jQuery), so the a tag still points to a clean URL for a proper subpage with its own meta, h1, and so on that Google can follow. The jQuery handler enables navigation without a page reload, and I can update the document URL with pushState.
The subpage, if visited directly, includes the same animation as the master page but has an h1, paragraphs, and meta tags specific to that product, while keeping the same effects, navigation, and layout. Does anybody know a better way to do this on one-page sites when you want to SEO-optimize the detailed content?
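The approach described above can be sketched roughly as follows. This is a minimal illustration, not the poster's actual code: the selectors, the `productDetails` object, and the URL scheme (`/products/<slug>`) are all hypothetical. Crawlers see real links to real subpages; the click handler intercepts them and fills the detail view from inline JSON instead.

```javascript
// Inline JSON details, e.g. rendered into the page by the server.
// (Hypothetical data for illustration.)
var productDetails = {
  "blue-widget": { title: "Blue Widget", body: "Full description..." },
  "red-widget":  { title: "Red Widget",  body: "Full description..." }
};

// Pure lookup, kept separate from the DOM wiring so it is easy to test.
// "/products/blue-widget" -> the "blue-widget" entry, or null.
function detailsForUrl(url, details) {
  var slug = url.split("/").pop();
  return details[slug] || null;
}

// Browser wiring (jQuery); guarded so the sketch also loads outside a browser.
if (typeof document !== "undefined" && typeof jQuery !== "undefined") {
  jQuery(".product-list").on("click", "a.product", function (e) {
    var href = this.getAttribute("href");
    var d = detailsForUrl(href, productDetails);
    if (!d) { return; }            // unknown product: let the link load normally
    e.preventDefault();            // known product: open the detail view in place
    jQuery("#detail h1").text(d.title);
    jQuery("#detail .body").text(d.body);
    // Keep the address bar in sync with the crawlable subpage URL.
    history.pushState({ slug: href }, d.title, href);
  });
}
```

The key point of the design is that the `href` is always a real, crawlable URL, so JavaScript-off users and Googlebot get the standalone subpage while JavaScript-on users get the in-place view.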
-
Hello Carsten,
I am not a developer, but hopefully I can be of some assistance. If not, we'll leave the question open to see if anyone else can be more specific for you.
First, I came across a good list of sites using Isotope: http://isotope.metafizzy.co/ . If you scroll down to "Isotope in use" you'll see examples from Anthropologie, Rimmel London, Lexus, Biodroid, etc. Perhaps looking at how they handle things will help. For example, it looks like Anthropologie uses a unique identifying URL for each product that is its own .jsp file (e.g. 4130265414412.jsp), or is at least meant to appear that way.
A lot of AJAX sites use a hashbang (#!) to enable crawling of dynamically created content that would otherwise all live at the same URL.
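For reference, the hashbang convention worked by mapping `#!` URLs to an `_escaped_fragment_` query parameter that the server could answer with a static snapshot. The sketch below illustrates that mapping with a hypothetical helper; note that Google has since deprecated this AJAX crawling scheme in favor of clean pushState-style URLs like the ones described in the question.

```javascript
// Sketch of the (now deprecated) hashbang crawling convention.
// A URL like  http://example.com/#!/products/blue-widget
// was fetched by the crawler as
//             http://example.com/?_escaped_fragment_=%2Fproducts%2Fblue-widget
function escapedFragmentUrl(hashbangUrl) {
  var i = hashbangUrl.indexOf("#!");
  if (i === -1) { return hashbangUrl; }  // not a hashbang URL; leave unchanged
  return hashbangUrl.slice(0, i) +
         "?_escaped_fragment_=" +
         encodeURIComponent(hashbangUrl.slice(i + 2));
}
```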
The solution you described above sounds good, but I'd want to see it in action first. Be sure to view the page as Googlebot, and check how it looks in Google's cache preview in the SERPs, to make sure Google is treating it the way you want.
Good luck with this, and please let us know how it turns out so we can be more informed on the issue in the future.