OK to use rich snippets for the same product on multiple pages?
-
I am developing a new set of pages for a series of products which exist on separate subdomains linked to the root domain. The product pages on the subdomains have rich snippets: review count, review score, etc. The new pages I'm building out are for the same products, though on the root domain and with different content. I'm not comfortable marking those pages up with rich snippets too, given they will have the same review counts, scores, etc., though I would like to if it's viable. Any thoughts/opinions?
Thanks,
Andy
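(For context, the markup in question is schema.org Product/AggregateRating data. A minimal sketch of the payload, built in Python for illustration; the product name and figures below are placeholders, not the actual products or scores:)

```python
import json

# Hypothetical product data -- name, rating, and review count are
# placeholders, not real figures from the site in question.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Hotel Break",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.5",
        "reviewCount": "127",
    },
}

# Serialise to the JSON-LD payload that would be embedded in a <script>
# tag. The point of the question: the same figures would appear on both
# the subdomain page and the root-domain page for a given product.
jsonld = json.dumps(product, indent=2)
print(jsonld)
```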
-
Why don't you move everything to the root domain?
You can keep all the subdomains in existence and put in 301 redirects from each subdomain to the root domain; then, for any links that may be coming in, you just ask to get them changed.
The site's search feature can also be tweaked to look only at the root domain.
This would simplify the site a lot and make it much easier to manage in the long term.
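(If it helps, a subdomain-to-root 301 along those lines might look like this in an Apache .htaccess file on the subdomain; the domain names and target path are placeholders, so adjust to your own setup and test before deploying:)

```apache
# Hypothetical rule: permanently redirect every path on the subdomain
# to the corresponding path on the root domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^products\.example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/products/$1 [R=301,L]
```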
Just my opinion, but hope it helps.
P.S. Spend the money wisely; we're in a recession, don't you know?
-
Hi, thanks for the reply.
The reason for the new pages for the old products is twofold:
1) The subdomains don't rank as well as they could, simply because they are subdomains; they get little link juice, externally or internally, and there is internal opposition (fear) to moving them to the root domain. The root domain is high authority and ranks well, so showcasing the products on the root domain is a great option.
2) People use different search terms to find the product than are currently targeted on the subdomain. E.g., a hotel-in-London home page can be optimised for hotel/accommodation keywords, but it would also be useful to be found for 'short breaks in London', so you might choose to create a short-breaks page. Same product, two different approaches, and different content required. (Not an actual example.)
I'll take the 2p and spend it; I'm in the UK also.
-
My first question would be: why are you showcasing the same products on different pages? Do you have 'featured products', 'most popular' products, or 'special offers', for example, or is it something else?
After considering that, and assuming there's a logical, user-driven reason for doing it, I can't see a major problem as long as the rich snippets are consistent for each product.
Of course, depending on whether this is manual or automatic/CMS-driven, it can become a headache to keep product snippets updated in multiple locations, so on that front I wouldn't encourage it. Strike a balance, though: if you can encourage click-through from the root to the subdomain, then the snippets will be picked up anyway. Don't make running your site more hard work than is strictly necessary.
Just my £0.02 (I'm in the UK, so I can't give you 2 cents ;))
Related Questions
-
Noindex, nofollow instead of rel canonical on product pages
Hi all, we currently handle our product pages with rel canonical. We have one URL that is indexed, http://www.prams.net/cam-combi-family; the other colours have different URLs, like http://www.prams.net/cam-combi-family-3-in-1-pram-reversible-seat-car-seat-grey-d, which canonicalize to the indexed page. Google still crawls all those pages. For crawl-budget reasons we want to use "noindex, nofollow" instead on these pages (the pages for the other colours). Google would then crawl fewer pages more often? Does this make sense? Are there any downsides to doing it? Thanks in advance, Dieter
Intermediate & Advanced SEO | | Storesco1 -
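(For reference, the tag being weighed against rel canonical above is the robots meta tag; a sketch of what it would look like in the head of a colour-variant page. One caveat worth knowing: a crawler still has to fetch the page to see a meta tag, so the crawl-budget saving may be smaller than hoped.)

```html
<!-- Robots meta tag on a hypothetical colour-variant page. Google must
     still fetch the page to read this tag, so it stops indexing rather
     than stopping crawling outright. -->
<meta name="robots" content="noindex, nofollow">
```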
Prioritise a page in Google/why is a well-optimised page not ranking
Hello, I'm new to Moz Forums and was wondering if anyone out there could help with a query. My client has an ecommerce site selling a range of pet products, most of which have multiple items in the range for different size animals, i.e.:
[Product name] for small dog
[Product name] for medium dog
[Product name] for large dog
[Product name] for extra large dog
I've got some really great rankings (top 3) for many keyword searches such as:
'[product name] for dogs'
'[product name]'
But these rankings are for individual product pages, meaning the user is taken to a small-dog product page when they might have a large dog, or vice versa. I felt it would be better for the users (and for conversions and bounce rates) if there were a group page which showed all products in the range, which I could target at the keywords '[product name]' and '[product name] for dogs'. The page would link through to the individual product pages. I created some group pages in autumn last year to trial this and, although they are well optimised (score of 98 on Moz's optimisation tool), they are not ranking well. They are indexed, but way down the SERPs. The same group-page format has been used for the PPC campaign, and the difference to the retention/conversion of visitors is significant. Why are my group pages not ranking? Is it because my client's site already has good rankings for the target term and Google does not want to show another page of the site and muddy results?
Is there a way to prioritise the group page in Google's eyes? Or bring it to Google's attention? Any suggestions/advice welcome. Thanks in advance, Laura
Intermediate & Advanced SEO | | LauraSorrelle0 -
URLs: Removing duplicate pages using anchor?
I've been working on removing duplicate content on our website. There are tons of pages created based on size, but the content is the same. The solution was to create a page with 90% static content and 10% dynamic, which changed depending on the "size". Users can select the size from a dropdown box, so instead of 10 URLs, I now have one URL. Users can access a specific size by adding an anchor to the end of the URL (?f=size1, ?f=size2). For example:
Old URLs:
www.example.com/product-alpha-size1
www.example.com/product-alpha-size2
www.example.com/product-alpha-size3
www.example.com/product-alpha-size4
www.example.com/product-alpha-size5
New URLs:
www.example.com/product-alpha-size1
www.example.com/product-alpha-size1?f=size2
www.example.com/product-alpha-size1?f=size3
www.example.com/product-alpha-size1?f=size4
www.example.com/product-alpha-size1?f=size5
Do search engines read the anchor or drop them? Will the rank juice be transferred to just www.example.com/product-alpha-size1?
Intermediate & Advanced SEO | | Bio-RadAbs0 -
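(A side note on terminology for the question above: a `?f=size2` suffix is a query string, not an anchor/fragment. The distinction matters, because query strings are sent to the server and seen by crawlers as distinct URLs, while a true fragment like `#size2` never leaves the browser. A quick sketch using Python's standard library, with a placeholder domain:)

```python
from urllib.parse import urlparse

# A URL in the shape described in the question (placeholder domain).
url = "http://www.example.com/product-alpha-size1?f=size2"

parts = urlparse(url)
print(parts.query)     # the "?f=size2" part: sent to the server, crawlable
print(parts.fragment)  # empty here; a real anchor ("#size2") would appear
```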
Any experience with using programs to create UGC pages?
We have a new client (a mobile app) who created a program to create thousands of pages of "unique, user-generated content" for their website. An example: a person asks a question in the app's forum, and people respond. The client's program then compiles the question and responses into a unique, auto-generated page for the website. (I don't think the app is utilizing deep linking -- though I was going to recommend it -- so the app content is not indexed by search engines yet.) The pages are already created -- they are just not live on the site yet. I'm very skeptical. But the client says it's similar to what Stack Overflow does (or something like that). Basic example: say that a question for which the client wants to rank is, "What Are the Symptoms of Cancer?" I'd think that a quality, human-created, referenced, well-written, authoritative page would obviously rank more highly than a UGC page based on a forum discussion on that topic. But of course, doing that for hundreds of questions is costly and hard to scale -- both of which are concerns of the client (a startup with little money). Has anyone had any experience with this? It's the first time I've tackled such an issue. Thanks in advance for any thoughts!
Intermediate & Advanced SEO | | SamuelScott0 -
301 redirect for page 2, page 3 etc of an article or feed
Hey guys, we're looking to move a blog feed we have to a new static URL page. We are using 301 redirects, but I'm unsure of what to do regarding page 2, page 3, etc. of the feed. How do I make sure those URLs are being redirected as well? For example: moving FloridaDentist.com/blog/dental-tips/ to a new page URL, FloridaDentist.com/dental-tips. So, we are using a 301 on that old URL to the new one. My question is what to do with the other pages, like FloridaDentist.com/blog/dental-tips/page/3. How do we make sure that page is also 301'd to the new main URL?
Intermediate & Advanced SEO | | RickyShockley0 -
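(One common way to catch the paginated URLs in the question above is a single regex-based redirect rule. A sketch for Apache, assuming mod_alias is available; the paths are taken from the question, but test before deploying:)

```apache
# Redirect /blog/dental-tips/ and every paginated variant
# (/blog/dental-tips/page/2, /page/3, ...) to the new static URL
# with a permanent (301) redirect.
RedirectMatch 301 ^/blog/dental-tips(/page/[0-9]+)?/?$ https://floridadentist.com/dental-tips
```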
Using Canonical on home page
Our home page has the canonical tag pointing to itself (something from WordPress, I understand). Is there any positive or negative effect that anyone is aware of from having pages canonicaled to themselves?
Intermediate & Advanced SEO | | halloranc0 -
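(For reference, the self-referencing canonical being described looks like the sketch below; the URL is a placeholder. The general view is that this is harmless and mildly useful, since it consolidates any parameterised or tracking-tagged variants of the home page URL onto the clean version.)

```html
<!-- Self-referencing canonical as WordPress emits on the home page.
     Variants such as example.com/?utm_source=... consolidate to this URL. -->
<link rel="canonical" href="https://www.example.com/">
```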
Better UX or more Dedicated Pages (and page views)?
Hi, I'm building a new e-commerce site and I'm conflicting about what to do in my category pages. If we take for example a computer store.
Intermediate & Advanced SEO | | BeytzNet
I have a category of laptops and inside there are filters by brand (Samsung, HP, etc.). I have two options - either having the brand choice open a new dedicated page -
i.e. Samsung-Laptops.aspx or simply do a JQuery filter which gives a better and faster user experience (immediate, animated and with no refresh). **Which should I use? (or does it depend on the keyword it might target)? **
Samsung laptops / dell laptops / hp laptops - are a great keyword on there own! By the way, splitting Laptops.aspx to many sub category physical pages might also help by providing the site with many actual pages dealing with laptops altogether.0 -
Blocking Pages Via Robots, Can Images On Those Pages Be Included In Image Search
Hi! I have pages within my forum where visitors can upload photos. When they upload photos they provide a simple statement about the photo but no real information about the image, definitely not enough for the page to be deemed worthy of being indexed. The industry, however, is one that really leans on images, and having the images in Google Image search is important to us. The URL structure is like this: domain.com/community/photos/~username~/picture111111.aspx I wish to block the whole folder from Googlebot to prevent these low-quality pages from being added to Google's main SERP results. This would be something like this:
User-agent: googlebot
Disallow: /community/photos/
Can I disallow Googlebot specifically, rather than just using User-agent: *, which would then allow googlebot-image to pick up the photos? I plan on configuring a way to add meaningful alt attributes and image names to assist in visibility, but the actual act of blocking the pages and getting the images picked up... Is this possible? Thanks! Leona
Intermediate & Advanced SEO | | HD_Leona0
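(For what it's worth, the setup Leona describes can be expressed in robots.txt roughly as below; the path comes from the question. Crawlers follow the most specific user-agent group that matches them, so with an explicit Googlebot-Image group present, the image crawler would obey that group rather than the Googlebot one.)

```
User-agent: Googlebot
Disallow: /community/photos/

User-agent: Googlebot-Image
Allow: /community/photos/
```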