How to properly mark up reviews for Rich Snippets
-
I am trying to find out the best way to do this.
Do you use hReview?
Thanks,
-
Yes. You can find a helpful example of markup for multiple reviews at the bottom of the schema.org Review documentation.
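For reference, a minimal single review marked up with schema.org microdata might look something like this (the product name, reviewer, dates, and rating values are all placeholders, not taken from any real page):

```html
<!-- Minimal schema.org Review example using microdata.
     All names, dates, and rating values are placeholders. -->
<div itemscope itemtype="http://schema.org/Review">
  <span itemprop="itemReviewed" itemscope itemtype="http://schema.org/Product">
    <span itemprop="name">Acme Widget</span>
  </span>
  Reviewed by <span itemprop="author">Jane Doe</span>
  on <time itemprop="datePublished" datetime="2012-04-01">April 1, 2012</time>.
  <div itemprop="reviewRating" itemscope itemtype="http://schema.org/Rating">
    Rating: <span itemprop="ratingValue">4</span> out of
    <span itemprop="bestRating">5</span>
  </div>
  <p itemprop="reviewBody">Great widget; sturdy and well priced.</p>
</div>
```

The schema.org Review documentation linked above has the authoritative property names, so check any markup against it before relying on this sketch.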
Another resource that might be helpful is this SEOmoz post from last year: The Lowdown on Structured Data and Schema.org - Your Questions Answered!
-
If you had a testimonials page with 10 reviews on it, would you mark them all up with the schema.org markup?
-
hReview is just another way to markup a review. Specifically, it's the microformat way to markup a review. Here's the hReview specification: http://microformats.org/wiki/hreview
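To make the comparison concrete, here is roughly what the same review would look like as an hReview microformat. The class names come from the hReview specification; the content values are placeholders:

```html
<!-- The same review expressed with hReview microformat class names.
     All names, dates, and rating values are placeholders. -->
<div class="hreview">
  <span class="item"><span class="fn">Acme Widget</span></span>
  Reviewed by <span class="reviewer vcard"><span class="fn">Jane Doe</span></span>
  on <abbr class="dtreviewed" title="2012-04-01">April 1, 2012</abbr>.
  Rating: <span class="rating">4</span> out of 5.
  <p class="description">Great widget; sturdy and well priced.</p>
</div>
```

The structure is the same idea as schema.org markup; only the vocabulary (class attributes instead of itemprop/itemtype) differs.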
Google still supports microformat markup, but I recommend using schema.org markup because it is supported by Google, Bing, and Yahoo! (and it is the standard that those search engines are emphasizing for the future).
Once you've implemented the markup, use Google's Rich Snippets Testing Tool to check that it is working correctly.
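To answer the earlier question about a testimonials page: yes, you can mark up each testimonial as its own Review, and optionally summarize them with an AggregateRating. A rough sketch, again with placeholder names and values:

```html
<!-- Sketch of a testimonials page: individual Review items plus an
     AggregateRating summary. All counts and values are placeholders. -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Acme Widget</span>
  <div itemprop="aggregateRating" itemscope
       itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.5</span>/5 based on
    <span itemprop="reviewCount">10</span> customer reviews.
  </div>
  <div itemprop="review" itemscope itemtype="http://schema.org/Review">
    <span itemprop="author">Jane Doe</span>:
    <span itemprop="reviewBody">Great widget!</span>
  </div>
  <!-- ...repeat one review block per testimonial... -->
</div>
```

Google's review rich snippets are most often shown for the aggregate rating, so the AggregateRating block is usually what you want surfaced on a multi-testimonial page.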
-
Thanks. I had used that code before and it didn't work. Maybe I had something wrong. What is the difference with using hReview?
-
Hi Dave,
Here's a great resource for generating rich snippets for reviews: Schema Creator for "Review"
That page includes the official Google video about rich snippets for reviews, helpful pointers to Google help pages, the schema.org documentation for the Review object, and a code generator for review markup.
I hope that helps!