Has my Rich Snippet attempt passed the test?
-
Good morning from a 16°C, still-sunny Wetherby, UK...
For the first time I've dived into microformats, schema.org microdata - call it what you will, "rich snippets".
On the http://www.barrettsteel.com/ site, bottom left, I tweaked the address into a rich snippet. Here is what I did:
I then diligently tried to find out if it was valid by running it through http://www.google.com/webmasters/tools/richsnippets, but I'm not 100% clear whether it's passed.
So my question is: please can anyone verify whether the snippet data is valid?
Thanks in advance,
David
-
Thanks Martin
-
Looks OK to me too. Martin makes a good point about keeping it consistent across the net.
-
This is looking okay to me - the only issue being that the +44 and (0) are outside of your telephone itemprop. You want to include one or both inside it.
What you're trying to achieve is a good rich snippet and consistency with the other places your details are shown - e.g. Google Maps, local directories etc. And the area code is part of your local identity, so at the very least, put the zero inside the telephone itemprop.
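For illustration, here's a rough sketch of how the full number can sit inside the telephone itemprop. The organisation name, address and number below are placeholders, not the actual Barrett Steel markup:

```html
<!-- Illustrative sketch only: a schema.org Organization marked up with microdata.
     All names, addresses and numbers are placeholders. -->
<div itemscope itemtype="http://schema.org/Organization">
  <span itemprop="name">Example Steel Ltd</span>
  <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="streetAddress">1 Example Street</span>,
    <span itemprop="addressLocality">Wetherby</span>,
    <span itemprop="postalCode">LS22 1AA</span>
  </div>
  <!-- The whole number, including the +44 country code and (0),
       sits inside the telephone itemprop -->
  Tel: <span itemprop="telephone">+44 (0)113 123 4567</span>
</div>
```

Run whatever you publish back through the rich snippets testing tool to confirm the telephone value is picked up in full.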
Related Questions
-
URL Inspector, Rich Results Tool, GSC unable to detect Logo inside Embedded schema
I work on a news site and we updated our schema setup last week. Since then, valid Logo items have been dropping like flies in Search Console. Neither the URL Inspector nor the Rich Results test seems able to detect Logo on articles. Is this a bug, or can Googlebot really not see schema nested within other schema?
Previously, we had both Organization and Article schema, separately, on all article pages (with Organization repeated inside the publisher attribute). We removed the separate Organization, and now just have Article with Organization inside the publisher attribute. The code is valid in the Structured Data Testing Tool, but URL Inspection etc. cannot detect it. Example: https://bit.ly/2TY9Bct Here is this page in URL inspector:
By comparison, we also have Organization schema (un-nested) on our homepage. Interestingly enough, the tools can detect that no problem. That's leading me to believe either that nested schema is unreadable by Googlebot, OR that this is not an accurate representation of Googlebot and it's only unreadable by the testing tools. Here is the homepage in URL inspector:
In pseudo-code, our OLD schema looked like this: The NEW schema setup has the same Article schema, but the separate script for Organization has been removed.
We made the change to embed our schema for a couple of reasons: first, because Google's best practices say that if multiple schemas are used, Google will choose the best one, so it's better to just have one script; second, because Google's codelabs tutorial for schema uses a nested structure to indicate hierarchy of relevance to the page.
My question is: does nesting schemas like this make it impossible for Googlebot to detect a schema type that's two or more levels deep? Or is this just a bug with the testing tools?
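The nested arrangement the question describes looks roughly like this - a generic sketch with placeholder names and URLs, not the site's actual markup:

```html
<!-- Generic sketch of Article schema with Organization nested inside "publisher".
     All values are placeholders. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example headline",
  "datePublished": "2019-01-01",
  "publisher": {
    "@type": "Organization",
    "name": "Example Publisher",
    "logo": {
      "@type": "ImageObject",
      "url": "https://www.example.com/logo.png"
    }
  }
}
</script>
```

Here the Logo sits two levels deep (Article → publisher → logo), which is the situation the question is asking the tools to detect.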
Technical SEO | ValnetInc
-
What's the best way to test an Angular JS-heavy page for SEO?
Hi Moz community, Our tech team has recently decided to try switching our product pages to be JavaScript-dependent; this includes links, product descriptions and things like breadcrumbs in JS. Given my concerns, they will create a proof of concept with a few product pages in a QA environment so I can test the SEO implications of these changes. They are planning to use Angular 5 client-side rendering without any prerendering. I suggested Universal, but they said the lift was too great, so we're testing to see if this works. I've read a lot of the articles in this guide to all things SEO and JS, and am fairly confident in understanding when a site uses JS and how to troubleshoot to make sure everything is getting crawled and indexed. https://sitebulb.com/resources/guides/javascript-seo-resources/ However, I am not sure I'll be able to test the QA pages, since they aren't indexable and live behind a login. I will be able to crawl the pages using Screaming Frog, but that's generally regarded as what a crawler should be able to crawl, and not necessarily what Googlebot will actually be able to crawl and index. Any thoughts on this - is this concern valid? Thanks!
Technical SEO | znotes
-
Migrating Customers Off Domain, DO NOT Want to Pass Ranking Signals
Hey everyone! Thanks in advance for any help on this. I work for a SaaS company that has all of our customer apps and assets on our company domain. This has resulted in a lot of backlinks pointing to our domain, and a lot of pages indexed as well. I'm working with product to migrate all customers onto a separate domain, but a concern is that we need to still move the customer content to the new domains somehow without passing any of this backlink info. Am I correct in my assumption that if we 301 all of the apps and assets, all of that backlink info stays the same? What would be the best way to do this? Could we 302 everything and then wait like, 30 days and delete the 302? Would that still fix the problem, or does all of that backlink data "stick" after the 302 is deleted? Any additional thoughts would be extremely helpful!!
Technical SEO | rachelmeyer
-
A/B testing entire website VS Seo issues
I'm familiar with A/B testing variations of a page, but I'd like to A/B test a new design version of an e-commerce site. I'm wondering about the best way to test with SEO concerns in mind. This is what I have in mind right now - any suggestions? Use parameters to make version B different from version A. Redirect 50% of the users with a 302 (or would JavaScript be a better way?). Use noindex on the B pages. Use rel=canonical on the B pages pointing to the A version. In the end, 301 redirect all B pages to the A URLs. PS: We can't use a subdomain, and I don't want to use the robots.txt file to protect the new design from competitors. I'd love any suggestions and tips about it - thanks folks 🙂
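The noindex and canonical steps described in the question can be sketched like this, with placeholder URLs:

```html
<!-- Sketch: head of a version-B page (e.g. /product?design=b).
     Keeps it out of the index and points consolidation signals
     at the version-A URL. URLs are placeholders. -->
<head>
  <meta name="robots" content="noindex">
  <link rel="canonical" href="https://www.example.com/product">
</head>
```

One caveat worth noting: Google's guidance on testing generally recommends rel=canonical alone for variant URLs, since combining it with noindex sends mixed signals about the page; it may be safer to pick one or the other.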
Technical SEO | SeoMartin1
-
Redirect Without Passing Old Page Properties
Is there a way to redirect one page to another, e.g. test.com/ to test.com/home, without passing link juice or any other associated properties of the former to the latter?
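There is no mechanism that guarantees a redirect passes nothing, but one approach sometimes tried is a client-side redirect on a page kept out of the index. A hedged sketch, with the question's URLs:

```html
<!-- Sketch: head of test.com/, redirecting visitors to test.com/home.
     Google may still interpret a meta refresh as a redirect and consolidate
     signals anyway, so this is NOT a guaranteed way to withhold link equity. -->
<head>
  <meta name="robots" content="noindex, nofollow">
  <meta http-equiv="refresh" content="0; url=https://test.com/home">
</head>
```

If the goal is simply that the old URL contributes nothing, noindexing it and linking (rather than redirecting) to the new page is the more predictable option.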
Technical SEO | NTGproducts
-
Duplicate content due to credit card testing
I recently launched a site - http://www.footballtriviaquestions.co.uk - and the site uses PayPal. In order to test the PayPal functionality, I set up a zapto.org domain via a permanent IP service that points directly to the computer I wrote the website on. It appears that Google has now indexed the zapto.org website. Will this cause problems for my main website? The zapto.org website will pretty much contain content that is an exact duplicate of what is held on the main website. I've looked in Google Webmaster Tools for the main website and it doesn't mention any duplicate content, but I'm currently not in the top 50 rankings for "football trivia questions" on Google, despite SEOmoz ranking my home page with an A rating. The page does rank at position 16 in Yahoo and Bing. This seems odd to me, although I do have very few backlinks pointing to my site. If the duplicate content is likely to be causing me problems, what would be the best way to knock the zapto.org results out of Google?
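One option in this situation is a cross-domain canonical on every page of the test domain, pointing at the matching page on the live site. A sketch, assuming the test pages mirror the live paths one-to-one:

```html
<!-- Sketch: head of a page on the test (zapto.org) domain, telling Google
     which copy is authoritative. The path must match the live page. -->
<head>
  <link rel="canonical" href="http://www.footballtriviaquestions.co.uk/">
</head>
```

Serving a noindex on the test domain, or simply putting it behind password protection so Google cannot crawl it at all, is the more direct fix for getting those results out of the index.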
Technical SEO | ipr101
-
With rich snippets is it bad to add a city to the company name?
I have a client that is a franchise. Each franchise location has a different office address. Is it bad for me to do the following? COMPANY NAME of CITY ... .... There are about 10 franchisees. Should I use just the company name? Is the city in there going to be a negative?
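One common pattern is to keep the brand name clean and let the address carry the city, so each location still has its local identity in the markup. A sketch with placeholder details, not the client's actual markup:

```html
<!-- Sketch: one franchise location marked up as a LocalBusiness.
     The city lives in addressLocality rather than in the "name" itemprop.
     All values are placeholders. -->
<div itemscope itemtype="http://schema.org/LocalBusiness">
  <span itemprop="name">Company Name</span>
  <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="streetAddress">123 Main Street</span>,
    <span itemprop="addressLocality">Springfield</span>,
    <span itemprop="addressRegion">IL</span>
  </div>
</div>
```

This way the organisation name stays consistent across all ten franchisees, while the address data still differentiates each location for local search.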
Technical SEO | thomas.wittine