Penalty for Mixing Microdata with Metadata
-
The folks who built our website have insisted on including both microdata and metadata on our pages.
What we end up with is something that looks like this in the header:
<meta name="description" itemprop="description" content="Come buy your shoes from us, we've got great shoes.">
It seems to me that this would be a bad thing; however, I can't find any info leaning one way or the other.
Can anyone provide insight on this?
-
Worth noting that the meta description isn't one of those three markup styles. It is a completely different thing, so you aren't actually mixing schema formats in your example.
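To make the distinction concrete, here's a sketch (the URLs and copy are placeholders): a standard meta description and a schema.org microdata description are separate mechanisms, and `itemprop` only carries schema.org meaning inside an `itemscope`:

```html
<head>
  <!-- Standard meta description: used by search engines for snippets;
       not part of any schema.org markup format -->
  <meta name="description" content="Come buy your shoes from us, we've got great shoes.">
</head>
<body>
  <!-- Schema.org microdata: itemprop is only meaningful inside an itemscope -->
  <div itemscope itemtype="http://schema.org/Store">
    <meta itemprop="description" content="Come buy your shoes from us, we've got great shoes.">
  </div>
</body>
```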
-
Thanks for sharing that link. That post is very informative.
-
Thanks for answering so quickly.
When I said "bad thing" I meant that I don't see how such redundancy could ever be beneficial.
Thank you for your thoughts.
-
I would read this post for more information: http://www.seomoz.org/blog/schema-examples
The post discusses how Google used to support three different styles of markup but, with the creation of Schema.org, decided to use only that format going forward. Websites with existing markup in the older formats would still be okay, though.
Google also mentioned (as noted in the article above) that you should avoid mixing different markup formats on the same page, as it can confuse their parsers.
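To illustrate the "don't mix formats" point, a sketch (the product name is a placeholder): marking the same item up in both microdata and RDFa on one page is the kind of mixing Google warned against — pick one format and use it throughout:

```html
<!-- Discouraged: microdata for the item... -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Blue Widget</span>
</div>

<!-- ...plus RDFa for the same item elsewhere on the same page -->
<div vocab="http://schema.org/" typeof="Product">
  <span property="name">Blue Widget</span>
</div>

<!-- Prefer a single format (e.g. microdata only) across the whole page -->
```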
-
Why do you think this would be a bad thing? I'd question how much benefit will be gained in most areas by doing this, but I can't see it causing harm, and it's good to get in now rather than adding it later (assuming you've backed the right format!).
Related Questions
-
If I have an https page with an http img that redirects to an https img, is it still considered by google to be a mixed content page?
With Google starting to crack down on mixed content, I was wondering: if I have an https page with an http img that redirects to an https img, is it still considered by Google to be a mixed content page? E.g., in an old blog article there are images that weren't updated when the blog migrated to https, but were just 301ed to new https images. Is it still considered a mixed content page?
Algorithm Updates | David-Stern
-
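A sketch of the distinction this question hinges on (placeholder URLs): browsers and crawlers classify mixed content from the URL the page actually references, before any redirect is followed, so an http reference that 301s to https is still flagged — the fix is to update the reference itself:

```html
<!-- Page served over https://example.com/blog -->

<!-- Mixed content: the src is http as written; the 301 to https
     happens too late to change that classification -->
<img src="http://example.com/old-image.jpg" alt="old image">

<!-- Fixed: the reference points directly at the final https URL -->
<img src="https://example.com/new-image.jpg" alt="new image">
```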
Review Microdata if the product does not have a review
We have many products that do not have reviews, and they are causing warnings in Google's microdata reports. Is there anything you can do for products without a review so you don't get errors for aggregate review or review?
Algorithm Updates | jforthofer
-
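One common approach, sketched here with placeholder values: for products with no reviews, omit the `review` and `aggregateRating` properties entirely rather than emitting them with empty values (empty or placeholder values are what usually trigger the warnings):

```html
<!-- Product with no reviews: leave review/aggregateRating out entirely -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Example Widget</span>
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    <span itemprop="price" content="19.99">$19.99</span>
    <meta itemprop="priceCurrency" content="USD">
  </div>
  <!-- no itemprop="review" or itemprop="aggregateRating" here -->
</div>
```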
Do we have any risk or penalty for double canonicals?
Hi all, We have double canonicals: from page A to page B to page C. Will this be okay with Google, or do we definitely need to make it A to C and B to C? Thanks
Algorithm Updates | vtmoz
-
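For reference, a chained canonical (A → B → C) relies on Google following two hops; pointing both pages straight at the final URL is the safer shape. A sketch with placeholder URLs:

```html
<!-- On page A: point directly at the final canonical, not at B -->
<link rel="canonical" href="https://example.com/page-c">

<!-- On page B: likewise point at C, not onward through another hop -->
<link rel="canonical" href="https://example.com/page-c">
```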
Puzzling Penalty Question - Need Expert Help
I'm turning to the Moz Community because we're completely stumped. I work at a digital agency, our specialism being SEO. We've dealt with Google penalties before and have always found it fairly easy to identify the source of the problem when someone comes to us with a sudden keyword/traffic drop. I'll briefly outline what we've experienced.

We took on a client looking for SEO a few months ago. They had an OK site, with a small but high-quality and natural link profile, but very little organic visibility. The client is an IT consultancy based in London, so there's a lot of competition for their keywords. All technical issues on the site were addressed, pages were carefully keyword targeted (obviously not in a spammy way), and on-site content, such as services pages, which were quite thin, was enriched with more user-focused content. Interesting, shareable content was starting to be created and some basic outreach work had started.

Things were starting to pick up. The site started showing and growing for some very relevant keywords in Google, a good range and at different levels (mostly sitting around page 3-4) depending on competition. Local keywords, particularly, were doing well, with a good number sitting on page 1-2. The keywords were starting to deliver a gentle stream of relevant traffic, and user behaviour on-site looked good.

Then, as of the 28th September 2015, it all went wrong. Our client's site virtually dropped from existence as far as Google was concerned. They lost all of their keywords; our client even dropped hundreds of places for their own brand name. They also lost all rankings for super-low-competition, non-business terms they were ranking for.

So, there's the problem. The keywords have not shown any sign of recovery yet and we're, understandably, panicking. The worst thing is that we can't identify what has caused this catastrophic drop. It looks like a Google penalty, but there's nothing we can find that would cause it:

- There are no messages or warnings in GWT.
- The link profile is small but high quality.
- When we started, the content was a bit on the thin side, but this doesn't really look like a Panda penalty, and seems far too severe.
- The site is technically sound.
- There are no duplicate content issues or plagiarised content.
- The site is being indexed fine.
- Moz gives the site a spam score of 1 (out of 11, I think that's right).
- The site is on an OK server, which hasn't been blacklisted or anything.

We've tried everything we can to identify a problem, and that's where you guys come in. Any ideas? Anyone seen anything similar around the same time? Unfortunately, we can't share our client's site name/URL, but feel free to ask any questions you want and we'll do our best to provide info.
Algorithm Updates | MRSWebSolutions
-
[G Penalty?] Significant Traffic Drop From All Sources
My client's traffic started to significantly decrease around Nov 21 (Panda update 22). This includes traffic from all sources: search engines (G, B, & Y!), direct, AND referral. At first we thought it was a G penalty, but G answered our reconsideration request by stating that no manual penalty had occurred. It could be an algo penalty, but again, the site has been hit across all sources.

The client has done zero backlinking; it is all natural. No spam, etc. All of his on-site SEO is solid (700+ pages indexed, all unique content, unique titles and descriptions). On Oct 16, he switched from his old URL to a new URL and did proper redirects. (Last year, Dec 2011, he switched his CMS to Drupal, and although there was a temporary decrease in traffic, it showed recovery within a month or so.) He does zero social on his site and he has many ads above the fold.

Nevertheless, the traffic decrease is not source-specific. In other words, all sources have decreased since Nov 21, 2012 and have not recovered. What is going on? What could explain a decrease in traffic across all sources? This would be easy to answer if it was only a Google organic decrease, but since direct and referral have also been hit, we cannot locate the problem.

Please share your personal experiences as well as advice on where we should look. Could this be negative SEO? Where would we look? ANY ADVICE IS WELCOME !!!! Every bit counts. Thanks!!
Algorithm Updates | GreenPush
-
Specific Page Penalty?
Having trouble figuring out why one of our pages is not ranking in SERPs; its on-page optimisation looks decent to me. I checked by using the gInfinity extension and searching for the page URL. Can one page be penalised in Google (.ie / .com) while the rest of the website is not penalised? The (possibly) penalised page is showing in Google Places in SERPs; I assume it would not show there if it was penalised. Would appreciate any advice. Thanks
Algorithm Updates | notnem
-
How to Link a Network of Sites w/o Penguin Penalties (header links)
I work for a network of sites that offer country-exclusive content. The content for the US will be different than for Canada, Australia, the UK, etc., but on the same subjects. To make navigation easy, we have included in the header of every page a drop-down with links to the other countries, like what most of you do with Facebook/Twitter buttons. Now every page on every site has the same links, with the same anchor text. Example:

- Penguins in Canada
- Penguins in Australia
- Penguins in the USA

Because every page of every site has the same links (it's in the header), the "links containing this anchor text" ratio is through the roof in Open Site Explorer. Do you think this would be a reason for Penguin penalization? If you think this would hurt, what would you suggest? Nofollow links? Remove the links entirely and create a single page of links? Other suggestions?
Algorithm Updates | BeTheBoss
-
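One option worth considering alongside nofollow, sketched here with placeholder domains: declaring the country versions as alternates with hreflang annotations tells Google the pages are localized variants of each other, rather than relying solely on sitewide anchor-text links:

```html
<!-- In the <head> of each country version, listing all variants -->
<link rel="alternate" hreflang="en-us" href="https://example.com/penguins">
<link rel="alternate" hreflang="en-ca" href="https://example.ca/penguins">
<link rel="alternate" hreflang="en-au" href="https://example.com.au/penguins">
<link rel="alternate" hreflang="en-gb" href="https://example.co.uk/penguins">
```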
Product microdata from Schema.org
An article (http://www.websitemagazine.com/content/blogs/posts/archive/2011/11/18/step-up-your-e-commerce-seo-game-with-product-microdata.aspx?utm_source=newsletter&utm_medium=email&utm_campaign=newsletter) claims that using this product microdata (http://schema.org/Product) might help product pages rank better. Do you have any experience using these tags, and would it be worth the time to implement them on a site with thousands of products? Would it make sense to selectively implement them on specific products that actually have a good chance of ranking high instead?
Algorithm Updates | pbhatt
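For reference, a minimal schema.org/Product microdata sketch of the kind the article describes (all values are placeholders):

```html
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Example Running Shoe</span>
  <img itemprop="image" src="/img/shoe.jpg" alt="Example Running Shoe">
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    <span itemprop="price" content="59.99">$59.99</span>
    <meta itemprop="priceCurrency" content="USD">
    <link itemprop="availability" href="http://schema.org/InStock">
  </div>
</div>
```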