Should I delete 'data highlighter' markup in Webmaster Tools after adding schema.org markup?
-
LEDSupply.com is my site, and before becoming familiar with schema markup I used the Data Highlighter in Webmaster Tools to mark up as much of the site as I could. Now that schema is set up, I'm wondering if having both active is bad, and I'm thinking I should delete the previous work done with the Data Highlighter tool.
To delete or not to delete? Thank you!
-
Ah, OK. My mistake, I didn't drill down enough. One thing I did notice: you have authorship markup on those product pages as well. That should be removed.
According to Google's guidelines, product pages that are not specifically written/constructed by an "author" should not carry that markup. rel="publisher" is the only necessary markup for non-blog or non-article content.
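For illustration only (the Google+ profile URLs below are hypothetical placeholders, not from your site), the change on a product page would look something like this:

```html
<!-- Remove authorship markup like this from product pages: -->
<!-- <link rel="author" href="https://plus.google.com/112233445566/posts"/> -->

<!-- Keep only the publisher link for non-blog/article content: -->
<link rel="publisher" href="https://plus.google.com/+YourBrandPage"/>
```

The publisher link points at the brand's page rather than an individual's profile, which is why it's the appropriate choice for product content.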
The schema markup you've implemented looks good in the page source and checks out as correctly implemented (without any duplicates) in Google's Structured Data Testing Tool (found in Google Webmaster Tools). It appears the Data Highlighter markup is not causing duplicates.
I'd recommend double-checking all the product pages where you've added schema on top of the original Data Highlighter markup. There may be duplicates, there may not. To be honest, I've always gone straight to schema.org, but checking for duplicates should be the only thing you need to worry about.
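As a rough sketch of what a clean result looks like (the product name and price here are made up, not taken from your site), a correctly marked-up product should report each property exactly once in the testing tool:

```html
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">12V LED Strip Light</span>
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    <!-- content attributes carry the machine-readable values -->
    <span itemprop="priceCurrency" content="USD">$</span>
    <span itemprop="price" content="19.99">19.99</span>
  </div>
</div>
```

If the tool ever reports the same field twice for a page (once from your on-page schema and once from another source), that's the kind of duplication to clean up.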
Good luck!
-
Only on the product pages. It's live.
-
Where have you added schema markup? What pages?
After briefly scanning a few pages on your site, I'm not seeing any code from schema.org in the page source. Has this been implemented/gone live yet?
Related Questions
-
How does educational organization schema interact with Google's knowledge graph?
Hi there! I was just wondering whether the more granular Organization schema types, like EducationalOrganization (http://schema.org/EducationalOrganization) and CollegeOrUniversity (http://schema.org/CollegeOrUniversity), work the same way when it comes to pulling data into the Knowledge Graph. I've typically always used the Organization schema for customers, but was wondering if there are any drawbacks to going deep into the schema hierarchy. Cheers 😄
Intermediate & Advanced SEO | Corbec8880 -
Schema for Knowledge Graph Card
Hello, all! I have a client who's Fortune 500 and has all the good "stuff" associated with pulling the proper info into the Knowledge Graph/company information box (Wikipedia, strong citations, etc.), but the card is showing the wrong company type! Has anyone had experience with influencing this via schema or anything else clever? Note that the correct company type is referenced in all the usual spots. Thanks!
Intermediate & Advanced SEO | SimpleSearch1 -
Would adding an abbreviation to a title hurt?
I have been trying to figure this out: https://moz.com/community/q/what-should-i-do-with-old-e-commerce-item-pages I added "n/a" to the end of the page titles so I could figure out how these pages were performing. Since I added them, my organic traffic seems to have dropped. It has only been a few days, so maybe it is an anomaly. Everything else has stayed the same; would this cause an organic traffic drop?
Intermediate & Advanced SEO | EcommerceSite0 -
Google Penalty Checker Tool
What is the best tool to check for a Google penalty, and to see which penalty hit the website?
Intermediate & Advanced SEO | Michael.Leonard0 -
Should I set a max crawl rate in Webmaster Tools?
We have a website with around 5,000 pages, and for the past few months we've had our crawl rate set to maximum (we'd just started paying for a top-of-the-range dedicated server at the time, so performance wasn't an issue). Google Webmaster Tools alerted me this morning that the crawl rate setting has expired, so I'd have to manually set the rate again. In terms of SEO, is having a max rate a good thing? I found this post on Moz, but it's dated from 2008. Any thoughts on this?
Intermediate & Advanced SEO | LiamMcArthur0 -
Google Webmaster Tools Set-up
Hello! I have set up my URL, www.morganlindsayphotography.com, in Google Webmaster Tools. I have also added the instance of morganlindsayphotography.com (the non-www version). My blog is located at www.morganlindsayphotography.com/blog/ My question is: do I add a sitemap for www.morganlindsayphotography.com/blog/ as well as for www.morganlindsayphotography.com? Thank you!
Intermediate & Advanced SEO | 393morgan0 -
www vs. non-www differences in crawl errors in Webmaster Tools...
Hey All, I have been working on an eCommerce site for a while that, to no avail, continues to make me want to hang myself. To make things worse, the developers just do not understand SEO, and it seems every change they make just messes up work we've already done. Job security, I guess. Anywho, most recently we realized they had some major sitemap issues, as almost 3,000 pages were submitted but only 20 or so were indexed. Well, they updated the sitemap, and although all the pages are now properly indexing, I now have 5,000+ "not found" crawl errors in the non-www version of WMT and almost none in the www version of the WMT account. Anyone have insight as to why this would be?
Intermediate & Advanced SEO | RossFruin0 -
Adding Orphaned Pages to the Google Index
Hey folks, How do you think Google will treat adding 300k orphaned pages to a 4.5-million-page site? The URLs would resolve, but there would be no on-site navigation to those pages; Google would only know about them through sitemap.xml files. These pages are super low competition. The plot thickens: what we are really after is to get 150k real pages back on the site. These pages do have crawlable paths on the site, but in order to do that (for technical reasons) we need to push these other 300k orphaned pages live (it's an all-or-nothing deal).
a) Do you think Google will have a problem with this, or will it just decide not to index some or most of these pages since they are orphaned?
b) If these pages will just fall out of the index or not get included, and have no chance of ever accumulating PR anyway since they are not linked to, would it make sense to just noindex them?
c) Should we not submit sitemap.xml files at all, take our 150k, and just ignore these 300k and hope Google ignores them as well since they are orphaned?
d) If Google is OK with this, maybe we should submit the sitemap.xml files and keep an eye on the pages; maybe they will rank and bring us a bit of traffic, but we don't want to do that if it could be an issue with Google.
Thanks for your opinions, and if you have any hard evidence either way, especially thanks for that info. 😉
Intermediate & Advanced SEO | irvingw0