H1 and Schema Codes Set Up Correctly?
-
Greetings:
It was pointed out to me that the h1 tags on my website (www.nyc-officespace-leader.com) all had exactly the same text and that duplication may be contributing to the very low page authority for most URLs.
The duplicate h1 appears at line 54 (see below) of the home page, www.nyc-officespace-leader.com:
<h1 itemscope itemtype="http://schema.org/LocalBusiness" style="position:absolute;top:-9999em;">
<span itemprop="name">Metro Manhattan Office Space</span>
But the above refers to schema, so is this really a duplicate H1, or is there an exception if the H1 is within a schema?
Also, I was told that the company street address, city, and state were set up incorrectly as part of an alt tag. However, these items also appear as schema in lines 49-68, shown below:
It would be dangerous for me to perform surgery on the code without being certain about these key items!! I could ask my developer, but they may be uncomfortable given that they set this up in the first place. So the views of neutral professionals would be highly welcome!
<div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
<span itemprop="streetAddress">347 5th Ave #1008</span>
<span itemprop="addressLocality">New York</span>
<span itemprop="addressRegion">NY</span>
<span itemprop="postalCode">10016</span>
</div>
<div itemprop="brand" itemscope itemtype="http://schema.org/Organization">
-
For suggestion 1, I should clarify that you already are using Microdata. Your Microdata is repeating what is already in the page, rather than "tagging" your existing content inline. Microdata is a good tool to use if you are able to tag pieces of content as you are communicating it to a human reader; it should follow the natural flow of what you are writing to be read by humans. This guide walks you through how Microdata can be implemented inline with your content, and it's worth reading through to see what's available and how to step forward with manual implementation of Schema.org with confidence.
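To make "tagging content inline" concrete, here is a rough sketch of what that could look like, using the business details from your hidden block. The wrapping elements (`header`, `p`) are assumptions for illustration only, not your actual markup:

```html
<!-- Hypothetical sketch: microdata attributes added to content the visitor already sees -->
<header itemscope itemtype="http://schema.org/LocalBusiness">
  <h1 itemprop="name">Metro Manhattan Office Space</h1>
  <p itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="streetAddress">347 5th Ave #1008</span>,
    <span itemprop="addressLocality">New York</span>,
    <span itemprop="addressRegion">NY</span>
    <span itemprop="postalCode">10016</span>
  </p>
</header>
```

Note there is only one h1 here: the visible one doubles as the machine-readable one, so nothing is hidden off-screen or duplicated.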
Will these solutions remove the duplicate H1 tag?
Whatever CMS or system you are using to produce the hidden microdata markup needs to be changed to remove that hidden block entirely. The markup of the content itself is good, but it needs to be combined with your existing content or implemented with JSON-LD so that it is not duplicating the HTML you are showing the user.
Are these options relatively simple for an experienced developer? Is one option superior to the other?
Both should be, but it depends on your strategy. Are you hand-rolling your schema.org markup? Is somebody going into your content and wrapping the appropriate content with the correct microdata? This can be a pain in the butt and time-consuming, especially if they're not tightly embedded with your content production team.
I downloaded the HTML and reviewed the Microdata implementation. I don't mean to sound unkind, but it looks like computer-generated HTML, and it's pretty difficult to read and manipulate while keeping the tags matched properly.
Is one option superior to the other?
Google can read either without issue; they recommend JSON-LD (source).
In your case, I'd also recommend JSON-LD because:
- Your investment in Microdata is not very heavy and appears easy enough to unwind
- The content you want to show users isn't exactly inline with the content you want read by crawlers anyway (for example, your address isn't on the page and visible to readers)
- It's simple enough to write by hand, and there exist myriad options for embedding programmatically generated Schema.org content in JSON-LD format
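As a rough sketch (for illustration only, built from the details in your hidden markup), the equivalent JSON-LD data would look something like this:

```json
{
  "@context": "http://schema.org",
  "@type": "LocalBusiness",
  "name": "Metro Manhattan Office Space",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "347 5th Ave #1008",
    "addressLocality": "New York",
    "addressRegion": "NY",
    "postalCode": "10016"
  }
}
```

Because this lives in a script block rather than in the rendered HTML, it carries no h1 (or any other visible element) and can't create a duplicate-heading problem.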
Please review this snippet comparing a Microdata solution and a JSON-LD solution side by side.
PLEASE DO NOT COPY AND PASTE THIS INTO YOUR SITE. It is meant for educational and demonstrative purposes only.
There are comments inline that should explain what's going on: https://gist.github.com/TheDahv/dc38b0c310db7f27571c73110340e4ef
-
Hi Again:
Will option #1 (keeping existing microdata) remove the duplicate h1 tag? Your suggestion listed below:
"So, wherever the <h1> tag with the company name lives that is rendered and shown to the user, add the "LocalBusiness" itemscope to the parent tag that surrounds it and its content. Basically you'd merge your Schema.org code with the user-facing content."
-
Hi David:
Schema was added to the site discreetly to provide location data to Google.
You suggested 2 potential solutions:
1. Use Microdata...
2. Use JSON-LD...
Will these solutions remove the duplicate H1 tag?
We are concerned that the low page authority of our URLs (80% have a score of 1) is caused by the duplicate H1s on each page.
Are these options relatively simple for an experienced developer? Is one option superior to the other?
Thanks for your patience in explaining these options, my programming understanding is limited.
Alan -
I see that you're using CSS to get that markup into the page, but definitely not visible to the user. Am I interpreting that right? If so, it seems like your goal is to get some Schema.org tags into the page to mark up your content as a LocalBusiness.
I have 2 ideas for you:
Use microdata (the markup format you're using now) to mark up your tags inline with your existing content. So, wherever the <h1> tag with the company name lives that is rendered and shown to the user, add the "LocalBusiness" itemscope to the parent tag that surrounds it and its content. Basically you'd merge your Schema.org code with the user-facing content.
Use JSON-LD markup instead. You can get the same information "repeated," but the JSON-LD markup isn't rendered for users. jsonld.com has a great page with a template you can copy and adjust to suit your business. If you go this route, remove the microdata-laden HTML hidden off the page with the inline CSS and replace it with the JSON-LD wrapped in a <script type="application/ld+json"> tag. Google also has some great documentation around the LocalBusiness type.
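For example (a sketch only, not a drop-in replacement for your site), the script block you'd put in the page in place of the hidden HTML might look like:

```html
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "LocalBusiness",
  "name": "Metro Manhattan Office Space",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "347 5th Ave #1008",
    "addressLocality": "New York",
    "addressRegion": "NY",
    "postalCode": "10016"
  }
}
</script>
```

You can paste the finished block through Google's structured data testing tool before deploying it to confirm the LocalBusiness type is detected.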
Hope that helps!