Proper use of the H1 tag
-
OK, I see this happening all the time. I get my hands on a new website and there is one of four header tag issues:
1. There are no H1 tags at all
2. There are multiple H1 tags on the same page
3. Every page has an identical H1 tag
4. Header tags are used all out of order
Do any of these have a negative impact on rankings? I've always tried to have one H1 tag on each page, make it the first header tag, and make it unique to each page. Is this a waste of time? Could improper header tag use hurt a website?
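For anyone wanting to check these four issues at scale, here is a minimal sketch using only Python's standard-library `html.parser` (the sample page and the exact issue wording are made up for illustration):

```python
from html.parser import HTMLParser

class HeadingCollector(HTMLParser):
    """Records h1-h6 tags in the order they appear in the document."""
    def __init__(self):
        super().__init__()
        self.headings = []  # e.g. ["h1", "h2", "h3"]

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            self.headings.append(tag)

def audit(html):
    """Flag the four header-tag issues described above for a single page."""
    parser = HeadingCollector()
    parser.feed(html)
    h = parser.headings
    issues = []
    if h.count("h1") == 0:
        issues.append("no h1")
    if h.count("h1") > 1:
        issues.append("multiple h1s")
    if h and h[0] != "h1":
        issues.append("first heading is not h1")
    # "Out of order" here means a heading jumps more than one level deeper
    # than the heading before it (e.g. h1 straight to h4).
    for prev, cur in zip(h, h[1:]):
        if int(cur[1]) - int(prev[1]) > 1:
            issues.append(f"{cur} follows {prev}")
    return issues

page = "<html><body><h2>Intro</h2><h1>Title</h1><h1>Again</h1><h4>Detail</h4></body></html>"
print(audit(page))  # → ['multiple h1s', 'first heading is not h1', 'h4 follows h1']
```

Checking uniqueness of H1s across pages (issue 3) would just mean collecting each page's H1 text into a set and comparing counts.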
-
Content, alt tags, and anchor text to internal pages are always favorites :)
-
So that basically means the H1 is part of the design process and no longer part of SEO.
Is there any on-page SEO element still important for ranking? It looks like content and links from outside sources are the only things that count.
-
great links
-
H tags have little influence on rankings at the moment. However, as good writing practice, using a single H1 tag as the first header tag is always recommended. H1s should not all be the same; each should represent the page contents so the reader has a visual cue that (s)he is on the correct page.
-
The latest SEOmoz correlation data showed that H1 use is not a significant ranking factor (or at least not as significant a ranking factor as once thought). I'll try and dig that up for you.
UPDATE - Here's the links I was thinking of - http://www.seomoz.org/blog/google-vs-bing-correlation-analysis-of-ranking-elements - and - http://www.seomoz.org/blog/whiteboard-friday-the-biggest-seo-mistakes-seomoz-has-ever-made - BONUS - http://www.seomoz.org/blog/bing-vs-google-prominence-of-ranking-elements
Matt Cutts has done a video on this as well - http://www.youtube.com/watch?v=GIn5qJKU8VM
Basically, use the H1 where it makes sense. I try to avoid it in the logo of the page and have just one H1 text header, but there have been times when a couple more may have made sense.
-
Hi Dan,
you can see a Matt Cutts video about this (see the attachment). In my opinion it's better to focus on one H1 per page in order to focus the crawler's attention on only a few main keywords. All H1s should be different. Watch out for pagination, since you can easily duplicate meta tags and H1s; to avoid that, use the robots meta NOINDEX,FOLLOW and the canonical tag.
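Luca's pagination point can be sketched as follows: on page 2 and beyond of a paginated listing, emit a canonical pointing at the main page plus a robots meta that keeps the duplicate out of the index while still letting the crawler follow its links. This is an illustrative helper with a hypothetical URL, not anyone's production code:

```python
def pagination_head(base_url, page):
    """Build the head tags for one page of a paginated listing.

    Page 1 is the canonical version; pages 2+ get noindex,follow so their
    duplicated titles/H1s never compete in the index, while their links
    are still crawled.
    """
    tags = [f'<link rel="canonical" href="{base_url}">']
    if page > 1:
        tags.append('<meta name="robots" content="noindex,follow">')
    return "\n".join(tags)

# Hypothetical listing URL, page 2 of the series:
print(pagination_head("https://example.com/widgets", 2))
```

Note that canonicalizing every paginated page to page 1 only makes sense when, as Luca describes, the paginated pages are near-duplicates rather than distinct content.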
Luca
-
I have not seen a negative impact from H1 tags so much as I have seen a positive one. H1 tag optimization was a popular technique a few years ago, but lately it has been downgraded in priority by search engines. The top ranking factors are:
- Domain authority
- Link quality and quantity
- Social signals
- Title tag keyword usage
- Anchor text from backlinks
http://www.seomoz.org/article/search-ranking-factors
Note: a keyword in the title is far more important than a keyword in the H1 tag.
-
Good question - I'm also interested in getting an answer on this. I am also trying to use one single, unique H1 tag on each page.
Related Questions
-
When do you use article markup for AMP pages?
Intermediate & Advanced SEO | DeptAgency
Hi all! For a healthcare website we have set up AMP. Google Search Console suggests using article markup for several pages and I am not sure if this is correct. There are two kinds of pages:
1. News pages
2. Information pages, for example: symptoms of alcohol addiction or binge eating disorder
There's no doubt the article markup is correct for the news pages, but I am not sure about the information pages. Do you suggest implementing article markup on these pages as well, or only on real news/blog posts? Hope you can help me out. Thank you in advance and happy holidays! Regards, Anouk van de Velde
-
Canonical & rel=NOINDEX used on the same page?
Intermediate & Advanced SEO | EasyStreet
I have a real estate company, www.company.com, with approximately 400 agents. When an agent gets hired we allow them to pick a URL, which we then register and manage, for example www.AGENT1.com. We then 301 redirect that agent domain to a subdomain of our main site: Agent1.com 301's to agent1.company.com. Each page on the agent subdomain is canonicalized back to the corresponding page on www.company.com; for example, agent1.company.com canonicalizes to www.company.com.
What happened is that Google indexed many URLs on the subdomains, and it seemed like Google ignored the canonical in many cases. Although these URLs were being crawled and indexed by Google, I never noticed any of them rank in the results. My theory is that Google crawled the subdomain first, indexed the page, and then later crawled the main URL. At that point in time the two pages actually looked quite different from one another, so Google did not recognize/honor the canonical. For example: agent1.company.com/category1 gets crawled on day 1, and company.com/category1 gets crawled 5 days later. The content (recently listed properties for sale) on these category pages changes every day. If Google crawled both the subdomain and the main domain on the same day, the content would look identical; if the URLs are crawled on different days, the content will not match.
We had some major issues (duplicate content and site speed) on our www.company.com site that needed immediate attention. We knew we had an issue with the agent subdomains and decided to block crawling of the subdomains in the robots.txt file until we got the main site "fixed". We have seen a small decrease in organic traffic from Google to our main site since blocking crawling of the subdomains, whereas with Bing our traffic has dropped almost 80%. After a couple of months we have now got our main site mostly "fixed", and I want to figure out how to handle the subdomains in order to regain the lost organic traffic. My theory is that these subdomains have some link juice that is being wasted by the robots.txt block. Here is my question:
If we put a robots rel=NOINDEX on all pages of the subdomains and leave the canonical (to the corresponding page of the company site) in place on each of those pages, will link juice flow to the canonical version? Basically I want the link juice from the subdomains to pass to our main site, but I do not want the pages competing for a spot in the search results with our main site. Another thought I had was to place the NOINDEX tag only on the category pages (the ones that change every day) and leave it off the product pages (property detail pages, which rarely change). Thank you in advance for any insight.
-
Pagination loading with AJAX. Should I change this?
Intermediate & Advanced SEO | teconsite
Hello, while I was checking this site, http://www.disfracessimon.com/disfraces-adultos-16.html, I found that the pagination works this way: http://www.disfracessimon.com/disfraces-adultos-16.html#/page-2, http://www.disfracessimon.com/disfraces-adultos-16.html#/page-3, with content loaded using AJAX. So Google is not getting the paginated results. Is this a big issue, or is there no problem? Should I create a link to See All Products, or is it not a big issue? Thank you!
-
Hreflang tags with errors in Google Webmaster Tools
Intermediate & Advanced SEO | GlobeCar
Hello, Google Webmaster Tools is giving me errors with hreflang tags that I can't seem to figure out... I've double-checked everything: all the alternate and canonical tags, everything seems to match, yet Google finds errors. Can anyone help?
International Targeting | Language > 'fr' - no return tags: URLs for your site and alternate URLs in 'fr' that do not have return tags.
Status: 7/10/15 - 24 hreflang tags with errors. Please see attached pictures for more info... Thanks, Karim
-
Use a language extension or a keyword as an extension?
Intermediate & Advanced SEO | WeAreDigital_BE
If it's technically necessary to add an extension to a domain URL, should I use brand.nl/nl or brand.nl/keyword as the homepage? In my opinion it's better to use the language extension, as it is much easier for other websites to link to. The client could make a separate page with content about the keyword. I also think it's much more difficult for direct traffic to access the website with this long URL. Any other thoughts?
-
Using a meta header vs robots.txt
Intermediate & Advanced SEO | evan89
Hey Mozzers, I am working on a site that has search-friendly parameters for their faceted navigation; however, this makes it difficult to identify the parameters in a robots.txt file. I know that using the robots.txt file is highly recommended and powerful, but I am not sure how to do this when facets use common words such as sizes. For example, a filtered URL may look like www.website.com/category/brand/small.html. Brand and size are both facets; brand is a great filter, and size is very relevant for shoppers, but many products include "small" in the URL, so it is tough to isolate that filter in robots.txt (I hope that makes sense). I am able to identify problematic pages and edit the meta head, so I can add a robots noindex on any page that is causing these duplicate issues. My question is: is this a good idea? I want bots to crawl the facets, but indexing all of the facets causes duplicate issues. Thoughts?
-
Anchor tag around table / block
Intermediate & Advanced SEO | Richline_Digital
Our homepage (here) has four large promotional sections taking up most of the real estate. Each promo section has an image and styled text. We want each promo section to link to the appropriate page, so we created the promo sections and wrapped each in an anchor. That works fine for users, but I tried viewing our site in a text-only browser (Lynx) and couldn't follow those links! My fear is that GoogleBot can't follow them either and doesn't know what anchor text to pull. So, my question: what's the best way to make this entire block clickable, but still have it crawlable by robots? Or is our current implementation OK?
(The simplified code sample referenced in the question did not survive formatting; it showed promo blocks such as "All Diamonds Extra 20% Off" and "Jessica Simpson Extra 20% Off" wrapped in anchors like http://jessicasimpson.jewelry.com/shop/.)
-
All page files in root? Or use directories?
Intermediate & Advanced SEO | Peter264
We have thousands of pages on our website (news articles, forum topics, download pages, etc.), and at present they all reside in the root of the domain /. For example:
/aosta-valley-i6816.html
/flight-sim-concorde-d1101.html
/what-is-best-addon-t3360.html
We are considering moving over to a new URL system using directories. For example, the above URLs would become:
/images/aosta-valley-i6816.html
/downloads/flight-sim-concorde-d1101.html
/forums/what-is-best-addon-t3360.html
Would we get any SEO benefit from using directories? Could our current system, with so many files in the root /, be flagged as spammy? Would it be even better to use the following system, which removes file endings completely and suggests each page is a directory:
/images/aosta-valley/6816/
/downloads/flight-sim-concorde/1101/
/forums/what-is-best-addon/3360/
If so, which would be better: /images/aosta-valley/6816/ or /images/6816/aosta-valley/? Just looking for some clarity on our problem. Thank you for your help, guys!