Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Best Practices for Homepage Title Tag
-
Hi,
I would like to know whether there has been any update to the best practices for the homepage title tag.
A couple of years ago, placing your main keywords in the homepage title tag still worked well. But since the last Google SERP update, the number of characters shown has been reduced, and now we try to work within 55-56 characters. That has reduced our capacity to include many keywords in the title tag.
Besides, search engines are now smarter about choosing the correct inner page to show in the SERPs.
But I am wondering whether the homepage title should have a branded orientation or should include the main keywords, because that strategy still seems to work.
I would appreciate any update on this issue.
Thank you!
-
Thanks again!
-
Correct - I can give you a trick, though.
If the SERP is a high-value page, thousands if not millions of dollars have been spent on AdWords A/B testing the ads that work on that page. When you frame your meta description and title, take into account (if you can) the top ads that companies keep re-running. They would not keep running them if they were not highly successful on that page.
Go get them...
-
Thank you!
-
Thank you Tom!
For sure, a CTR-optimized title works better. I still don't know whether having fewer keywords in the title tag is worth it...
I still don't know which would be better:
an attractive title with fewer keywords,
or
a less attractive title with more keywords. Spanish makes it a little more difficult, because words are generally longer and you cannot say as much...
Maybe the only way is to test, for each case, what works better.
I wish it were easier! Thank you!
-
Thank you John for your detailed answer! Very interesting insights.
It seems there is no easy way and no general answer to this question.
-
Interesting responses - we specialize in title tags and descriptions. There is no uniform practice as such. I partly disagree with Tom above, but he is also right! The method Alick suggests is, I believe, still generally the best way forward.
That said, as Tom pointed out, clickability should also be an integral part of how you form the title tag and description. So there is a trade-off, and it is often difficult to find the balance between SEO and clickability. High-traffic pages deserve a lot of thought and consideration - the impact can be massive.
The positive is that with the new search traffic data available in Webmaster Tools (WMT), you can try a few options over several weeks. In the new WMT you can monitor each page more accurately and see the effect of changes on position, impressions, clicks, and CTR. Our experience is that after changes to the title and description, and the subsequent clicks on the page, Google re-evaluates the page's relevance to a searcher's query. Google re-sets or re-tests you: it shows the page on more or fewer searches, and monitors searcher behavior on the page when people click through, for stickiness.
A good title tag will have strong keyword elements, and this can be measured in WMT as Google places the result on more searched queries. Immediately after indexing, the page position may drop, and likewise CTR; however, clicks go up. Why does this happen? Because Google believes the new result answers more searchers' queries. Google then tests how people respond to the page when they click through - if the response is positive, the position climbs on the new queries; if there is no stickiness (i.e. searchers pogo-stick back to the results), it declines.
If Google believes the new page is answering searchers' queries, then the page ranking will generally increase slowly, and likewise CTR.
Anyway, maybe I got a bit off track, but feel free to ask any questions. P.S. Yes, I know Google states CTR is not a ranking factor; however, they do take stock of what searchers do with a result.
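To make the before/after monitoring concrete, here is a minimal Python sketch comparing two reporting windows of per-page data of the kind WMT exposes - the field names and numbers are hypothetical placeholders, not a real API response:

```python
# A minimal sketch of the before/after comparison described above, using
# rows shaped like a per-page search analytics export. The field names
# and numbers are hypothetical placeholders, not a real API response.

def ctr(row):
    """Click-through rate as a fraction; zero impressions gives 0.0."""
    return row["clicks"] / row["impressions"] if row["impressions"] else 0.0

def compare_windows(before, after):
    """Change in impressions, clicks, CTR and average position
    between two reporting windows (e.g. before/after a title rewrite)."""
    return {
        "impressions_delta": after["impressions"] - before["impressions"],
        "clicks_delta": after["clicks"] - before["clicks"],
        "ctr_delta": ctr(after) - ctr(before),
        "position_delta": after["position"] - before["position"],  # negative = improved
    }

# Hypothetical four-week windows either side of a title/description change:
before = {"impressions": 12000, "clicks": 240, "position": 8.4}
after = {"impressions": 15500, "clicks": 410, "position": 7.1}

print(compare_windows(before, after))
# Impressions and clicks both rose: the page is being shown on more
# queries, and CTR improved too - the pattern described above.
```

If impressions rise but CTR and stickiness fall, that is your cue to revisit the title rather than wait for rankings to slide.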
-
I disagree with the post above.
The most important thing for your title tag is to make it compelling enough to click. It's your biggest shop window - you need to use the space. A "Keyword - Keyword | Brand" isn't going to do that.
You will, of course, want to include your primary keyword in there, but you tell me which of these you'd prefer to click:
"Blue Widgets - Red Widgets | The Widgets Co"
"Cheap Blue Widgets - Free USA Shipping! | The Widgets Co"
Try to get your key selling points into the title tag as well as your keywords. Give the user a reason to click.
In addition, title tags are truncated based on pixel width, not the number of characters. Dr. Pete at Moz put together a great preview tool you can use to check whether your title tags will be shortened.
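To illustrate the pixel-width point, here is a rough Python sketch of a width check - the per-character widths and the ~600px limit are assumptions for illustration, not Google's actual font metrics, so use a real preview tool for anything that matters:

```python
# A very rough sketch of the pixel-width idea above. The per-character
# widths and the ~600px desktop limit are assumptions for illustration;
# Google measures with real font metrics, so use a proper preview tool
# (like Dr. Pete's) for production work.

NARROW = set("iljtf.,;:'|!I ")   # roughly 5px glyphs (approximation)
WIDE = set("mwMW@")              # roughly 15px glyphs (approximation)

def estimated_width_px(title, narrow=5, normal=10, wide=15):
    """Crude pixel-width estimate for a title string."""
    total = 0
    for ch in title:
        if ch in NARROW:
            total += narrow
        elif ch in WIDE:
            total += wide
        else:
            total += normal
    return total

def likely_truncated(title, limit_px=600):
    """True if the title probably exceeds the SERP display width."""
    return estimated_width_px(title) > limit_px

print(likely_truncated("Cheap Blue Widgets - Free USA Shipping! | The Widgets Co"))
# → False (about 475px by this estimate, so it should fit)
```

Note how a title full of narrow letters and spaces can run well past 55 characters and still display, while one heavy with capitals and wide letters gets cut sooner.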
Hope this helps.
-
Hi,
The optimal format for any page's title tag is **Primary Keyword - Secondary Keyword | Brand Name**.
You can use the same format for the homepage. If the brand is well known enough to make a difference in click-through rates in search results, the brand name should come first. If the brand is less well known, or less relevant than the keyword, the keyword should come first.
If you keep your titles under 55 characters, you can expect at least 95% of them to display properly.
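As a quick illustration, here is a small Python helper that assembles a title in that pattern and flags anything over the 55-character guideline - the helper and the threshold are illustrative only, not an official tool or limit:

```python
# A small helper sketching the "Primary Keyword - Secondary Keyword | Brand"
# pattern above. The 55-character threshold is this thread's rule of thumb,
# not an official limit (truncation is actually by pixel width), and the
# function itself is illustrative, not a standard tool.

def build_title(primary, brand, secondary=None, brand_first=False, max_len=55):
    keywords = f"{primary} - {secondary}" if secondary else primary
    # Lead with the brand only when it is strong enough to lift CTR by itself.
    title = f"{brand} | {keywords}" if brand_first else f"{keywords} | {brand}"
    if len(title) > max_len:
        # Flag rather than truncate: a human should decide what to cut.
        print(f"Warning: {len(title)} chars, may be shortened in the SERP")
    return title

print(build_title("Blue Widgets", "The Widgets Co", secondary="Red Widgets"))
# → Blue Widgets - Red Widgets | The Widgets Co  (43 chars, displays fully)
```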
Hope this helps.
Thanks