Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not removing the content entirely - many posts will remain viewable - we have locked both new posts and new replies.
Does Google ACTUALLY ding you for having long Meta Titles? Or do studies just suggest a lower CTR?
-
I do SEO at an agency and have many clients. I always get the question, "Will that hurt my SEO?" When it comes to meta title and even meta description length, I understand Google will truncate them, which may result in a lower CTR, but does it actually hurt your ranking? In many cases Google will find keywords within a long meta description and display those, while in other cases it will simply truncate it. Is Google doing whatever it wants willy-nilly, or is there data behind this?
Thank you!
-
I think meta descriptions are important.
They are your first chance to display a call to action to a customer and get them to click through to your site. A poorly written or truncated description is probably not as enticing as one that fits within roughly 160 characters and displays in full.
We have acted for several clients where optimizing the meta description improved CTR by 0.08 percentage points (less than a tenth of a percent), which amounted to over 20,000 additional clicks to their site a year.
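For context, the arithmetic above implies a fairly large site. A quick sanity check (the impression figure is derived from the numbers quoted, not the client's actual data):

```python
# Sanity check on the numbers above: how many impressions does a
# 0.08 percentage-point CTR lift need to yield 20,000 extra clicks?
extra_clicks = 20_000
ctr_lift = 0.0008  # 0.08 percentage points, expressed as a fraction

impressions_per_year = extra_clicks / ctr_lift
print(f"{impressions_per_year:,.0f} impressions/year")  # 25,000,000 impressions/year
```

So even a tiny CTR improvement compounds meaningfully at scale.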
I also loved Rand's Whiteboard Friday, which indirectly addresses the issue and aligns with my view (though the evidence is probably not as strong) that dwell time is a significant ranking factor.
https://moz.com/blog/impact-of-queries-and-clicks-on-googles-rankings-whiteboard-friday
On your questions directly:
Will it hurt your SEO? Yes, for two possible reasons:
1/ You keyword-stuff it.
2/ No one clicks through because you have a bad meta description.
On truncation: there are exceptions, but Google generally does not truncate if you fit within their pixel/character limit.
My view: draft and implement your meta descriptions properly.
Hope that assists.
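The "fit within the limit" advice above is easy to automate. A minimal sketch using only Python's standard library; note that the 60/160-character thresholds are common rules of thumb, not official Google limits (actual truncation is pixel-based and varies by device):

```python
from html.parser import HTMLParser

TITLE_LIMIT = 60   # rule-of-thumb character budget for <title>
DESC_LIMIT = 160   # rule-of-thumb budget for the meta description


class MetaAudit(HTMLParser):
    """Collect the <title> text and meta description from an HTML page."""

    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "meta":
            a = dict(attrs)
            if a.get("name", "").lower() == "description":
                self.description = a.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data


def audit(html: str) -> list:
    """Return warnings for tags likely to be truncated in the SERP."""
    parser = MetaAudit()
    parser.feed(html)
    warnings = []
    if len(parser.title) > TITLE_LIMIT:
        warnings.append(f"title is {len(parser.title)} chars (> {TITLE_LIMIT})")
    if len(parser.description) > DESC_LIMIT:
        warnings.append(f"description is {len(parser.description)} chars (> {DESC_LIMIT})")
    return warnings


page = ('<html><head><title>Handmade Rings</title>'
        '<meta name="description" content="' + "x" * 200 + '">'
        '</head></html>')
print(audit(page))  # flags the 200-char description; the title is fine
```

Running something like this across a site inventory is a cheap way to find the descriptions most at risk of truncation before worrying about anything fancier.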
-
Great question, and I certainly heard the "will this hurt my SEO" question all the time as a consultant. A couple of thoughts...
- To my knowledge, there is no specific algorithmic feature that would lower a page's rank because its title or description is too long.
- Long meta descriptions, however, may be truncated (as you pointed out) or ignored and replaced altogether by Google if it finds a more appropriate snippet of text on the page.
- A succinct, well-written meta description may help with CTR, which itself may be a ranking factor.
- Google has stated that they want you to write good meta descriptions, for what it's worth.
What I try to say to clients is, "Are you prepared to build a top-10 website in your industry?" If they are sweating over good meta descriptions, they aren't ready to compete in the big leagues.