Fetch as Google - Google removes words from the start of my meta title? Help!
-
Hi all,
I'm experiencing some strange behaviour with Google Webmaster Tools.
I noticed that some of our pages from our ecommerce site were missing their starting keywords. I created a template for meta titles that uses Manufacturer - Ref Number - Product Name - Online Shop, all trimmed under 65 characters just in case. To give you an idea, an example meta title looks like:
Weber 522053 - Electric Barbecue Q 140 Grey - Online Shop
The strange behaviour: if I do a "Fetch as Google" in GWT, there's no problem - I can see it pulls the variables and it's OK. So I click "Submit to index".
Then I do a Google site:URL search to see what has been indexed. The meta description has changed (so I know the new template is live), but the meta title has been cut, so it looks like this:
Electric Barbecue Q 140 Grey - Online Shop
So I am confused - why would Google cut words off the start of the meta title, even when the Fetch as Googlebot result looks perfectly OK?
I should point out that this method works perfectly on our other pages - many hundreds of them - but for some weird reason it's not working on a few.
Any ideas?
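For reference, the template described above can be sketched in Python. This is only an illustration: the function name, and the fallback behaviour when the title runs long, are my own assumptions, not the poster's actual implementation.

```python
def build_meta_title(manufacturer, ref_number, product_name,
                     suffix="Online Shop", max_len=65):
    """Assemble a 'Manufacturer Ref - Product - Online Shop' meta title
    like the template described above, kept under max_len characters."""
    title = f"{manufacturer} {ref_number} - {product_name} - {suffix}"
    if len(title) <= max_len:
        return title
    # Hypothetical fallback: drop the front segment rather than cut mid-word.
    fallback = f"{product_name} - {suffix}"
    return fallback if len(fallback) <= max_len else product_name[:max_len]

print(build_meta_title("Weber", "522053", "Electric Barbecue Q 140 Grey"))
# → Weber 522053 - Electric Barbecue Q 140 Grey - Online Shop
```

Note that staying under 65 characters only avoids truncation for length reasons; Google can still rewrite a short title, which is exactly the behaviour being described here.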
-
If you use the words Weber 522053 in the query, Google is likely to return the title with Weber 522053 in it, even if it doesn't return it that way for a different query, because Google sees that Weber 522053 is important to the querier.
But this does show that Google knows what the intended title is and is shortening it for its own unfathomable Google-ish reasons. (I did notice that Weber 522053 is not visible on the page at all, so possibly that is what makes Google think it is not important information to display in the SERPs.)
-
Looks OK to me. I did a search (with the quotes) for "Weber 522053 - Electric Barbecue Q 140 Grey - Online Shop".
The title was the full title as you described. I then tried doing a site: search on the domain and on the specific URL, and again saw the full title.
Google can do some weird stuff to the title if it thinks the title does not accurately reflect the content of the page or its relevance to the search query.
This one is looking OK to me, though.
Related Questions
-
Need Advice - Google Still Not Ranking
Hi Team - I really need some expert-level advice on an issue I'm seeing with our site in Google. Here's the current status.
We launched our website and app in the last week of November 2014 (soft launch): http://goo.gl/Wnrqrq. When we launched we were not showing up for any targeted keywords, long-tailed included, not even the title of our site in quotes. We ranked for our name only, and even that wasn't #1. Over time we were able to build up some rankings, although they were very low (120 - 140). Yesterday, we went back to not ranking for any keywords.
Here's the history: while developing our app, and before I took over the site, the developer used a thin affiliate site to gather data and run a beta app over the course of 1 - 2 years. Upon taking on the site and moving to launch the new website/app, I discovered what had been run under the domain. Since then the old site has been completely removed and rebuilt, with all associated URLs (.uk, .net, etc.) and subdomains shut down. I've allowed all the old spammy pages (thousands of them) to 404. We've disavowed the old domains (.net, .uk, which were sending a ton of links to this one), along with some links pointing to our domain that seemed a little spammy. There are no manual actions or messages in Google Webmaster Tools.
The new website uses SSL (https) for the entire site, scores 98/100 for mobile usability (we beat our competitors on Google's PageSpeed tool), has been moved to a business-level hosting service, has 301s correctly set up, has terms and conditions added, has all our social profiles linked, has WMT/Analytics/YouTube linked, we've started some AdWords, and we use rel="canonical" - all the SEO 101 stuff and more. When I run the page through the Moz tool for a specific keyword we score an A. When I did a crawl test everything came back looking good. We also pass using other tools. Google WMT shows no HTML issues. We rank well on Bing, Yahoo and DuckDuckGo.
However, for some reason Google will not rank the site, and since there is no manual action I have no course of action to submit a reconsideration request. From an advanced stance, should we bail on this domain and move to the .co domain (which we own, but which hasn't been used before)? If we 301 this domain over, since all our marketing points to the .com, will this issue follow us? I see a lot of conflicting information on whether algorithmic issues follow domains. Some say they do, some say they don't, some say they do because a lot of the time people don't fix the underlying issue. However, this is a brand-new site, and we're following all of Google's rules.
I suspect there is an algorithmic penalty (action) against the domain because of the old thin affiliate site that was used for the beta and data-gathering app. Are we stuck till Google does an update? What's the deal with moving us up, then removing us again? Thoughts, suggestions? I purposely used a short URL to leave out the company name - please respect that, since I don't want our issues to pop up in a web search. 🙂
-
Google Trends Graph and KW Planner Monthly Searches?
I'm trying to show people the trends of certain keywords/topics over a period of years. Keyword Planner gives some actual numbers, but only for 12 months. Trends shows: "Numbers represent search interest relative to the highest point on the chart. If at most 10% of searches for the given region and time frame were for "pizza," we'd consider this 100. This doesn't convey absolute search volume." I don't really understand this, other than that if the graph goes up it means more interest - but it has to do with the number of people searching, location, etc., which can get tricky. I'd like to put together a short report explaining certain topics and how interest in them has increased over the last 5+ years. I'm hoping someone here has had some experience with this and has some advice or links with more information.
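To illustrate what "relative to the highest point" means, here is a small sketch (the volumes are made up for the example): Trends rescales every data point against the peak, so only the shape of the curve survives, not the absolute volume.

```python
def trends_scale(monthly_volumes):
    """Rescale absolute volumes the way Trends describes:
    the peak month becomes 100, every other month is relative to it."""
    peak = max(monthly_volumes)
    return [round(100 * v / peak) for v in monthly_volumes]

# Hypothetical absolute monthly search volumes for one keyword:
print(trends_scale([1200, 3000, 6000, 4500]))  # → [20, 50, 100, 75]
```

This is why two keywords with very different absolute volumes can produce identical-looking Trends curves. Pairing the Trends shape with the 12 months of real Keyword Planner numbers is one way to anchor the graph to actual volumes in a report.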
-
How can I check Google's page cache?
Hi, I used to have a tool in Firefox (Google Toolbar) that was very handy for checking page ranks and the date a page had been cached. With the newer versions of Firefox I cannot seem to locate this useful tool. Can anybody recommend any useful tools for checking the above? Thanks, Adam
-
Does Google or Bing use words in the page title beyond the displayed limit for ranking purposes?
Standard good practice for on-page SEO includes keeping page title length below the maximum that Google displays in the SERPs. But words in the title beyond that maximum can be indexed, even if they don't show in the SERPs for end users. For ranking purposes, is there any value in words beyond the character limit in page titles that are truncated in the SERPs?
-
Do you think Google is destroying search?
I've seen garbage in Google results for some time now, but it seems to be getting worse. I was just searching for a line of text from one of our stories from 2009. I just wanted to check that story and I didn't have a direct link. So I did the search and found one copy of the story, but it wasn't on our site. I knew it was on the other site as well as ours, because the writer writes for both publications.
What I expected to see was the two results, one above the other, depending on which one had more links or better on-page relevance for the query. What I got didn't really surprise me, but I was annoyed. In the #1 position was the other site. That was OK by me, but ours wasn't there at all. I'm almost used to that now (not happy about it and trying to change it, but not doing well at all, even after 18 months of trying).
What really made me angry was the garbage results that followed. One site, a WordPress blog, has tag pages and category pages being indexed. I didn't count them all, but my guess is about 200 results from this blog, one after the other, most of them tag pages, with the same content on every one of them. Then the tag pages stopped and it started with dated archive pages, dozens of them. There were other sites, some with just one entry, some with dozens of tag pages. After that, porn sites, hundreds of them. I got right to the very end - 100 pages of 10 results per page.
That blog seems to have done everything wrong, yet it has interesting stats. It is a PR6, yet Alexa ranks it 25,680,321. It has the same text in every headline. Most of the headlines are very short. It has all of the category, tag, and archive pages indexed. There is a link to the designer's website on every page. There is a blogroll on every page, with links out to 50 sites. None of the pages appear to have a description. There are dozens of empty H2 tags, and the H1 tag is 80% of the way through the document. Yet Google lists all of this stuff in the results.
I don't remember the last time I saw 100 pages of results; it hasn't happened in a very long time. Is this something new that Google is doing? What about the multiple tag and category pages in results - is this just a special thing Google is doing to upset me, or are you seeing it too? I did eventually find my page, but not in that list. I found it by using site:mysite.com in the search box.
-
Help, I am in Local Search Results!
I do not know what to do with this... and could use a bit of advice on the issue. "Doing things right" resulted in great organic rankings and, as a bonus, showing at the top of local search results for our area. Sounds great... until Google decides it is time to mix things up a little. I do not know if this applies to all types of businesses, but for ours it means you will no longer get any organic page 1 listing if you are a local business that (un)luckily ranks in local results too. One day Google will include local results for a keyword, the next they won't - making our SEOmoz campaign's weekly rankings a true yo-yo of "50 keywords declined by >48 and >49 places" and "30 keywords improved by <47 and <49". It has made this campaign feature completely useless for me (ever since SEOmoz decided to include the local-result light bulb, that is). Traffic dropped from 240 a day for one keyword to 30 now for that same keyword. Frustrated? You bet. I do not understand why Google seems to be creating a war with local businesses. Should we get out of local results, or does anyone have any ideas or suggestions? Thanks a bunch, guys!
-
Domain Authority and Google keywords
Hi there, we have a domain authority of 33; one of our competitors has an authority of 10, yet they appear to rank higher for many keyword searches in Google. Is there a reason for this? Our site is 5 months old, and their site is over 3 years old. Thanks for your feedback 🙂
-
Removing secure subdomain from Google index
We've noticed over the last few months that Google is not honoring our main website's robots.txt file. We have added rules to disallow secure pages such as:
Disallow: /login.cgis
Disallow: /logout.cgis
Disallow: /password.cgis
Disallow: /customer/*
We have noticed that Google is crawling these secure pages and then duplicating our complete ecommerce website across our secure subdomain in the Google index (duplicate content): https://secure.domain.com/etc. Our webmaster recently implemented a specific robots.txt file for the secure subdomain that disallows everything:
User-agent: *
Disallow: /
However, these duplicated secure pages remain in the index. My question is: should I request that Google remove these secure URLs through Google Webmaster Tools? If so, is there any potential risk to my main ecommerce website? We have 8,700 pages currently indexed in Google and would not want to risk any ill effects to our website. How would I submit this request in the URL removal tool specifically? Would inputting https://secure.domain.com/ cover all of the URLs? We do not want any secure pages in the index, and all secure pages are served on the secure.domain example. Please private message me for specific details if you'd like to see an example. Thank you
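As a sanity check before filing removal requests, a blanket robots.txt like the one the webmaster described can be tested with Python's standard-library robotparser (the host name below is the placeholder from the question). Keep in mind that Disallow only blocks crawling, not indexing: URLs already in the index need the removal tool, or a noindex that Google can actually fetch.

```python
from urllib.robotparser import RobotFileParser

# The blanket disallow described for the secure subdomain:
rules = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Nothing on the secure host should be crawlable under these rules:
print(parser.can_fetch("*", "https://secure.domain.com/login.cgis"))  # → False
print(parser.can_fetch("*", "https://secure.domain.com/customer/1"))  # → False
```

If both checks come back False, the robots.txt itself is fine, and the remaining duplicates in the index are a removal problem rather than a crawling problem.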