How come the meta description is different based on the query?
-
We have a client site where we added a number to the meta description. Once we did that, we did a Fetch as Google to hopefully get the page recrawled sooner. A few days later we cleared the W3 Total Cache on WordPress and the browser cache, then did a search on a common term for the site/page, "WidgetA" for example. The URL is OurClient.com/widgetA/, and in the organic listing on the SERP we see our new meta description with the number.
We then do a search on a similar term, "WidgetingA" for example, and the same URL shows:
OurClient.com/widgetA/ BUT the meta description on the SERP is different! It is the old one. When we inspect the element using MozBar, it shows the new meta description as it should, same as when we look at it under the original search term.
So: search for WidgetA and we get the new meta description in the SERPs; search for WidgetingA (which returns the same URL as WidgetA) and we get the old one.
Thoughts???
-
Actually, both are reasonable in my opinion, EGOL. I was stuck because my local person had put the numbers in, and she kept saying they were not showing while I kept saying they were.
Here is the only other variable I failed to mention: there is a PPC ad on the query that shows the old meta description. Thanks,
-
I am willing to bet that old data is being served for WidgetingA. I believe that updates to the SERPs do not occur for all keywords at the same time.
It could also be a server on the tail end of a database update: Google has thousands of servers responding to queries, and updates to the databases do not propagate instantaneously.
Those are at least two explanations. There are probably more.
Related Questions
-
Having a problem with multiple ccTLD sites: SERPs showing different sites in different regions
Hi everyone, We have more than 20 websites for different regions, and all the sites have their own ccTLD. The thing is, we are having a conflict in the SERPs for our English sites: almost all the English sites have the same content (I would say 70% of the content is duplicated). Despite having proper hreflang, I see .co.uk results in Google US, and not only .co.uk but other sites are showing up as well (xyz.in, xyz.ie, xyz.com.au).
The tags I'm using are below. If the site is for the US, I'm using these canonical and hreflang tags:
<link rel="canonical" href="https://www.xyz.us/" />
<link rel="alternate" hreflang="en-us" href="https://www.xyz.us/" />
and for the UK site:
<link rel="canonical" href="https://www.xyz.co.uk/" />
<link rel="alternate" hreflang="en-gb" href="https://www.xyz.co.uk/" />
I know that with ccTLDs we don't have to use hreflang, but since we have duplicate content we added it just to be safe, and from what I have heard/read there is no harm in having hreflang (if implemented properly, of course).
Am I doing something wrong here? Or is it conflicting due to the canonicals on the same content for different regions, so we are confusing Google (and Google is showing the most authoritative and relevant result)?
Really need help with this. Thanks,
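For what it's worth, a valid hreflang cluster has to be reciprocal and self-referential: every regional homepage lists an alternate for each sibling site and for itself, and the canonical on each page points at that page's own URL. Here is a minimal sketch of generating such a tag set for the homepages; the domain-to-language mapping is illustrative, not the poster's actual configuration:

```python
# Hypothetical mapping of each regional homepage to its hreflang code.
SITES = {
    "en-us": "https://www.xyz.us/",
    "en-gb": "https://www.xyz.co.uk/",
    "en-in": "https://www.xyz.in/",
}


def hreflang_tags(self_url: str) -> list[str]:
    # The canonical points at the page's own URL, and the alternates
    # list every regional variant, including the page itself; hreflang
    # is only honored when the cluster is reciprocal and
    # self-referential.
    tags = [f'<link rel="canonical" href="{self_url}" />']
    for code, url in SITES.items():
        tags.append(f'<link rel="alternate" hreflang="{code}" href="{url}" />')
    return tags
```

If any site in the cluster omits its return links, Google is free to ignore the annotations, which would produce exactly the cross-region bleed described above.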
Intermediate & Advanced SEO | | shahryar890 -
Different content on pages with the same URL, except one is at www and the other at www2
Hi! I have two pages, each with unique content. However, they have virtually the same URL, except one is at www and the other at www2. As far as I know, both pages were meant to gain organic traction. How should this situation be handled for SEO purposes? Thanks for any help! ---Ivey
Intermediate & Advanced SEO | | Nichiha0 -
Is a meta refresh bad for SEO?
Hi there, Some external developers have created a wishlist for a website that allows visitors to add products to a wishlist and then send an enquiry. It's a very similar set-up to a shopping basket, really (without the payment option). However, this wishlist lives in a separate iframe that refreshes every 30 seconds to reflect any items visitors add to their wishlist. This refreshing is done with a meta refresh. I'm aware of the obvious usability issue that a visitor's product only appears in their wishlist after 30 seconds. However, are there also any SEO issues due to the iframe refreshing every 30 seconds? Please let me know, whether the issues are small or large.
Intermediate & Advanced SEO | | Robbern0 -
Google crawling different content: ever OK?
Here are a couple of scenarios I'm encountering where Google will crawl different content than my users see on an initial visit to the site, and which I think should be OK. Of course, it is normally NOT OK; I'm here to find out if Google is flexible enough to allow these situations:
1. My mobile-friendly site has users select a city, and then it displays the location-options div, which includes an explanation of why they may want to have the program use their GPS location. The user must choose GPS, the entire city, a zip code, or a suburb of the city, which then goes to the link chosen. On the other hand, the site is programmed so that a Googlebot doesn't get a meaningless "choose further" page; instead, the crawler sees the page of results for the entire city (as you would expect from the URL). So basically the program defaults to the entire-city results for Googlebot, but first gives the user the ability to choose GPS.
2. A user comes to mysite.com/gps-loc/city/results. The site, seeing the literal words "gps-loc" in the URL, fetches the GPS coordinates for his location and returns results dependent on it. If Googlebot comes to that URL, there is no way the program will return the same results, because the program couldn't get the same latitude and longitude as that user.
So, what do you think? Are these scenarios a concern for getting penalized by Google? Thanks, Ted
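Scenario 1 boils down to a branch on the requesting user agent. A minimal, framework-free sketch of that branch follows; the function name and strings are hypothetical, and note that the User-Agent header can be spoofed, so real bot verification should use a reverse-DNS lookup rather than a string match alone:

```python
def city_page(user_agent: str, city: str) -> str:
    """Render the city landing page described in scenario 1.

    A crawler is served the full-city results directly (what the URL
    implies), while a human visitor first gets the chooser offering
    GPS, the whole city, a zip code, or a suburb.
    """
    if "googlebot" in user_agent.lower():
        return f"Results for all of {city}"
    return f"Choose: use GPS, all of {city}, a zip code, or a suburb"
```

The key property a reviewer would look at is whether both branches lead to substantially the same content for the URL; the crawler branch here skips an interstitial rather than serving different subject matter.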
Intermediate & Advanced SEO | | friendoffood0 -
Approximate linking root domains we need, based on these metrics
Our top 4 competitors for a single term we're targeting have the following metrics:
1. PA 45, DA 89, 6 linking root domains to the page, 40,000 linking root domains to the domain
2. PA 53, DA 100, 3 linking root domains to the page, 1.6 million to the domain
3. PA 32, DA 37, 4 linking root domains to the page, 200 to the domain
4. PA 55, DA 66, 6 linking root domains to the page, 3,300 to the domain
All other optimization is about the same, except that in (2) they have half of the keyword phrase in the domain and the whole keyword phrase in the URL. Also, everybody else has the title and meta description in the plural form, and the singular is what I typed in. We have the whole keyword phrase in the domain. The above 4 results are internal pages; ours is a home-page ranking. Our metrics:
PA 33, DA 22, 30 linking root domains to the page, 43 linking root domains to the site
How tough will it be for us to compete? How many strong linking root domains will it take?
Intermediate & Advanced SEO | | BobGW0 -
Best practice for redirects based on visitors' detected language
One of our websites has two languages, English and Italian. The English pages are available at the root level:
www.site.com/ (English homepage)
www.site.com/page1
www.site.com/page2
The Italian pages are available under the /it/ level:
www.site.com/it (Italian homepage)
www.site.com/it/pagina1
www.site.com/it/pagina2
When an Italian visitor first visits www.site.com, we'd like to redirect them to www.site.com/it, but we don't know if that would impact search engine spiders (e.g. Googlebot) in any way. Would it be better to do a JavaScript redirect, or an HTTP 3xx redirect? If so, which of the 3xx redirects should we use? Thank you
Intermediate & Advanced SEO | | Damiano0 -
Link building maximum to different subdomains?
Hi All, I'm launching a new website with a number of country-specific subdomains, and I wanted to know if Google will count new links against the root domain or treat each subdomain separately. For instance, if I built 50 links per month to each of my five proposed subdomains, would Google see it as 250 links built to one root domain (and penalise me as a result), or would they view the subdomains independently and accept those 50 links per month as an acceptable amount per subdomain? Thanks in advance. Ross
Intermediate & Advanced SEO | | Mulith0 -
Rel canonical element for different URLs
Hello, We have a new client that has several sites with the exact same content. They do this for tracking purposes. We are facing political objections to combining the sites and tracking differently; basically, we have no choice but to deal with the situation as given. We want to avoid duplicate-content issues and want to SEO only one of the sites. The other sites don't really matter for SEO (they have offline campaigns pointing to them); we just want one of the sites to get all the credit for the content. My questions: 1. Can we use the rel canonical element on the irrelevant pages/URLs to point to the site we care about? I think I remember Matt Cutts saying this can't be done across domains. Am I right or wrong? 2. If we can't, what options do I have (without making the client change their entire tracking strategy) to make the site we are SEOing the canonical home of the content? Thanks a million! Todd
Intermediate & Advanced SEO | | GravitateOnline0