Updating a meta tag error quickly, besides Submit to Index in Webmaster Tools
-
For a conference page that marketing built, the meta tag didn't have the correct year and dates of the conference. I updated it and used Webmaster Tools' Submit to Index to try to get it refreshed in Google search quickly, but the meta tag still has not updated in the results.
Are there other avenues to get this corrected?
-
That is a great question. We have a very active site, and we added some new info to the meta description about 8 weeks ago. One search (not a very high-volume phrase) still shows the old info, while all the others show the new. I believe it is due to the number of servers and data centers Google runs.
-
You may also try building some links to the page, as well as sharing it on social networks, particularly Google+, to get the big G to recrawl the page and update the meta info it is displaying.
Good luck,
Mark
-
Thanks. So when does a meta tag change usually begin to show up in the SERPs on all searches?
-
Ralph,
In WMT (assuming this is not what you already tried), you can use Fetch as Google on the specific URL, and it may help. There is no guarantee, and I have noted before that for certain searches you may still see an older meta description snippet in the SERPs. (Note: this is not the real intent of Fetch as Google; it is meant for checking crawlability issues on the site, and it will not follow redirects.)
You can also resubmit the sitemap.
Here is Fetch as Google in GWMT, in case you don't have it handy.
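One more sanity check before blaming Google's index: confirm your server is actually returning the updated tag to crawlers, since a caching layer or CDN can keep serving the old HTML. A minimal sketch (not a Moz or Google tool; it assumes the tag is written name-first, like <meta name="description" content="...">):

```python
# Sketch: check what meta description a page actually serves.
import re
import urllib.request

def meta_description(html):
    """Return the content of the first <meta name="description"> tag, or None."""
    match = re.search(
        r'<meta[^>]*name=["\']description["\'][^>]*content=["\']([^"\']*)["\']',
        html,
        re.IGNORECASE,
    )
    return match.group(1) if match else None

def live_meta_description(url):
    """Fetch a page and extract its meta description."""
    with urllib.request.urlopen(url) as resp:
        return meta_description(resp.read().decode("utf-8", errors="replace"))
```

If this still prints the old text, the problem is on your side (server or cache), not in Google's index.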
Hope this helps you out,
Robert
Related Questions
-
Getting 'Indexed, not submitted in sitemap' for around a third of my site. But these pages ARE in the sitemap we submitted.
As in the title, we have a site with around 40k pages, but around a third of them are showing as "Indexed, not submitted in sitemap" in Google Search Console. We've double-checked the sitemaps we have submitted and the URLs are definitely in the sitemap. Any idea why this might be happening? Example URL with the error: https://www.teacherstoyourhome.co.uk/german-tutor/Egham. The sitemap it is listed in: https://www.teacherstoyourhome.co.uk/sitemap-subject-locations-surrey.xml
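One quick check is to parse the submitted sitemap yourself and confirm the exact URL string matches what GSC reports: Google compares URLs as exact strings, so http vs https, trailing slashes, and letter case (note the capital "E" in /Egham) all matter. A rough sketch, assuming a standard urlset sitemap:

```python
# Sketch: list the <loc> URLs in a standard sitemap and test exact membership.
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Return the set of <loc> values from a <urlset> sitemap document."""
    root = ET.fromstring(xml_text)
    return {loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")}
```

If the indexed URL differs from the sitemap entry by even a trailing slash, Search Console will count it as "Indexed, not submitted in sitemap".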
-
Pages Indexed Not Changing
I have several sites that I do SEO for that are having a common problem. I have submitted XML sitemaps to Google for each site, and as new pages are added to a site, they are added to its XML sitemap. To make sure new pages are being indexed, I check the number of pages indexed vs. the number of pages submitted in the XML sitemap every week. For weeks now, the number of pages submitted has increased, but the number of pages actually indexed has not changed. I have done searches on Google for the new pages and they always appear in the index, yet the indexed count is still not changing. My initial thought was that as new pages are added to the index, old ones are being dropped. But I can't find evidence of that, or understand why that would be the case. Any ideas on why this is happening? Or am I worrying about something I shouldn't even be concerned with, since new pages are being indexed?
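Totals can stay flat while the underlying membership churns, so comparing URL sets rather than counts tells you which pages are actually missing. A hypothetical sketch (the URL lists here are placeholders for your sitemap entries and your spot-checked indexed pages):

```python
# Sketch: compare submitted vs. indexed URL sets instead of raw counts.
submitted = {"/a", "/b", "/new-page"}   # URLs listed in the XML sitemap
indexed = {"/a", "/b", "/old-page"}     # URLs confirmed indexed via spot checks

never_indexed = submitted - indexed  # new pages Google hasn't picked up yet
dropped = indexed - submitted        # indexed pages no longer in the sitemap
```

A flat indexed count alongside a growing never_indexed set would be evidence for the "new pages in, old pages out" theory.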
-
How do crawl errors from the SEOmoz tool set affect rankings?
Hello - The other day I presented the crawl diagnostic report to a client. We identified duplicate page title errors, missing meta description errors, and duplicate content errors. After reviewing the report, we presented it to the client's web company, which operates a closed-source CMS. Their response was that these errors are not worth fixing and in fact are not hurting the site. We are having issues getting the errors fixed, and I would like your opinion on this matter. My question is: how bad are these errors? Should they be fixed or not? Will fixing them have an impact on our site's rankings? Personally, I think the question is silly. I mean, the errors were found using the SEOmoz toolkit; these errors have to be affecting SEO, right? The attached image is the result of the Crawl Diagnostics run that crawled 1,400 pages. NOTE: Most of the errors are coming from pages like blog/archive/2011-07/page-2, /blog/category/xxxxx-xxxxxx-xxxxxxx/page-2, and testimonials/147/xxxxx--xxxxx (xxxx represents information unique to the client). Thanks for your insight!
-
Supplementary Index
Hi - Is there a way of checking whether pages are in the supplementary index? Thanks
-
Should we block URL param in Webmaster tools after URL migration?
Hi, We have just released a new version of our website that now has human-readable URLs. Our old, ugly URLs are still accessible and cannot be blocked or redirected. These old URLs use a URL parameter with an XPath-like expression language to define the location in our catalog. We have about 2 million pages indexed with this old URL parameter, while we have approximately 70k clean URLs after the migration. The high number of old URLs is due to faceting that was done with this URL parameter. I wonder if we should now completely block this URL parameter in Google Webmaster Tools so that the old URLs are removed from the Google index. Or will this harm our position in Google? Thanks, Chris
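One option worth weighing before blocking the parameter: a robots or parameter block stops Google from recrawling those 2 million URLs, which can leave them stranded in the index rather than removed. If the old parameterised URLs can serve a rel=canonical pointing at the matching clean URL, Google can consolidate them instead. A sketch with hypothetical URLs and a hypothetical parameter name:

```html
<!-- In the <head> of an old parameterised URL,
     e.g. /catalog?path=/books/fiction (parameter name is hypothetical) -->
<link rel="canonical" href="https://www.example.com/books/fiction" />
```

This way the old URLs drop out of the index gradually as Google recrawls them, and their signals pass to the new pages.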
-
Remove Deleted (but indexed) Pages Through Webmaster Tools?
I run a blog/directory site. Recently, I changed directory software and, as a result, Google is showing 404 Not Found crawl errors for about 750 non-existent pages. Some have suggested that I implement 301 redirects, but I can't see the wisdom in this, as the pages are obscure, unlikely to appear in search, and have been deleted. Is the best course simply to enter each 404 error page manually into the Remove Page option in Webmaster Tools? Will entering deleted pages into the Removal area hurt other healthy pages on my site?
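Before bulk-submitting removal requests, it can help to confirm the deleted URLs really return 404 (or 410): Google drops such URLs on its own after recrawling them, so no removal request is needed. A minimal sketch for spot-checking a handful (error handling simplified; the URLs you feed it are your own):

```python
# Sketch: report the HTTP status of removed URLs. A 404 or 410 tells
# Google the page is intentionally gone and will be dropped naturally.
import urllib.error
import urllib.request

def http_status(url):
    """Return the HTTP status code for a GET of `url`."""
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.getcode()
    except urllib.error.HTTPError as exc:
        return exc.code

def safe_to_ignore(status):
    """True if this status signals 'gone' and needs no removal request."""
    return status in (404, 410)
```

Watch out for URLs that return 200 with a "not found" page body (soft 404s); those are the ones that actually need fixing.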
-
REL Canonical Error
In my crawl diagnostics it is showing a rel=canonical error on almost every page. I'm using WordPress. Is there a default WordPress problem that would cause this?
-
Canonical Tag
Does it do anything to place the Canonical tag on the unique page itself? I thought this was only to be used on the offending pages that are the copies. Thanks
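For what it's worth, a self-referencing canonical on the original page itself is harmless and generally considered good practice, since it defends against duplicate copies you don't control (tracking parameters, session IDs, scrapers). A sketch with a hypothetical URL:

```html
<!-- In the <head> of the preferred (original) page itself -->
<link rel="canonical" href="https://www.example.com/pool-maintenance-guide" />
</code>
```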