Data highlighter in WMT displays old version of page
-
I want to mark up a business address for Google Local, so I thought I would use the Data Highlighter in WMT. However, I only just added the address to the bottom of the home page, and the Data Highlighter is giving me the old version of the page to mark up, without the address on it.
Rather frustrating. Does anybody have any experience with the time frame until Google updates the page in the Data Highlighter?
According to this thread, it's not even related to the page being re-cached: Data Highlighter: Start link is pulling an old version of page
-
OK, I just checked it and it has now updated to the correct preview.
So it took up to approximately 19 hours to change, although bear in mind I wasn't checking it constantly, so it could be faster. Also note: if you have started highlighting the old version, when you go back into the saved page set it will still show the old preview, so you need to start over again.
I can also confirm that the preview in the Data Highlighter is not connected to Google's cache of the page in the index, as the old version of the page is still cached.
-
Thanks for the tips Thomas. I had considered doing it 'manually' but wanted to experiment with the data highlighter tool.
I'll keep an eye on it and report back on the time it took Google to update to the correct preview.
-
The only thing I can think of is that Google is showing you what it last indexed, and that does sound strange. However, there is another way to mark up your address with schema properly and quickly so you'll get the local search results you want.
Use this tool: http://www.feedthebot.com/tools/address/
It's 100% free and comes with a lot of great extra tools connected to it.
If you prefer to use microdata, which is almost the same thing as schema, you can use this tool instead:
http://www.microdatagenerator.com/
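For reference, the kind of markup these generators produce looks something like this. This is a minimal sketch of schema.org LocalBusiness markup using microdata; the business name, address, and phone number below are made-up placeholders you would replace with your own details:

```html
<!-- Minimal schema.org LocalBusiness markup in microdata.
     All business details below are placeholders. -->
<div itemscope itemtype="http://schema.org/LocalBusiness">
  <span itemprop="name">Example Plumbing Co.</span>
  <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="streetAddress">123 High Street</span>,
    <span itemprop="addressLocality">London</span>,
    <span itemprop="postalCode">SW1A 1AA</span>
  </div>
  <span itemprop="telephone">020 7946 0000</span>
</div>
```

Once it's pasted into your page (the footer is a common spot), you can run the URL through Google's structured data testing tool to confirm the properties are being picked up.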
However, I have been told not to mix the two, as it can cause some issues with search engines. So use either schema or microdata, not both, is what I have recently been told. I have been trying to get solid confirmation on this; I think it's plausible and it would make sense, but I don't want to tell you something that isn't true.
My $.02: use the first tool and it will do the job just fine.
It comes in the form of a WordPress plugin, but it also gives you the ability to generate the schema correctly right on the site and paste it into your code.
The nice thing about it is that there's a little box to the right that gives you an exact preview of what it's going to look like on your site.
If you don't want it to look like you formatted anything, I would use the first tool. However, both of them are excellent.
One last thing: if you're using WordPress, consider Yoast Local SEO. It seems expensive, but it does a fantastic job.
More great sources of information:
http://www.searchenginejournal.com/how-to-use-schema-markup-for-local-seo/
http://searchengineland.com/13-semantic-markup-tips-for-2013-a-local-seo-checklist-143708
Sincerely,
Thomas