How to report rankings after the Google Venice update?
-
As a professional agency we focus on traffic and conversions, but rankings are still a good KPI to please customers. Unfortunately, rankings have not been reliable since the Google Venice update.
My question is:
"How can you still report on rankings without the risk that your customer sees totally different results?"
Software we use
At the moment we use Rank Tracker from Link Assistant.
-
Hi David,
Thanks for your fast reply to my question!
We noticed that Venice is overruling depersonalised search. Even if our customers use &pws=0 or incognito mode, they still get local results. This is not only annoying for our customers, but also for us as SEO professionals.
How can we tackle this problem?
-
Hi Van,
Unfortunately it's all but impossible to obtain a ranking position that will show universally for you, your clients and their customers.
The best option you have is to spend time getting your customers to truly understand personalised and localised search, as well as the various methods of turning personalisation off. Once they understand this, they will be on board with the fact that the results they see will differ nationally and globally.
I work with a large number of clients on this basis, and they are happy with the logic. They know they can turn personalisation off (to an extent) using '&pws=0' or incognito mode, and to be honest these methods usually match pretty closely with the rank tracking I'm providing, probably because the rank checkers obtain their results using the same de-personalised methods.
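As a rough illustration of what those rank checkers do under the hood, here is a minimal Python sketch that builds a de-personalised Google query URL. The pws=0 parameter is the one mentioned above; gl and num are commonly used query hints I'm assuming here for illustration, not anything specific to Rank Tracker:

```python
from urllib.parse import urlencode

def depersonalised_search_url(query, country="uk", results=100):
    """Build a Google search URL with personalisation switched off.

    pws=0 - disables personalised results (the parameter discussed above)
    gl    - hints which country edition to query (assumed for illustration)
    num   - ask for more results per page so one fetch covers the top 100
    """
    params = {"q": query, "pws": 0, "gl": country, "num": results}
    return "https://www.google.com/search?" + urlencode(params)

print(depersonalised_search_url("conference venues park lane"))
```

Note this only constructs the URL; actually fetching and parsing the results page is what the rank-tracking tools handle for you.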
Person A is always going to see different results from Person B, but if your clients understand that the ranking reports show the positions the majority will see, they should be happy.
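If you do sample rankings from several locations or profiles, one way to arrive at a single "position the majority will see" is to report the median of those checks. This is a convention of my own for the sake of the sketch, not a feature of any particular tool:

```python
from statistics import median

def representative_rank(samples):
    """Collapse rank checks from several locations/profiles into one
    reportable figure. Unranked checks (None) are counted as position 101,
    i.e. just outside the top 100 - an assumption made for this sketch.
    """
    cleaned = [s if s is not None else 101 for s in samples]
    return median(cleaned)

# e.g. checks run from three locations with personalisation off
print(representative_rank([3, 5, 4]))
```

The median is less sensitive than the mean to one heavily localised outlier, which suits the Venice situation where a single location can skew results badly.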
As you mentioned, the main metrics should always be visitors and sales, but tying these to rankings is a great way of showing the ROI of their natural search efforts.
Best of luck.
David