Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not removing the content entirely (many posts will remain viewable), we have locked both new posts and new replies.
Best and easiest Google Depersonalization method
-
Hello,
Moz hasn't written anything about depersonalization for years. This article has methods, but I don't know if they are valid anymore.
What's an easy, effective way to depersonalize Google search these days? I would just log out of Google, but that shows different ranking results than Moz's rank tracker for one of our main keywords, so I don't know if that method is correct.
Thanks
-
Thanks Rand, really appreciate it!
-
Hi Rand,
Thanks for jumping in and helping us all out. Your response is much appreciated.
Regards,
Vijay
-
I'm surprised at how well this still works, but it does:
1. Use an incognito browser window to remove account personalization.
2. Use a query string like this: https://google.co.nz/search?q=your+keyword+terms&gl=us

With 2) above, you're removing the geographic bias of any particular region/IP address by searching on Google New Zealand, then re-geolocating the search to the US with the gl parameter. This will give you non-geo-biased results.
If you want to see how specific results look from a particular region, there are two semi-decent options:
A) Use Google's Ad Preview Tool: https://adwords.google.com/apt/anon/AdPreview?__u=1000000000&__c=1000000000
B) Use the &near parameter, e.g. https://google.co.nz/search?q=your+keyword+terms&gl=us&near=seattle+wa
-
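For anyone assembling these URLs programmatically, here is a minimal sketch. The parameter names (q, gl, near, pws) come straight from the advice in this thread; the function name and defaults are my own illustration, and Google may of course change or ignore these parameters at any time.

```python
# Sketch: building a depersonalized Google search URL from the
# parameters discussed above. q/gl/near/pws are from the thread;
# everything else is illustrative.
from urllib.parse import urlencode

def depersonalized_url(query, country="us", near=None):
    """Build a search URL that strips personalization:
    - searches via google.co.nz to dodge home-region bias,
    - re-geolocates with &gl=,
    - disables personalized web search with &pws=0."""
    params = {"q": query, "gl": country, "pws": "0"}
    if near:
        params["near"] = near  # e.g. "seattle wa" for city-level results
    return "https://google.co.nz/search?" + urlencode(params)

print(depersonalized_url("your keyword terms", near="seattle wa"))
```

Pasting the resulting URL into an incognito window combines both of Rand's steps in one go.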
Yes, this is one of many factors in depersonalization, and there may well be more hidden factors we have yet to discover.
I have done a lot of research on this. I use a dedicated PC with a VPN just for checking keyword SERP ranks for my clients; since they are in many different countries with different target audiences, we try to replicate the results for each scenario.
I hope this helps.
-
So am I correct that logging out and adding &pws=0 is not enough?
-
Hi There,
In addition to the methods already suggested for depersonalizing results, there are a few more factors. You may also like to read a blog post I wrote on my website, Impact of Personalized Search Results.
An incognito window doesn't delete the history from your previous browsing; you still need to clear your browsing history and cookies.
Use a VPN or proxy to get results from different locations and countries. This gives you the best idea of your SERP status in each country.
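As a rough sketch of the proxy approach, the snippet below wires a proxy into a urllib opener so requests appear to originate from another location. The proxy address is a hypothetical placeholder, and Google actively blocks automated scraping, so treat this as an illustration of the mechanics rather than a working rank checker.

```python
# Sketch: routing requests through a proxy in the target country,
# per the VPN/proxy advice above. The proxy endpoint is a placeholder.
import urllib.request

PROXY = "proxy.example.com:8080"  # hypothetical proxy in the target country

def make_geo_opener(proxy=PROXY):
    """Return an opener that routes HTTP and HTTPS through `proxy`.
    A fresh opener also carries no cookies, which helps avoid
    personalization from earlier browsing."""
    handler = urllib.request.ProxyHandler({"http": proxy, "https": proxy})
    return urllib.request.build_opener(handler)

opener = make_geo_opener()
# opener.open("https://google.co.nz/search?q=your+keyword+terms&gl=us")
# ^ would fetch via the proxy; left commented as Google blocks bots.
```

A commercial rank tracker or the VPN-equipped PC described above remains the more reliable route; this only shows what the proxy layer is doing under the hood.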
I hope this helps, please feel free to ask more questions by responding.
Regards,
Vijay