Best and easiest Google Depersonalization method
-
Hello,
Moz hasn't written anything about depersonalization for years. This article has methods, but I don't know if they are valid anymore.
What's an easy, effective way to depersonalize Google search these days? I would just log out of Google, but that shows different ranking results than Moz's rank tracker for one of our main keywords, so I don't know if that method is correct.
Thanks
-
Thanks Rand, really appreciate it!
-
Hi Rand,
Thanks for jumping in and helping us all out. Your response is much appreciated.
Regards,
Vijay
-
I'm surprised at how well this still works, but it does:
1. Use an incognito browser window to remove account personalization.
2. Use a query string like this: https://google.co.nz/search?q=your+keyword+terms&gl=us
With 2) above, you're removing the geographic bias of any particular region/IP address by searching on Google New Zealand and then re-geo-locating the search to the US. This gives you non-geo-biased results.
If you want to see how results look from a particular region, there are two semi-decent options:
A) Use Google's Ad Preview Tool: https://adwords.google.com/apt/anon/AdPreview?__u=1000000000&__c=1000000000
B) Use the &near parameter, e.g. https://google.co.nz/search?q=your+keyword+terms&gl=us&near=seattle+wa
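If you want to script this, here's a minimal sketch (Python, standard library only, and purely illustrative) of how the parameters described above can be assembled; pws=0, which comes up later in this thread, is included to explicitly switch off personalized results:
```python
from urllib.parse import urlencode

def depersonalized_search_url(query, country="us", near=None,
                              base="https://google.co.nz/search"):
    """Build a search URL following the approach above: query via google.co.nz,
    re-geo-locate with &gl=<country>, optionally simulate a local search with
    &near=<city>, and disable personalization with &pws=0."""
    params = {"q": query, "gl": country, "pws": 0}
    if near:
        params["near"] = near  # e.g. "seattle wa"
    return f"{base}?{urlencode(params)}"  # urlencode turns spaces into '+'

print(depersonalized_search_url("your keyword terms"))
# https://google.co.nz/search?q=your+keyword+terms&gl=us&pws=0
print(depersonalized_search_url("your keyword terms", near="seattle wa"))
# https://google.co.nz/search?q=your+keyword+terms&gl=us&pws=0&near=seattle+wa
```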
-
Yes, this is one of many factors in de-personalization, and there may well be more hidden factors we have yet to discover.
I have done a lot of research on this. I use a dedicated PC with a VPN just for checking keyword SERP rankings for my clients; since they come from many different countries and target different audiences, we try to replicate the results for each scenario.
I hope this helps.
-
So am I correct that logging out and adding &pws=0 is not enough?
-
Hi There,
In addition to the methods already suggested for de-personalizing results, there are a few more things to consider. You may also like to read a blog post I wrote on my website, Impact of Personalized Search Results.
An incognito window doesn't mean your previous browsing history is gone; you still need to clear your browsing history and cookies.
Use a VPN or proxy to get results from different locations and countries. This gives you the best idea of your SERP status in different countries.
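As a rough illustration of the proxy approach (a sketch only; the proxy address below is a placeholder, and you would substitute an endpoint located in the country you want to check from):
```python
import requests

# Placeholder credentials and host; replace with a real proxy/VPN exit in the target country.
PROXIES = {
    "http": "http://user:password@proxy.example.com:8080",
    "https": "http://user:password@proxy.example.com:8080",
}

def fetch_serp(query, country_code):
    """Fetch a Google results page as seen from the proxy's location."""
    response = requests.get(
        "https://www.google.com/search",
        params={"q": query, "gl": country_code, "pws": 0},
        proxies=PROXIES,
        headers={"User-Agent": "Mozilla/5.0"},  # bare scripted requests are often blocked
        timeout=10,
    )
    response.raise_for_status()
    return response.text

# Compare what two locations see for the same keyword.
html_us = fetch_serp("your keyword terms", "us")
html_de = fetch_serp("your keyword terms", "de")
```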
I hope this helps, please feel free to ask more questions by responding.
Regards,
Vijay
Related Questions
-
How long does Google take to crawl a single site?
Lately I have been wondering: when the crawler revisits an already indexed site, how long does its scan take?
Algorithm Updates | Sam09schulz
-
Best way to handle outdated, years-old blog posts?
Hi all, We have almost 1,000 blog pages/posts indexed in Google. A few of them are years old but still have relevant, credible content that appears in search results. I'm more worried about the hundreds of other non-relevant posts that are years old: by hosting them, our website is carrying lots of useless indexed pages, which might be having a small negative impact simply because they never rank. What's the best way to handle them? Are these pages okay as they are, or should they be noindexed or deleted? Thanks
Algorithm Updates | vtmoz
-
Google Index
Hi all, I just submitted my URL and linked pages, along with an XML sitemap, for indexing. How long does it take Google to index my new pages?
Algorithm Updates | businessowner
-
Is it possible that Google may have erroneous indexing dates?
I am consulting for someone with a problem related to copied content. Both sites in question are self-hosted WordPress sites. The "good" site publishes a post; the "bad" site copies it (without even removing the internal links to the "good" site) a few days later. The publishing dates are visible on both websites, and it is clear that the "bad" site publishes the posts days later; the content thief doesn't even bother to fake the publishing date. The owner of the "good" site wants all the proof he needs before acting against the content thief, so I suggested he also check in Google the dates the various pages were indexed, using Search Tools -> Custom Range, so that the indexing date is displayed next to each search result. For all of the copied pages the indexing dates also prove the "bad" site published the content days after the "good" site, but there are two exceptions: the very first two posts that were copied.
First post:
On the "good" website it was published on 30 January 2013.
On the "bad" website it was published on 26 February 2013.
In Google search both show up as indexed on 30 January 2013!
Second post:
On the "good" website it was published on 20 March 2013.
On the "bad" website it was published on 10 May 2013.
In Google search both show up as indexed on 20 March 2013!
Is it possible that there is an error in the dates shown in Google search results? I also asked for help on the Google Webmaster forums, but there the discussion shifted to "who copied the content" and "file a DMCA complaint", so I want to be sure my question is better understood here. It is not about who published the content first or how to take down the copied content; I am just asking whether anybody else has noticed this strange thing with Google indexing dates. How is it possible for Google search results to display an indexing date earlier than the date the copied article was published, and exactly the same date the original article was published and indexed?
Algorithm Updates | SorinaDascalu
-
Does a KML file have to be indexed by Google?
I'm currently using the Yoast Local SEO plugin for WordPress to generate my KML file, which is linked to from the GeoSitemap; check it out: http://www.holycitycatering.com/sitemap_index.xml. A competitor of mine just told me that this isn't correct, and that the link should point to a downloadable KML file that is indexed in Google. This is the opposite of what Yoast is saying: "He's wrong. 🙂 And the KML isn't a file, it's being rendered. You wouldn't want it to be indexed anyway; you just want Google to find the information in there." What is the best way to create a KML? Should it be indexed?
Algorithm Updates | projectassistant
-
Proper Way To Submit A Reconsideration Request To Google
Hello, In previous posts I talked about how we were penalized by Google for unnatural links. Basically 50,000 out of our 58,000 links were coming from 4-5 sites with the same exact anchor text and img alt tags, which was obviously causing our issues. Needless to say, I went through the complete link profile to determine that all of the remaining links were of natural origin. My question is: what is the accepted protocol for submitting a reconsideration (reinclusion) request? For example, how long should it be? Should I disclose that I was in fact using paid links, and that I have now removed (or at least nofollowed) them? I want to make sure the request is as good as it should be so I can get our rankings back up in a timely manner. Also, how long until the request is typically acknowledged? Thanks
Algorithm Updates | BestOdds
-
Stop Google indexing CDN pages
Just when I thought I'd seen it all, Google hits me with another nasty surprise! I have a CDN to deliver images, JS and CSS to visitors around the world. I have no links to static HTML pages on the site, as far as I can tell, but someone else may have; perhaps a scraper site? Google has decided the static pages it was able to access through the CDN have more value than my real pages, and it seems to be slowly replacing my pages in the index with the static pages. Anyone got an idea on how to stop that? Obviously, I have no access to the static area, because it is in the CDN, so there is no way I know of to place a robots file there. It could be that I have to trash the CDN and change it to only allow the image directory, and maybe set up a separate CDN subdomain for the JS and CSS? Have you seen this problem and beaten it? (Of course the next thing is Roger might look at Google results and start crawling them too, LOL) P.S. The reason I am not asking this question in the Google forums is that others have asked it many times and nobody at Google has bothered to answer over the past 5 months, and nobody who did try gave an answer that was remotely useful. So I'm not really hopeful of anyone here having a solution either, but I expect this is my best bet because you guys are always willing to try.
Algorithm Updates | loopyal
-
Home page replaced by subpage in Google SERPs (good or bad)
Since Panda, we have seen our home page drop from #2 in the google.ie SERPs to page 3, but it has been replaced in the same #2 position by our relevant subpage for the keyword we ranked #2 for. Is this a good or bad thing from an SEO point of view, and is it better to have deep pages show in the SERPs rather than the homepage? What is the best line of action from here: should we work on the subpage or the home page for that keyword, and should link building for that phrase be directed towards the subpage or the homepage, given that the subpage is obviously more relevant in Google's eyes for that search term? It is clear that all areas of the site should be looked at in relation to link building and deep links, but now that Google is obviously looking very closely at relevancy, should all campaigns be sectioned into relevant, content-managed sections, and the site likewise, and treated on an individual basis? Any help you may have would be very welcome. Paul
Algorithm Updates | mcintyr