What IP Address does Googlebot use to read your site when coming from an external backlink?
-
Hi All,
I'm trying to find more information on what IP address Googlebot would use when arriving to crawl your site from an external backlink.
I'm under the impression that Googlebot uses international signals to determine the best IP address to crawl from (US / non-US), and then carries on with that IP when it arrives at your website?
E.g. - Googlebot finds www.example.co.uk. Due to the ccTLD, it decides to crawl the site with a UK IP address rather than a US one. As it crawls this UK site, it finds a subdirectory backlink to your website and continues to crawl your website with the aforementioned UK IP address.
Is this a correct assumption, or does Googlebot look at altering the IP address as it enters a backlink / new domain?
Also, are ccTLDs the main signal that might prompt Google to crawl with a non-US IP address rather than the standard US one? Am I right in saying that hreflang tags don't apply here at all, since their purpose is to help Google decide which page version to serve to users in the SERPs based on their location and language?
If anyone has any insight this would be great.
-
There are a few things you need to marry up if you want to do this. You need the referring page or domain/hostname (to validate that the session came from a backlink you know about). Once you've filtered the data down like that, filter again by user-agent ("googlebot", or any user-agent string containing "googlebot"). Then look at the IP address field in the tabular data and you have your answer!
Here's the problem: most IP-level data lives in basic server-side analysis packages (like AWStats, which is installed on many sites via cPanel), or alternatively in the raw log files, which contain much of the same data. Most referrer-level data (the stuff that deals with attribution) lives in analytics suites like Adobe Analytics (Omniture) or Google Analytics.
In GA, you can't usually get at individual IP-level data. There used to be a URL hack to force it to render, but it was killed off (and many people who used it were banned by Google). The reason is that Google doesn't want too much PII (personally identifiable information) harvested through their tool. It creates too many legal issues for Google (and also for whoever is leveraging that data for potentially nefarious marketing purposes).
Since you won't get enough IP-level data from GA, you're going to have to go to log files and log analysis tools instead. Hopefully they will contain at least some referrer-level data. The issue is getting all the pieces you want to align in a legally compliant way.
Obviously you have your reasons for looking. I'd check whether you can find anything in AWStats on your cPanel (if it's installed), or get the log files and analyse them with something like the Screaming Frog Log File Analyser.
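If you'd rather script the same filtering yourself, here's a rough Python sketch of the approach described above, run against a combined-format Apache/Nginx access log. The file name and referring domain are placeholders, and the log-line regex assumes the standard combined format, so adjust both to match your own setup.

```python
import re
from collections import Counter

# Rough pattern for a combined-format access log line:
# ip - - [timestamp] "request" status bytes "referrer" "user-agent"
LINE_RE = re.compile(
    r'^(?P<ip>\S+) \S+ \S+ \[[^\]]+\] "[^"]*" \d+ \S+ '
    r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

REFERRING_DOMAIN = "www.example.co.uk"  # the backlinking site you care about (placeholder)

googlebot_ips = Counter()

with open("access.log", encoding="utf-8", errors="replace") as log:
    for line in log:
        match = LINE_RE.match(line)
        if not match:
            continue
        # Keep only hits that identify as Googlebot...
        if "googlebot" not in match["agent"].lower():
            continue
        # ...and that arrived via the backlink you know about
        if REFERRING_DOMAIN not in match["referrer"]:
            continue
        googlebot_ips[match["ip"]] += 1

# The IPs used for those requests, most frequent first
for ip, hits in googlebot_ips.most_common():
    print(f"{ip}\t{hits} hits")
```

One caveat: crawler requests often arrive with no Referer header at all, so if this returns nothing, try dropping the referrer filter and working from the user-agent and requested URL alone.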
I can't promise this will return the data you want, but it's probably your only hope.
-
Hi,
First of all "Google crawls from many IPs and they have confirmed that they do periodically add new ones. And there are also various Googlebot useragents, not just the regular one. This is why Google doesn't publish a list of all the IPs, because there are so many of them and they can change" .
You can see the full conversation here: https://productforums.google.com/forum/#!msg/webmasters/4fKthSy7oFQ/GgslLXJnDQAJ
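Because the list changes and isn't published in full, the usual way to confirm that a given IP from your logs really is Googlebot is a reverse DNS lookup followed by a forward lookup: the hostname should end in googlebot.com or google.com and resolve back to the same IP. A minimal sketch using only the Python standard library; the sample address at the end is just an illustration, substitute an IP from your own logs:

```python
import socket

def is_verified_googlebot(ip: str) -> bool:
    """Reverse-then-forward DNS check for a crawler IP."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)  # reverse lookup
    except OSError:
        return False
    # Genuine Googlebot hostnames end in googlebot.com or google.com
    if not hostname.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        forward_ips = {info[4][0] for info in socket.getaddrinfo(hostname, None)}
    except OSError:
        return False
    return ip in forward_ips  # the hostname must resolve back to the same IP

# Example: check an address pulled from your access log (placeholder IP)
print(is_verified_googlebot("66.249.66.1"))
```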
Second, Google recently said that "IP Addresses Don't Matter For Backlinks & Search Rankings":
https://www.seroundtable.com/google-ip-addresses-backlinks-rankings-26561.html
Hope this helps
Thanks