Is the link data from Open Site Explorer in real time or an average?
-
I just started using Open Site Explorer to track internal and external link data. Is this information given in real time or is it an average over a specified period of time?
-
Hi Laura,
Your answer can be found in the About section of Open Site Explorer:
"We update our Linkscape Index every 4 weeks. Crawling the entire Internet to look for links takes 2-3 weeks, but our crawlers are always in motion. When we need to start processing, we grab all the data they have collected and start processing, which can take up to 3 weeks to determine which of those links are the most important."
Explore more categories
-
Moz Tools
Chat with the community about the Moz tools.
-
SEO Tactics
Discuss the SEO process with fellow marketers.
-
Community
Discuss industry events, jobs, and news!
-
Digital Marketing
Chat about tactics outside of SEO.
-
Research & Trends
Dive into research and trends in the search industry.
-
Support
Connect on product support and feature requests.
Related Questions
-
How to protect the site from fake traffic
On Google Analytics there are no visits, but the Jetpack WordPress statistics on the front of the site show 6,000. https://arabtechnologie.com/
Reporting & Analytics | | BELGHOUL0 -
Huge Decline in Links
Good morning everyone, I'm looking for feedback as to why all of my backlink metrics (as well as rankings) might be way down. Please see the details below, showing metrics from Moz reports from August and from October. Does anyone know why these metrics have dropped so far? We have not done any link removal exercises or anything else that would cause this, so please let me know if you have any insight into the reason. Thanks
Linking C-Blocks: August 3: 124 / October 14: 23
External Followed Links: August 3: 4,486 / October 14: 1,558
Total External Links: August 3: 4,795 / October 14: 1,680
Total Links: August 3: 21,338 / October 14: 17,809
Followed Linking Root Domains: August 3: 323 / October 14: 116
Total Linking Root Domains: August 3: 442 / October 14: 143
Reporting & Analytics | | Prime850 -
Best to Leave Toxic Links or Remove/Disovow on Site with Low Number of Linking Domains
Our site has only 87 referring domains (with at least 7,100 incoming links). LinkDetox has identified 29% of our backlinks as toxic and 14% as questionable. Virtually all of these links come from spammy sites. We never received a manual penalty, but ever since the first Penguin update in 2012 our search volume and rankings have dropped, with some uneven recovery over the last 3 years. By removing/disavowing toxic links, are we risking that over-optimized link text will be removed and that rankings will suffer as a result? Are we potentially shooting ourselves in the foot? Would we be better off spending a few months building quality links from reputable domains before removing/disavowing the bad links? Or are the toxic links (as defined by LinkDetox) so bad that removing them should be a priority before taking any other step? Thanks, Alan
Reporting & Analytics | | Kingalan10 -
Difference between site: search and Total Indexed in Google Webmaster Tools.
This morning I did a search on Google for my site using the site: operator. I noticed that the number of results returned was significantly different from the "Total indexed" count in Google Webmaster Tools. What is the difference between the two, and is it normal for these numbers to diverge so much?
Reporting & Analytics | | Gordian0 -
Problem with Enhanced Link Attribution
I set up Enhanced Link Attribution yesterday on two of my websites, but it still doesn't work. When I look at the click numbers in In-Page Analytics, I see the same number of clicks on, for example, a heading, a "read more" button, and a blog post thumbnail, so it isn't working. My GA code:

<script type="text/javascript">
  var _gaq = _gaq || [];
  var pluginUrl = '//www.google-analytics.com/plugins/ga/inpage_linkid.js';
  _gaq.push(['_require', 'inpage_linkid', pluginUrl]);
  _gaq.push(['_setAccount', 'XXXX']);
  _gaq.push(['_trackPageview']);

  (function() {
    var ga = document.createElement('script');
    ga.type = 'text/javascript';
    ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(ga, s);
  })();
</script>

Could you help me? Thanks 🙂
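One thing worth checking: Enhanced Link Attribution distinguishes clicks by the id of the clicked link element, so links that share a destination URL and have no ids of their own (a heading link, a "read more" button, and a thumbnail all pointing at the same post, say) tend to be reported with the same click count. A minimal sketch of a workaround, assigning an id to every anchor that lacks one — the helper name and the `link-N` id scheme are my own, not part of GA:

```javascript
// Sketch: give every anchor without an id a generated one, so
// Enhanced Link Attribution can tell apart clicks on different
// links that point to the same URL. Works on any array-like list
// of elements (or element-like objects with an `id` property).
function assignLinkIds(anchors) {
  var assigned = [];
  for (var i = 0; i < anchors.length; i++) {
    if (!anchors[i].id) {
      anchors[i].id = 'link-' + i; // hypothetical id scheme
    }
    assigned.push(anchors[i].id);
  }
  return assigned;
}
```

On a live page you would run something like `assignLinkIds(document.getElementsByTagName('a'))` before users start clicking; hand-written, descriptive ids on the links themselves are of course preferable where practical.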
Reporting & Analytics | | mysho0 -
Confirmation page gets hit multiple times by some users. How I can I segment out unique visits?
Hi All, I'm the web marketing manager at http://www.evenues.com, which is like an Airbnb for meeting space. To count bookings for our meeting spaces, I've set up a goal in Analytics with the confirmation page as the goal URL. The problem is that some users appear to view the same confirmation page several times. We have a unique URL for each confirmation page, but some users visit these unique pages 2 to 5 times, which skews our numbers and makes segmenting visitors problematic. Is there anything we can do so that each unique URL is counted only once? Thanks, Kenji
Reporting & Analytics | | eVenuesSEO0 -
Subdomain and relative link paths cause crawl errors
I have a WordPress blog on our subdomain, and we use relative paths on our main domain. It appears that Googlebot is crawling from the subdomain's category pages back to the main domain's relative paths, which of course results in hundreds of 404 pages. Any suggestions on how to resolve this without changing the relative path structure of our domain? I can provide more information if need be. While I realize these errors are not that pressing, I'd obviously like to remove as many as possible. If anyone has encountered this problem, especially in WordPress, I'd really like to hear your solution, or lack thereof. Thank you in advance.
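The mechanics of the problem are easy to demonstrate: a root-relative href always resolves against whatever host the page was fetched from, so the same link markup that works on www points into the blog subdomain when Googlebot follows it from there. A small sketch of the resolution rule (the hostnames are hypothetical):

```javascript
// Sketch: root-relative links resolve against the host of the page
// they appear on, which is why identical markup 404s on a subdomain.
function resolveHref(href, pageUrl) {
  // WHATWG URL resolution, available as a global in modern Node/browsers
  return new URL(href, pageUrl).href;
}
```

So `/pricing/` crawled on `https://www.example.com/` resolves to the real page, but the same href crawled on `https://blog.example.com/category/news/` resolves to `https://blog.example.com/pricing/`, which doesn't exist. That's also why the usual fixes are either absolute URLs in shared templates or a redirect/rewrite on the subdomain for paths that only exist on the main domain.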
Reporting & Analytics | | BethA0 -
Spider 404 errors linked to purchased domain
Hi, my client purchased a domain from a seller who promised "lots of traffic". Subsequent investigation showed it was a scam: the seller had been creative in Photoshop with some GA reports. Nevertheless, my client had redirected the acquired domain to their primary domain (via the domain registrar). From the time the acquired domain was redirected until we removed the redirect, the web log files showed a high volume of spider/bot 404 errors relating to an online pharmacy (viagra, pills, etc.). The account does not seem to have been hacked: no additional files are present and the rest of the logs look normal. As soon as the redirect was removed, the spider 404 errors stopped. Aside from the advice about acquiring domains that "promise traffic", which I've already discussed with my client, does anybody have any ideas about how a redirect could cause these 404 errors? Thanks
Reporting & Analytics | | bjalc20110