Is it possible to spoof Analytics to give false unique visitor data for Site A to Site B?
-
Hi,
We are working as a middle man between our client (website A) and another website (website B), where website B is going to host a section around website A's products, etc.
The deal is that Website A (our client) will pay Website B based on the number of unique visitors they send them.
As the middle man, we are in charge of monitoring the number of unique visitors sent through, and we are going to do this by monitoring website A's analytics account and checking the number of unique visitors sent.
The deal is worth quite a lot of money, and as the middle man we are responsible for making sure that no funny business goes on (i.e. false visitors, etc.). So, to make sure we have things covered, what I would like to know is:
1. Is it actually possible to fool analytics into reporting falsely high unique visitors from website B to website A (and if so, how could they do it)?
2. What could we do to spot any potential abuse (i.e. is there an easy way to tell that these are spoofed visitors)?
Many thanks in advance
-
You might be better off with a server-side tracker like http://awstats.sourceforge.net/
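To illustrate the server-side idea: if you have access to the raw access logs, the counting itself is simple. Here's a minimal sketch in Python (not AWStats itself, purely for illustration; the domain and log path are placeholders) that counts unique referred IPs per day from a standard combined-format log:

```python
import re
from collections import defaultdict

# Apache/Nginx "combined" log format: IP, ident, user, [time], "request",
# status, bytes, "referrer", "user agent".
LINE_RE = re.compile(
    r'^(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] "(?P<request>[^"]*)" '
    r'(?P<status>\d{3}) \S+ "(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

def unique_referred_ips(log_path, referrer_domain="websiteb.example"):
    """Return {day: set of IPs} for hits referred by the given domain."""
    daily_ips = defaultdict(set)
    with open(log_path) as log:
        for line in log:
            m = LINE_RE.match(line)
            if not m:
                continue  # skip malformed lines
            if referrer_domain in m.group("referrer"):
                day = m.group("time").split(":", 1)[0]  # e.g. "10/Oct/2012"
                daily_ips[day].add(m.group("ip"))
    return daily_ips

for day, ips in sorted(unique_referred_ips("/var/log/apache2/access.log").items()):
    print(day, len(ips), "unique IPs")
```

Unique IPs aren't quite the same thing as unique visitors, but the point is that you own the raw data and can apply whatever counting rules you like.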
The answer from Mat probably has the best logic, but one problem remains: are you legally responsible for mitigating the possibility of fraud?
I would make sure you cover this in the contract, as I am not sure you are going to be able to defeat a proxy or spoofer if the referrer gets smart and decides to work the system.
An anti-fraud system can be put into place, but I am not sure you will have access to the multi-million-dollar fraud-monitoring tools that Google does, which are constantly updated, monitor algorithmically and systematically, and are backed by auditors who manually perform random checks.
-
Hi - well, we are really just acting on behalf of the client; that's what they want.
Also, it's only visitors from that specific website (a very close niche), not just any site.
-
Google Analytics doesn't report IP addresses though, which is another reason to take a different route. Not knocking GA, I love it. However, it isn't the right tool for this.
I suspect that the Fiverr gigs use ping or something similar to create the mass of "unique visits". Very easy to spot. Even without fairly sophisticated tools to hand, I'd imagine that any method that can deliver 5,000 visits for $5 is going to be pretty easy to spot.
I might try it now, though. I love Fiverr for testing stuff.
-
If you must use Analytics, I would drill down to the referral source within Analytics. This will give you the URL, the page, and so on. I think you can also drill down to the referring IP, etc.
You need to log where these visitors come from. Export your results every month and look for a pattern.
If you get 500 referrals from website B's IP or URL, that's a sure sign they are throwing people at you.
But Mat's answer is best; it will give you times, not just dates, and will also give you more detailed info.
-
My question is: are unique visitors the right metric to be measuring? On Fiverr.com I can get 2,000 to 10,000 unique visitors for $5. http://fiverr.com/gigs/search?query=unique+visitors&x=0&y=0
Can you tie your metrics to something else that might have more value for you, such as purchases, newsletter signups (still easy to fake, but at least takes a little more time), etc?
-
Google Analytics isn't designed to pull the data in the way you really need for something like this. It can be done, I suppose, but it'd be hard work.
There are only so many metrics you can measure, and all are pretty easy to fake. However, having the data in an easy-to-access form means that you can spot patterns and behaviour, which are much harder to fake.
A starting point would probably be to measure the distribution of the various metrics in the referred traffic vs. the general trend. If one particular C-class block (or user agent, or resolution, or operating system, or whatever) appeared at a different frequency in the paid traffic, that would be a good place to look deeper.
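To make that concrete, here's a rough sketch of such a distribution check, assuming you've already exported two lists of values (user agents in this example, but C-class blocks, resolutions, etc. work the same way): one for the referred traffic and one for your site-wide baseline. The 5x ratio and 20-hit floor are arbitrary illustration values, not recommended thresholds:

```python
from collections import Counter

def over_represented(referred, baseline, ratio=5.0, min_hits=20):
    """Flag values appearing far more often in referred traffic than baseline."""
    ref_counts, base_counts = Counter(referred), Counter(baseline)
    flags = []
    for value, hits in ref_counts.items():
        if hits < min_hits:
            continue  # too few hits to be meaningful
        ref_share = hits / len(referred)
        # Floor the baseline count at 1 so a value absent from the baseline
        # doesn't cause a division by zero.
        base_share = max(base_counts[value], 1) / len(baseline)
        if ref_share / base_share >= ratio:
            flags.append((value, hits, round(ref_share / base_share, 1)))
    return sorted(flags, key=lambda f: -f[2])

# e.g. over_represented(referred_agents, all_agents): a single user agent
# making up half the paid traffic would surface at the top of the list.
```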
Thinking less technically for a moment, though, I bet you could just implement one of the many anti-click-fraud systems to do most of this for you. Same idea, but someone else has already done the coding. Googling for "click fraud" brings up a stack of ads (tempting to click them loads and set off their alarms!).
-
Hi Mat,
A very informative answer.
If someone is going to try to spoof analytics, would they not equally be able to fool the script?
If someone were to try this, do you know how they would likely go about it? Essentially, if I know what is likely to be tried, then I can work out something to counteract it. Are there certain things that can't be fooled, or are very difficult to fool - e.g. browser resolution, location, etc. - or are these just as easy to spoof as anything else?
Many thanks
-
It isn't hard to fake this at all, I'm afraid. Spotting it will depend on how sophisticated the person doing it is.
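For a sense of how little effort faking takes, here's a minimal sketch: any script can send a request with a forged Referer and User-Agent, which is enough to fool server-log-based counting (Google Analytics needs its JavaScript to run, but a headless browser gets around that almost as easily). The URLs are placeholders:

```python
import urllib.request

req = urllib.request.Request(
    "https://websitea.example/landing-page",
    headers={
        # Both headers are chosen freely by the client, so neither proves
        # where the visitor actually came from.
        "Referer": "https://websiteb.example/products-section",
        "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    },
)
urllib.request.urlopen(req)  # shows up in the logs as a "referred visitor"
```

Rotating the source IP through proxies defeats naive per-IP rules too, which is why pattern-level checks matter more than any single field.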
My personal preference would be not to use Analytics as the means of counting. Doing that, you are going to be slightly limited in the metrics available and will always be "correcting" data and looking for problems, rather than measuring correctly in the first place and having problems surface on their own.
I'd have a script on the page that checks for a referrer and, if it matches the pattern for website B, creates a log record instead.
You then have the ability to set your own rules. For instance, if you get 2 referrals from the same IP a second apart, would you count them? What about 10 per hour, 24 hours a day? You can also log the exact timestamp with whatever variables you want to collect (there's a rough sketch of such a script after this list), so each click from the referring site might be recorded as:
- Time stamp
- Exact referring URL
- User agent
- IP
- Last visit (based on cookie)
- Total visits (based on cookie)
- #pages viewed (updating the cookie on subsequent page views)
- and so on
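Here's that rough sketch (framework-agnostic; the field names, the placeholder domain, and the one-second same-IP rule are all illustrative choices rather than requirements):

```python
import csv
import time

REFERRER_PATTERN = "websiteb.example"  # placeholder for website B
_last_seen = {}  # ip -> timestamp of the last hit we counted

def log_referral(ip, referrer, user_agent, cookie, log_file="referrals.csv"):
    """Record one referred click, applying whatever counting rules you set."""
    if REFERRER_PATTERN not in (referrer or ""):
        return False  # not from website B: ignore
    now = time.time()
    if now - _last_seen.get(ip, 0) < 1:
        return False  # second hit from the same IP within a second: skip it
    _last_seen[ip] = now
    with open(log_file, "a", newline="") as f:
        csv.writer(f).writerow([
            now,                         # exact timestamp
            referrer,                    # exact referring URL
            user_agent,                  # user agent
            ip,                          # IP
            cookie.get("last_visit"),    # last visit (from your own cookie)
            cookie.get("total_visits"),  # total visits (from your own cookie)
        ])
    return True
```

In a real deployment this would hang off whatever handles the landing-page request, and the pages-viewed figure would come from updating the same cookie on subsequent views.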
Analytics doesn't give you access to the data in quite the same way. I'd definitely want to be logging it myself if the money involved is reasonable.