Is it possible to Spoof Analytics to give false Unique Visitor Data for Site A to Site B
-
Hi,
We are working as a middle man between our client (website A) and another website (website B), where website B is going to host a section around website A's products etc.
The deal is that Website A (our client) will pay Website B based on the number of unique visitors they send them.
As the middle man, we are in charge of monitoring the number of unique visitors sent through, and we are going to do this by checking the unique visitor numbers in Website A's analytics account.
The deal is worth quite a lot of money, and as the middle man we are responsible for making sure that no funny business goes on (i.e. false visitors, etc.). So to make sure we have things covered, what I would like to know is:
1. Is it actually possible to fool analytics into reporting falsely high unique visitors from Website B to Website A (and if so, how could they do it)?
2. What could we do to spot any potential abuse (i.e. is there an easy way to tell that visitors are spoofed)?
Many thanks in advance
-
You might be better off with a server-side tracker like AWStats: http://awstats.sourceforge.net/
The answer from Mat probably has the best logic, but the remaining question is whether you are legally responsible for mitigating the possibility of fraud.
I would make sure you add this to the contract, as I am not sure you are going to be able to defeat a proxy or spoofer if the referrer gets smart and decides to work the system.
An anti-fraud system can be put into place, but (LOL) I am not sure you will have access to the multi-million-dollar fraud monitoring tools that Google does - tools that are constantly updated, monitor algorithmically and systematically, and are backed by auditors who do random manual checks...
-
Hi - well, we are really just acting on behalf of the client - that's what they want.
Also, it's only visitors from that specific website (a very close niche) - not just any site.
-
Google Analytics doesn't report IP addresses though - which is another reason to take a different route. Not knocking GA, I love it. However, it isn't the right tool for this.
I suspect that the fiverr gigs use ping or something similar to create the mass of "unique visits". Very easy to spot. Unless they have some fairly sophisticated tools to hand, I'd imagine that any method that can deliver 5,000 visits for $5 is going to be pretty easy to spot.
Might try it now though. I love fiverr for testing stuff.
-
If you must use Analytics, I would drill down to the source of referral within Analytics. This will give you the URL, page, or whatever. I think you can also drill down to the referring IP, etc.
You need to log where they come from. Export your results every month and look for a pattern.
If you get 500 referrals from website B's IP or URL, then that's a sure way of knowing they are throwing people at you.
But Mat's answer is best: it will give you times, not just dates, and will also give you more detailed info.
-
My question is: are unique visitors even the right metric to be measuring? On Fiverr.com I can get 2,000 to 10,000 unique visitors for $5. http://fiverr.com/gigs/search?query=unique+visitors&x=0&y=0
Can you tie your metrics to something else that might have more value for you, such as purchases, newsletter signups (still easy to fake, but at least takes a little more time), etc?
-
Google Analytics isn't designed to pull out data in the way you really want for something like this. It can be done, I suppose, but it'd be hard work.
There are only so many metrics you can measure, and all are pretty easy to fake. However, having the data in an easy-to-access form means that you can spot patterns and behaviour, which are much harder to fake.
A good starting point would be to measure the distribution of the various metrics in the referred traffic vs. the general trend. If one particular class C block (or user agent, or resolution, or operating system, or whatever) appeared at a markedly different frequency in the paid traffic, that would be a good place to look deeper.
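For illustration, here is a rough sketch of that kind of distribution check in Python. Everything in it is an assumption made for the example's sake - the field names, the smoothing, and the 3x over-representation threshold are not from this thread:

```python
from collections import Counter

def flag_overrepresented(referred_hits, baseline_hits, key, ratio=3.0):
    """Return values of `key` (e.g. "user_agent") that appear far more
    often in the referred traffic than in the site-wide baseline."""
    ref_counts = Counter(hit[key] for hit in referred_hits)
    base_counts = Counter(hit[key] for hit in baseline_hits)
    ref_total, base_total = len(referred_hits), len(baseline_hits)

    flagged = []
    for value, count in ref_counts.items():
        ref_share = count / ref_total
        # Smooth values unseen in the baseline so we never divide by zero.
        base_share = (base_counts.get(value, 0) + 0.5) / (base_total + 0.5)
        if ref_share / base_share >= ratio:
            flagged.append((value, ref_share, base_share))
    # Most over-represented values first.
    return sorted(flagged, key=lambda item: -item[1])

# e.g. flag_overrepresented(referred, everyone, key="user_agent"),
# where each hit is a dict like {"user_agent": ..., "ip_block": ...}
```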
Thinking less technically for a moment though, I bet you could just implement one of the many anti-click-fraud systems to do most of this for you. Same idea, but someone else has already done the coding. Googling for click fraud brings up a stack of ads (tempting to click them loads and set off their alarms!).
-
Hi Mat,
A very informative answer.
If someone is going to try and spoof analytics, wouldn't they equally be able to try and fool the script?
If someone was to try this, do you know how they would likely do it? Essentially, if I know what is likely to be tried, I can work out something to counteract it. Are there certain things that can't be fooled, or are very difficult to fool - e.g. browser resolution, location, etc. - or are these just as easy to spoof as anything else?
Many thanks
-
It isn't hard to fake this at all, I'm afraid. Spotting it will depend on how sophisticated the person doing it is.
My personal preference would be not to use Analytics as the means of counting it. Doing that, you are going to be slightly limited in the metrics available, and you will always be "correcting" data and hunting for problems rather than measuring more directly and having problems surface on their own.
I'd have a script on the page that checks for a referrer and, if it matches the pattern for website B, creates a log record (a rough sketch follows the list below).
You then have the ability to set your own rules. For instance, if you get 2 referrals from the same IP a second apart, would you count them? What about 10 per hour, 24 hours a day? You can also log the exact timestamp with whatever variables you want to collect, so each click from the referring site might be recorded as:
- Time stamp
- Exact referring URL
- User agent
- IP
- Last visit (based on cookie)
- Total visits (based on cookie)
- #pages viewed (updating the cookie on subsequent page views)
- and so on
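To make that concrete, here is a minimal sketch of such a logging script as a Python WSGI helper. The referrer pattern (website-b.example), the CSV log path, and the cookie names are all placeholder assumptions, not anything specified in this thread:

```python
import csv
import re
import time
from http.cookies import SimpleCookie

# Placeholder pattern for website B's domain - adjust to the real site.
REFERRER_PATTERN = re.compile(r"https?://(www\.)?website-b\.example/", re.IGNORECASE)
LOG_PATH = "referral_log.csv"

def log_referral(environ):
    """Append a log record if this request was referred by website B."""
    referrer = environ.get("HTTP_REFERER", "")
    if not REFERRER_PATTERN.match(referrer):
        return  # not traffic from website B - ignore

    # Repeat-visit data from cookies, assumed to be set elsewhere by the site.
    cookie = SimpleCookie(environ.get("HTTP_COOKIE", ""))
    last_visit = cookie["last_visit"].value if "last_visit" in cookie else ""
    total_visits = int(cookie["total_visits"].value) if "total_visits" in cookie else 0

    with open(LOG_PATH, "a", newline="") as f:
        csv.writer(f).writerow([
            time.strftime("%Y-%m-%d %H:%M:%S"),  # exact timestamp
            referrer,                            # exact referring URL
            environ.get("HTTP_USER_AGENT", ""),  # user agent
            environ.get("REMOTE_ADDR", ""),      # IP
            last_visit,                          # last visit (from cookie)
            total_visits + 1,                    # total visits (from cookie)
        ])
```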
Analytics doesn't give you access to the data in quite the same way. I'd definitely want to be logging it myself if the money involved is reasonable.
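Once the log exists, whichever counting rules you settle on can be applied after the fact. A small sketch, assuming the CSV format from the logging example above and an illustrative 30-minute dedupe window per IP:

```python
import csv
from datetime import datetime, timedelta

WINDOW = timedelta(minutes=30)  # illustrative: one countable hit per IP per 30 min

def count_billable_uniques(log_path="referral_log.csv"):
    """Count referrals, ignoring repeat hits from the same IP inside WINDOW."""
    last_counted = {}  # IP -> timestamp of the last hit we counted
    billable = 0
    with open(log_path, newline="") as f:
        for timestamp, url, user_agent, ip, last_visit, total in csv.reader(f):
            when = datetime.strptime(timestamp, "%Y-%m-%d %H:%M:%S")
            if ip not in last_counted or when - last_counted[ip] >= WINDOW:
                billable += 1
                last_counted[ip] = when
    return billable
```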