Guides to determine if a client's website has been penalized?
-
Has anyone come across any great guides to pair with client data to help you determine if their website has been penalized?
I'm also not talking about an obvious drop in traffic/rankings, but I want to know if there's a guide out there for detecting the subtleties that may be found in a client's website data. One that also helps you take into account all the different variables that may not be related to the engines.
Thanks!
-
Good point about the Change History - at least that will catch things like new filters.
Understood about the external doc being easier for some clients not used to working in Analytics. When going that route, I at least try to get them working in a shared document - either in Google Docs or at least a shared doc in Dropbox or something.
That way there are fewer issues with trying to figure out who has the most current version, and it's available to you when you need it - like when you are doing a normal monthly review but want to quickly check what you think might be an anomaly - without having to make a request for the doc.
P.
-
Thanks Paul. Maybe now that Analytics will be showing change history, that will save time too.
I like your idea about an external document. That seems more approachable than asking clients who aren't so comfortable with Google Analytics to make their annotations directly in it.
-
Glad it was helpful, Ellen, and thanks for the kind words.
And you've hit exactly the challenge so many of us encounter with clients - we're just not kept in the loop on all the things they're doing. (And this critically includes changes to site code and analytics, as I mentioned, not just marketing efforts.) The only effective way I've found is to make the client responsible for enforcing that their devs and marketers keep the annotations (or if necessary some external file, blech) up to date.
Otherwise you can easily waste hours chasing an issue, only to later find out someone goofed up changing a filter in analytics, for example. Not that that has ever happened to me...
That's how I pitch this to clients - either work out a way to make certain it's kept up to date on an ongoing basis, or risk having to pay me for many hours of extra work every time I'm asked to try to track down an issue or to assess overall traffic health. Or maybe even waste big money with a wrong diagnosis because not enough info was provided.
More and more, effective analysis needs to take into account many more cross-channel aspects. The only way to do this effectively is to have those cross-channel efforts recorded in some way right in the Analytics.
Good luck!
Paul
-
It's definitely not a helluva lot to throw at me, but instead is exactly the sort of thing I was looking for! Thank you so much Paul for such a thorough answer. This is definitely the direction I've been moving toward.
The most difficult thing with my clients is there are so many hands in the pot, and they often don't let us know what's going on with their external marketing efforts. Additionally, some of our clients have multiple marketing agencies making changes to their websites, and there are multiple admins on the Analytics.
Holding the client responsible for the updates is a great idea, and it would just take a lot more prodding on our end.
This discussion was spurred by a client that saw a major decline in non-paid search traffic (Google only) over the last quarter of 2012. There were no penalties in GWT, so this leads me to believe as you said it was an algo update. I'm going to use your tips to try and further isolate the affected areas.
I really appreciate the time you took to answer my question. Thanks again.
-
Ellen, there are just far too many reasons why a site's traffic might fluctuate for any guide to ever account for them all and then detect the ones that are "harmful".
This is why SEO is referred to as both art and science, unfortunately.
First - to be clear... If a site has actually been penalized by the search engines, they will send you notification through Bing and Google Webmaster Tools. So you must be certain that both those tools have up-to-date, monitored email addresses in their notification settings.
Once you've discounted actual penalties, you're left with fluctuations due to changes in the algorithm. These are not "penalties" in the search engines' eyes, just corrections to the ranking algorithm that happen to affect you.
The best method I've found for spotting these is a combination of segmenting data, keeping accurate records about your own site marketing activities, and monitoring the dates of announced algorithm changes.
The idea here is to try to eliminate as many variables as possible for why traffic might have changed, making it easier to spot changes attributable specifically to algorithm changes. That would then point you to tactics you might need to use to mitigate the effect of the algo change.
In order to do this, you'll need to do the following:
Track Marketing Efforts & Site Changes
Keep records about your site marketing and structural/coding changes. Anytime you do marketing that could affect site traffic, enter the date and info as an Annotation in your Google Analytics. This includes on- and offline things like launching magazine or radio advertising, adding new banner or PPC ads, getting coverage in the media, etc. Anything that could conceivably be causing more people to become aware of you and search for your site. (Remember, just because your ad gives your website address doesn't mean people will remember it. Many will remember your company name or service and will Google it later.)
Also keep track of ANY changes made to your website structure - changes in code, robots.txt, .htaccess, canonicals, Analytics configuration, etc.
Track Announced Algorithm Changes
Use this page http://www.seomoz.org/google-algorithm-change to continually add dates and info about algo updates into your site's Annotations.
Segment Data
Ensure you're only looking at organic search data.
This may seem obvious, but a lot of people miss it. Algorithmic changes are only going to affect your organic search data. So you must ensure you are only looking at non-paid search traffic in your analysis. Fortunately, there's a pre-built Advanced segment for that in Google Analytics. If you're trying to track Google changes specifically, you can further segment to show only Google traffic (i.e. exclude Bing and Yahoo.)
Bonus tip - non-sampled data only
Make sure you've asked Analytics to show your reports using as little data-sampling as possible. This will make the data vastly more accurate, although the reports will be a little slower. Definitely a worthwhile tradeoff.
Here's my method:
- pick a date range you're concerned about - the shorter the range, the easier it is to spot anomalies in the graph. Six to eight weeks at a time may be best. Or pick a short range from before and after a date you think you encountered a problem
- segment your data specifically to Google unpaid search, and select as little data-sampling as possible
- look at the traffic line in your graph. Anywhere you see an unexpected drop in traffic (allowing for weekly fluctuations) look for an annotation below that date that might explain it.
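If you're comfortable exporting your data, the eyeball step above can even be roughed out in a few lines of code. This is just a sketch of the idea, not anything built into Analytics - the data, function name, and thresholds here are all hypothetical, and real traffic needs human judgment on top:

```python
from datetime import date, timedelta

def flag_drops(daily_visits, annotations, drop_pct=0.30, window_days=2):
    """Flag days whose visits fall more than drop_pct below the trailing
    7-day average (allowing for weekly ebb and flow), and pair each
    flagged day with any annotation recorded within window_days of it."""
    days = sorted(daily_visits)
    flagged = []
    for i, d in enumerate(days):
        if i < 7:
            continue  # need a full trailing week for a baseline
        baseline = sum(daily_visits[days[j]] for j in range(i - 7, i)) / 7
        if daily_visits[d] < baseline * (1 - drop_pct):
            nearby = [note for a, note in annotations.items()
                      if abs((a - d).days) <= window_days]
            flagged.append((d, nearby))
    return flagged

# Hypothetical export: steady ~1000 visits/day, then a sharp drop on Jan 23
visits = {date(2013, 1, 1) + timedelta(n): 1000 for n in range(30)}
for n in range(22, 30):  # traffic falls after the suspected update
    visits[date(2013, 1, 1) + timedelta(n)] = 600
notes = {date(2013, 1, 22): "Panda update announced"}

for day, reasons in flag_drops(visits, notes):
    print(day, reasons)
```

Anywhere a flagged day comes back with an annotation next to it, you've got your starting hypothesis; a flagged day with no annotation nearby is exactly the kind of gap that means someone forgot to record a marketing or site change.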
So in practice the process might look like this. I'm worried about a traffic drop toward the end of January. I select a date range in Analytics of Jan 1 to Feb 15, and I segment my data to show just Google non-paid traffic for that range. I hit the little checkerboard icon in the top right under the date range and move the slider for Highest Precision.
After doing this, I look at the general pattern of organic traffic to the site. There seems to be the usual ebb and flow of lower weekend traffic, with a small spike of traffic on Jan 15th. When I look just below that date, I notice that I've entered an Annotation. (It'll show up as a tiny clickable comment bubble.) When I read the Annotation I created, it tells me we got a mention in the newspaper that day. So I now remember where that spike came from. The traffic then pretty much settles back to normal a day or two after, as expected.
Then I notice an unusual drop in traffic around January 23. When I again check for Annotations I've created, I realize there was a Panda update on Jan 22. Since there was no other marketing activity mentioned around that date (like a radio ad ending, for example) I can be pretty sure the sole cause of that drop was the Panda change. And since I know Panda is mostly about devaluing thin or low-value content, I now have somewhere to start looking.
I would then start looking at the non-paid search traffic from specific keywords to see if any group of keywords suffered most heavily. If I can find a pattern to the search terms that dropped, I know I've found the topic area of my site where I need to build some better content to help recover the traffic.
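That keyword-level comparison is also easy to rough out once you've exported per-keyword visit counts for the periods before and after the suspected update. A minimal sketch - the keyword data here is made up purely for illustration:

```python
def keyword_changes(before, after):
    """Compare per-keyword organic visits before vs. after a suspected
    algorithm update, returning (keyword, before, after, pct_change)
    rows sorted worst-hit first."""
    changes = []
    for kw in before:
        b = before[kw]
        a = after.get(kw, 0)  # keyword may have vanished entirely
        pct = (a - b) / b * 100 if b else 0.0
        changes.append((kw, b, a, round(pct, 1)))
    return sorted(changes, key=lambda row: row[3])

# Hypothetical export: the thin "cheap widgets" pages were hit hardest
before = {"cheap widgets": 400, "widget reviews": 350, "buy acme widget": 300}
after = {"cheap widgets": 90, "widget reviews": 120, "buy acme widget": 290}

for kw, b, a, pct in keyword_changes(before, after):
    print(f"{kw}: {b} -> {a} ({pct}%)")
```

If the worst-hit terms cluster around one topic area, that's the section of the site to start rebuilding content for.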
The reason I need to be tracking the marketing efforts as well as the algo updates is that I don't want to misattribute the problem (correlation instead of causality). For example, if I had a major marketing campaign wrap up on the 20th or 21st of January, that could very well have accounted for the traffic drop, and the Panda update was merely a coincidence.
I wouldn't want to have wasted a whole lot of time chasing the Panda problem when in fact it was a normal drop due to the end of the marketing campaign. But I would have missed that if I hadn't been tracking the marketing. (For clients' sites, you'll need to make the client responsible for keeping the marketing Annotations up to date.)
As you can see, this isn't easy, and it takes laying some groundwork, but it goes a long way to helping you figure out where to focus when you start trying to figure out whether you've been affected by an algo update, and therefore where to spend your energy on fixes.
I know this is a helluva lot to throw at you, but the question you asked doesn't have an easy answer and I didn't want to shortchange you with an overly simplistic response. Be sure to ask followup questions about anything I haven't explained clearly enough.
Hope that helps.
Paul
-
Hi,
I am not sure there is a guide that can tell you exactly why your ranking went down. There are many factors, as you mentioned, that can cause this, such as a competitor doing better SEO, getting more quality links, etc. It would be nice to have a guide to learn about what is happening. I may be wrong about this. Hope someone can shed light on this.
-
Thanks TommyTan.
I am definitely referring to the search engines.
I do use Google Webmaster Tools, and haven't seen any email notifications from GWT regarding any spammy link profiles, etc.
I'm more concerned with finding out if it's just normal fluctuation in keyword ranking and traffic, or if there is something else going on.
Sometimes there are so many factors that could all be playing a role outside of any penalties that it would be great to find a guide to help you diagnose whether it's just seasonal traffic, keyword rank fluctuation, or something more serious.
-
Hi,
I may be way off the chart here, but when we talk about websites being penalized, I believe it is mostly related to the search engines. I am not sure what would be penalized that may not be related to the engines. The best tool that connects to a user's website is Webmaster Tools. If anything is detected, such as unnatural linking or duplicate HTML tags, the webmaster will be notified and a notice will also be available in Google Webmaster Tools.
Hope this helps =]