Identifying why my site has a penalty
-
Hi,
My site has been hit with a Google penalty of some sort, but it doesn't coincide with a Penguin or Panda update. I have attached a graph of my visits that demonstrates this.
I have been working on my SEO since the latter part of last year and have been seeing good results, then all of a sudden my search referrals dropped by 70%.
Can anyone advise on what it could be?
Thanks!
Will
-
Great. Just audit it, fix problems, audit again, write more great content and give it time. Even if you fix the problem (assuming it was an on-site problem) it may take some time for Google to show the love again.
-
Oh, okay! That makes sense. I found a few issues with my PHP rules that automatically write links on a few of my content pages.
I've learned some valuable tips here; such a fantastic help. I'm going to get the new site up in a week or two and we'll see if things change.
I'll keep you updated!
-
Ok so. If a bit of content resides at /bikes/mountain-bikes/ and the menu link I use is /bikes/mountain-bikes/, I'll get a status code 200. There is no added delay and no PageRank lost; 200 == OK. The menu link points directly to the destination content.
Now let's say you've decided to change the location of that content to /bikes/mountain-bikes/index.html.
You set up a 301 redirect from the old URL to the new one, THEN you need to update your links to reflect the new location so you're not just pointing at 301 redirects.
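Just as an illustration (not part of the original advice), assuming an Apache server, the redirect for the bikes example above could be a single exact-match rule:

```apache
# Hypothetical .htaccess rule for the example paths above:
# match the old URL exactly and 301 it to the new location.
RedirectMatch 301 ^/bikes/mountain-bikes/$ /bikes/mountain-bikes/index.html
```

The redirect is there to catch old bookmarks and external links you don't control; your own menu link should still be updated to point straight at /bikes/mountain-bikes/index.html.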
-
Thanks for the table of links. I'll see to it.
I'll work on the code on the new version of the site, seems pointless to do it now.
I've installed the plugin. How do I change the status code of a page? I don't really understand how it can be anything but 200: if I'm viewing it, it's obviously there! I thought 301s pushed the user to the 200 version of the page and only existed temporarily in the browser? Obviously I'm wrong; perhaps you could explain it for me?
Cheers for the Screaming Frog tool, looks great.
-
Did you change them? The scan I just did doesn't show them... Maybe your host was getting funky or something, lol.
Get this and click the links on your site. You want to link to status code 200, not 301:
https://chrome.google.com/webstore/detail/server-status-code-inspec/bmngiaijlojlejaiijgedgejgcdnjnpk
I wouldn't de-index them; I haven't found a legitimate reason to de-index anything since 2005, but I'm a programmer and normally don't need to patch things. You could probably quickly fix them just by adding some content/images.
I'm going to private-message you another spreadsheet. This should show you the source and destination of your 404s and 301s.
BTW, the spider I'm using is Screaming Frog; it's the best I've found.
-
Just checked the 418s and they do seem to be already redirected with 301s, or are actually in place. What would be the protocol here?
-
Got your message, thank you. What tool did you complete the crawl with? I'm sort of disappointed this stuff didn't come up in my SEOmoz weekly scans.
A few questions:
- How do I know where the 301s are being sent from? So in this chain of events...
Link on a page on my site > routed via a 301 > landing on the desired page
... how do I find the first step in the process? The table you sent me seems to point out only the middle step.
- Yes, the 'about us' and 'contact us' pages are weak. I'm building a new version of the site as we speak and will take care of it then. In the meantime, if I noindex them, is that as good as getting rid of them?
I will now sort the 404s and 418s. Without wanting to sound like a broken record: thanks again! Do let me know if there is anything I can do in return once we've got to the bottom of this.
Will
-
Private-messaged you a Google Doc of the crawl. Looks like pages that no longer exist; they need 301s.
-
Wow, thanks for all this. It's late now in the UK so I'll check it out tomorrow.
Cheers
P.S. Where are my 418s coming from!?!
-
My crawl finished. You also have a bunch of 418 "I'm a teapot" status codes. IDK what this is, so I looked it up.
Per wikipedia:
418 I'm a teapot (RFC 2324): This code was defined in 1998 as one of the traditional IETF April Fools' jokes, in RFC 2324, Hyper Text Coffee Pot Control Protocol, and is not expected to be implemented by actual HTTP servers.
-
You'd think so, but 1) we can't fully trust everything Google says, and 2) it could have been something that the algorithm progressively finds and penalizes.
It's possible that this is not related to links or content.
Take care of your RCS and make it awesome (Real Company Shtuff):
About us (under-construction content, not good)
Contact us (weak and thin; include social)
FAQ
Terms and Conditions (404 error on your site!). I once broke all my footer links on a blog that was getting 5k visits/day and it slammed me down to 600/day nearly instantaneously. I've seen other sites with 404 errors survive, and even Cutts has downplayed the issue of 404 errors, but I believe any 404 can be indicative of a bad user experience. Scan your site for 404s and fix them all.
Also, many of your internal links appear to be pointing to 301 redirects. Update your links to point directly to the status 200 destination, not through a 301.
In just a quick overview, the above are my notes. This isn't a detailed audit, but you should scan your site for 404 errors and fix them, get your RCS in order, and conduct a full site review looking for anything that may be frowned upon by Google.
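As a rough sketch of what that kind of scan reports, here is a toy internal-link audit in Python. The site is simulated as in-memory dicts (all paths and statuses are made up for illustration); a real scan would use a crawler such as Screaming Frog.

```python
# Toy internal-link audit: flag links that hit a 404 or route through a 301,
# per the advice above. `statuses` maps path -> HTTP status code and
# `redirects` maps an old path -> its 301 destination.

def audit_links(links, statuses, redirects):
    """links: (source_page, target_path) pairs. Returns a list of
    (source, target, advice) tuples for every problem link found."""
    problems = []
    for source, target in links:
        code = statuses.get(target, 404)  # unknown paths are treated as 404
        if code == 404:
            problems.append((source, target, "404: fix or redirect"))
        elif code == 301:
            problems.append((source, target,
                             f"301: link directly to {redirects[target]}"))
    return problems

statuses = {"/": 200, "/about-us/": 200, "/old-terms/": 301}
redirects = {"/old-terms/": "/terms-and-conditions/"}
links = [("/", "/about-us/"),   # fine: direct 200
         ("/", "/old-terms/"),  # goes through a 301
         ("/", "/contact-us/")] # page missing: 404

for problem in audit_links(links, statuses, redirects):
    print(problem)
```

Running it flags the 301 hop and the missing contact page while leaving the direct 200 link alone, which is exactly the fix-list the spreadsheet-style audit gives you.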
-
Thanks devknob,
In answer to your questions;
-
It is across all organic traffic and all keywords, to my entire site.
-
The content on my site is fairly squeaky clean. I've been using the SEOmoz pro tool to keep it in check. I use Yoast SEO for WordPress to handle my canonicals and employ no dodgy JS hiding techniques. I did not remove content.
-
I haven't been buying links. I do have 20,000+ sitewide links coming from bikingbis.com and 12,000 sitewide links coming from citycyclingedinburgh.info/bbpress/. The ones from bikingbis have been removed, and I have requested removal of the others. Anchor text is varied and is mainly branded keywords.
My question is, though: if it's a bad backlink problem, wouldn't it coincide with a Panda or Penguin update?
Thanks again
Will
-
Check your analytics
- Is it a specific group of keywords?
- Is it organic traffic at all?
- Is it traffic to specific page or pages?
Check your website.
- Are your link canonicals set up CORRECTLY?
- Do you have content that is hidden via CSS/JavaScript and has no mechanism for unhiding?
- Have you changed a lot of links recently and not performed 301 redirects?
- Do you have good content, title tags and meta descriptions?
- Did you remove content?
Check your links
- Have you been buying links? Check your backlink profile using Open Site Explorer. Is there any unusual activity here?
- Is your anchor text varied?
Have you gotten a notice in Google Webmaster Tools?
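The canonical item in the checklist above can be spot-checked in code: pull the rel="canonical" tags out of a page's HTML and confirm each page declares exactly one, pointing at itself. A minimal sketch using only the Python standard library (the sample HTML is made up):

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collect the href of every <link rel="canonical"> tag in a page."""
    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonicals.append(attrs.get("href"))

sample = """<html><head>
<link rel="canonical" href="https://example.com/bikes/mountain-bikes/">
</head><body></body></html>"""

finder = CanonicalFinder()
finder.feed(sample)
print(finder.canonicals)  # a healthy page yields exactly one self-referencing URL
```

Zero canonicals, more than one, or a canonical pointing somewhere unexpected are all worth investigating.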