Identifying why my site has a penalty
-
Hi,
My site has been hit with a Google penalty of some sort, but it doesn't coincide with a Penguin or Panda update. I have attached a graph of my visits that demonstrates this.
I have been working on my SEO since the latter part of last year and have been seeing good results; then, all of a sudden, my search referrals dropped by 70%.
Can anyone advise on what it could be?
Thanks!
Will
-
Great. Just audit it, fix problems, audit again, write more great content, and give it time. Even if you fix the problem (assuming it was an onsite problem), it may take some time for Google to show the love again.
-
Oh okay! That makes sense. Found a few issues with my PHP rules that automatically write links on a few of my content pages.
I've learned some valuable tips here; such a fantastic help. I'm going to get the new site up in a week or two and we'll see if things change.
I'll keep you updated!
-
OK, so: if a bit of content resides at /bikes/mountain-bikes/ and the menu link I use is /bikes/mountain-bikes/, I'll get a status code 200. There is no added delay and no PageRank lost; 200 == OK. The menu link points directly to the destination content.
Now let's say you've decided to change the location of that content to /bikes/mountain-bikes/index.html.
You set up 301 redirects from the old URL to the new one, THEN you need to update your links to reflect the new location so you're not just pointing at 301 redirects.
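To make the extra hop concrete, here's a rough Python sketch of that menu-link scenario. The redirect map is hypothetical, standing in for whatever your server's 301 rules actually do:

```python
# Hypothetical 301 map: old URL -> (new URL, status), standing in for
# the server's redirect rules after the content moves.
REDIRECTS = {
    "/bikes/mountain-bikes/": ("/bikes/mountain-bikes/index.html", 301),
}

def resolve(url):
    """Follow 301s until a final URL is reached; return (final_url, hops)."""
    hops = 0
    while url in REDIRECTS:
        url, _status = REDIRECTS[url]
        hops += 1
    return url, hops

# A menu link still pointing at the old path costs an extra hop:
print(resolve("/bikes/mountain-bikes/"))            # ('/bikes/mountain-bikes/index.html', 1)
# An updated link goes straight to the 200 page:
print(resolve("/bikes/mountain-bikes/index.html"))  # ('/bikes/mountain-bikes/index.html', 0)
```

The browser hides all of this from you, which is why a page "looks 200" when you visit it: the 301 happens before the 200 response you end up seeing.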
-
Thanks for the table of links. I'll see to it.
I'll work on the code on the new version of the site; it seems pointless to do it now.
I've installed the plugin. How do I change the status code of a page? I don't really understand how it can be anything but 200, as if I'm viewing it, it's obviously there! I thought 301s pushed the user to the 200 version of the page and only existed temporarily in the browser? Obviously I'm wrong; perhaps you could explain it for me?
Cheers for the Screaming Frog tool, looks great.
-
Did you change them? The scan I just did doesn't show them... Maybe your host was getting funky or something, lol.
Get this and click the links on your site. You want to link to status code 200, not 301:
https://chrome.google.com/webstore/detail/server-status-code-inspec/bmngiaijlojlejaiijgedgejgcdnjnpk
I wouldn't de-index them. I haven't found a legitimate reason to de-index anything since 2005, but I'm a programmer and normally don't need to patch things. You could probably quickly fix them just by adding some content/images.
I'm going to private-message you another spreadsheet. This should show you the source and destination of your 404s and 301s.
By the way, the spider I'm using is Screaming Frog; it's the best I've found.
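Tools like Screaming Frog and that status-code extension are doing a version of the same check under the hood: pull the links off a page, fetch each one's status, and flag anything that isn't a 200. A stdlib-only Python sketch with a made-up status lookup (a real version would issue an HTTP HEAD request per link):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href values from <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def audit(html, get_status):
    """Return (link, status) pairs for links that don't resolve to 200."""
    parser = LinkExtractor()
    parser.feed(html)
    results = [(link, get_status(link)) for link in parser.links]
    return [(link, status) for link, status in results if status != 200]

# Hypothetical statuses; in practice get_status would hit the server.
statuses = {"/bikes/": 200, "/old-bikes/": 301, "/gone/": 404}
page = '<a href="/bikes/">Bikes</a><a href="/old-bikes/">Old</a><a href="/gone/">Gone</a>'
print(audit(page, statuses.get))  # [('/old-bikes/', 301), ('/gone/', 404)]
```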
-
Just checked the 418s, and they do seem to be already redirected with 301s, or are actually in place. What would be the protocol here?
-
Got your message, thank you. What tool did you complete the crawl with? I'm sort of disappointed this stuff didn't come up in my SEOmoz weekly scans.
A few questions:
- How do I know where the 301s are being sent from? So in this chain of events...
Link on a page on my site > routed via a 301 > landing on the desired page
... how do I find the first step in the process? The table you sent me seems to point out only the middle step.
- Yes, the 'about us' and 'contact us' pages are weak. I'm building a new version of the site as we speak and will take care of it then. In the meantime, if I noindex them, is that as good as getting rid of them?
I will now sort the 404s and 418s. Without wanting to sound like a broken record: thanks again! Do let me know if there is anything I can do in return once we've got to the bottom of this.
Will
-
Private-messaged you a Google Doc of the crawl. Looks like pages that no longer exist; they need 301s.
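For anyone following along: the fix for pages that no longer exist is a permanent redirect from each old path to its replacement. On Apache that's typically a `Redirect 301` line in `.htaccess`; the same logic, sketched as a tiny Python WSGI app (the paths here are made up for illustration):

```python
# Hypothetical old-path -> new-path map for content that has moved.
MOVED = {
    "/old-contents/mountain-bikes.php": "/bikes/mountain-bikes/",
}

def app(environ, start_response):
    """Serve a 301 for moved paths, a plain 200 for everything else."""
    path = environ["PATH_INFO"]
    if path in MOVED:
        start_response("301 Moved Permanently", [("Location", MOVED[path])])
        return [b""]
    start_response("200 OK", [("Content-Type", "text/html")])
    return [b"<h1>page body</h1>"]
```

The key point either way: the old URL answers with a 301 plus a Location header, so both browsers and Googlebot are sent on to the page that now holds the content.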
-
Wow, thanks for all this. It's late now in the UK, so I'll check it out tomorrow.
Cheers
P.S. Where are my 418s coming from?!
-
My crawl finished. You also have a bunch of status 418 "I'm a teapot" status codes. I didn't know what this was, so I looked it up.
Per wikipedia:
418 I'm a teapot (RFC 2324): This code was defined in 1998 as one of the traditional IETF April Fools' jokes, in RFC 2324, Hyper Text Coffee Pot Control Protocol, and is not expected to be implemented by actual HTTP servers.
-
You'd think so, but 1) we can't fully trust everything Google says, and 2) it could have been something that the algorithm progressively finds and penalizes.
It's possible that this is not related to links or content.
Take care of your RCS and make it awesome (Real Company Shtuff):
- About us (under-construction content, not good)
- Contact us (weak and thin; include social links)
- FAQ
- Terms and Conditions (404 error on your site!). I once broke all my footer links on a blog that was getting 5k/day and it slammed me down to 600/day nearly instantaneously. I've seen other sites with 404 errors survive, and even Cutts has downplayed the issue of 404 errors, but I believe any 404 can be indicative of a bad user experience. Scan your site for 404s and fix them all.
Also, many of your internal links appear to be pointing to 301 redirects. Update your links to point directly to the status 200 destination, not through a 301.
In just a quick overview, the above are my notes. This isn't a detailed audit, but you should scan your site for 404 errors and fix them, get your RCS in order, and conduct a full site review looking for anything that may be frowned upon by Google.
-
Thanks devknob,
In answer to your questions:
- It is across all organic traffic and all keywords to my entire site.
- The content on my site is fairly squeaky clean. I've been using the SEOmoz Pro tool to keep it in check. I use Yoast SEO for WordPress to handle my canonicals and employ no dodgy JS hiding techniques. I did not remove content.
- I haven't been buying links. I do have 20,000+ sitewide links coming from bikingbis.com and 12,000 sitewide links coming from citycyclingedinburgh.info/bbpress/. The ones from bikingbis have been removed, and I've requested removal of the others. Anchor text is varied and is mainly branded keywords.
My question is, though: if it's a bad backlink problem, wouldn't it coincide with a Panda or Penguin update?
Thanks again
Will
-
Check your analytics:
- Is it a specific group of keywords?
- Is it organic traffic at all?
- Is it traffic to a specific page or pages?
Check your website:
- Are your link canonicals set up CORRECTLY?
- Do you have content that is hidden via CSS/JavaScript and has no mechanism for unhiding?
- Have you changed a lot of links recently without performing 301 redirects?
- Do you have good content, title tags, and meta descriptions?
- Did you remove content?
Check your links:
- Have you been buying links? Check your backlink profile using Open Site Explorer. Is there any unusual activity here?
- Is your anchor text varied?
Have you gotten a notice in Google Webmaster Tools?