Website has been penalized?
-
Hey guys,
We have been link building and optimizing our website since the beginning of June 2010.
Around August-September 2010, our site appeared on the second page for the keywords we were targeting for around a week. It then dropped off the radar, although we could still see our website at #1 when searching for our company name, domain name, etc. So we figured we had been put into the 'Google sandbox', or something like it. That was fine; we dealt with it.
Then in December 2010, we appeared on the first page for our keywords and maintained first-page rankings, even moving up into the top 10 for just over a month. On January 13th 2011, we disappeared from Google for all of the keywords we were targeting; we don't even come up in the top pages for a company-name search. We do still come up when searching for our domain name, and we are being cached regularly.
Before we dropped out of the rankings in January, we did make some semi-major changes to the site: we changed the meta descriptions, reworked some content, and added a disclaimer to our pages with click-tracking parameters. When SEOmoz flagged the disclaimer pages as duplicate content, we added the disclaimer URL to our robots.txt so Google couldn't access it, made the disclaimer an onclick link instead of an href, added nofollow to the link, and told Google to ignore those parameters in Google Webmaster Central.
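For reference, here's a minimal sketch (using Python's standard `urllib.robotparser`, with a made-up `/disclaimer` path) of how you can verify that a robots.txt rule like that actually blocks Googlebot from the disclaimer URL:

```python
from urllib import robotparser

# Hypothetical robots.txt rules blocking the disclaimer URL from crawling.
# (The /disclaimer path is made up for illustration.)
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /disclaimer",
])

# Googlebot is now barred from the disclaimer page but not the rest of the site.
print(rp.can_fetch("Googlebot", "https://example.com/disclaimer"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/"))            # True
```

Keep in mind that a robots.txt disallow only stops crawling; pages that were already indexed can linger in the index for a while, so the duplicate-content cleanup itself still matters.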
We have fixed the duplicate content side of things now, we have continued link building, and we have been adding content regularly.
Do you think the duplicate content (across over 13,000 pages) could have triggered the loss in rankings? Or do you think it's something else? We also changed the index page's meta description and some subpages' titles and descriptions, and fixed the HTML errors flagged in Google Webmaster Central and SEOmoz.
The only other reason I can think of for a penalty is the link exchange script on our site, where people could add our link to their site and theirs to ours, though we did apply the nofollow attribute to those outbound links.
Any information that will help me get our rankings back would be greatly appreciated!
-
The links from our link exchange script accounted for about 2% of the website's total links; most of our link building has been through natural articles and websites posting about us. So even if Google discounted the link exchange links, surely that wouldn't have made us drop this much?
I agree with you. That's a very small number and unlikely to be the problem.
The duplicate content issue is fixed.
Excellent!
I have removed the link exchange script.
Good....
Even if I link using www.freemoviedb.com as the anchor text, will it still help me rank for my keywords?
Yes, you'll want to keep a number of links out there with your keyword as anchor text, since that still has a strong effect on ranking for a particular term. But you'll also want to present a link profile with a more natural "mix" of keywords and your domain name to avoid getting flagged as spammy.
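As a rough illustration (not a Moz tool, just a sketch with made-up data), you could tally the anchor-text distribution from an exported list of inbound links to see how skewed your profile is:

```python
from collections import Counter

# Hypothetical exported anchor texts for inbound links (made-up data).
anchors = [
    "watch movies online",
    "watch movies online",
    "watch free movies online",
    "watch free movies online",
    "watch free movies online",
    "www.freemoviedb.com",
]

counts = Counter(anchors)
total = len(anchors)

# Print each anchor text with its share of the total link profile.
for text, n in counts.most_common():
    print(f"{text}: {n / total:.0%}")
```

If one or two exact-match phrases dominate the output, that's the kind of skew worth diluting with branded and domain-name anchors.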
-
Sure looks like either a penalty or a massive discounting of links to me. You're not banned, but you're way back in the results given your link profile. I took a quick look at your robots.txt and it looks fine.
If Google is still seeing 13,000 pages as duplicate content, that could be the issue as well, as Google's internal "quality score" on your site is not going to be pretty. But given the number of inbound links you have, I'm much more inclined to think that Google has identified your link exchange script and discounted all links related to it.
There's also some discussion out there on whether you can get dinged for over-optimizing your anchor text. See this article. While that case study covers a pretty small number of sites, if Tim's findings are accurate, you're definitely at risk: both your internal and external links to your home page are virtually all "watch movies online" and "watch free movies online".
So here's what I would do:
#1 solve the duplicate content issue
#2 see what you can do to change the link exchange process to make it less recognizable to Google
#3 go vary your anchor text, adding new links and changing a few old ones like this profile link to use www.freemoviedb.com as the anchor text.
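For step #1, one crude way to confirm the duplicates are actually gone is to fingerprint each page's text from a crawl and group URLs that share a fingerprint (the URLs and page text below are entirely made up for illustration):

```python
import hashlib

# Hypothetical crawl output: URL -> extracted page text (all data made up).
pages = {
    "/movie/1?from=disclaimer": "Disclaimer: all movies are linked, not hosted.",
    "/movie/2?from=disclaimer": "Disclaimer: all movies are linked, not hosted.",
    "/movie/3": "A unique synopsis for this movie.",
}

# Group URLs by a fingerprint of their whitespace-normalized, lowercased text.
# Any group with more than one URL is duplicate content Google may still see.
groups = {}
for url, text in pages.items():
    digest = hashlib.sha256(" ".join(text.lower().split()).encode()).hexdigest()
    groups.setdefault(digest, []).append(url)

duplicates = [urls for urls in groups.values() if len(urls) > 1]
print(duplicates)  # the two disclaimer-parameter URLs group together
```

An empty `duplicates` list after a full recrawl is a decent sanity check that the cleanup stuck.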