My Website Just Got Penalized
-
I had a website that recently got penalized. The PageRank dropped to zero on the homepage, and the site moved to page 200 on Google. I checked Manual Actions for the site in Webmaster Tools and it says no webspam was found. I am curious to find out why my website would drop. I had a network of 5 blogs that I was linking to the site; they also lost PageRank, and it shows N/A now. I am thinking that's where the trouble started, because I did not use nofollow.
Question 1
My question is: if I remove all the links to the main site, or make them nofollow, will the penalty lift? I am thinking the penalty is an automated one and not a manual one. Does anyone have experience with automated penalties? Did they lift after you fixed the issues? Did you regain most of your original rankings?
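For reference, making a link nofollow just means adding a `rel` attribute to the anchor tag; a minimal sketch with a placeholder URL:

```html
<!-- A normal followed link passes PageRank: -->
<a href="https://example.com/">anchor text</a>

<!-- Adding rel="nofollow" asks search engines not to pass PageRank through it: -->
<a href="https://example.com/" rel="nofollow">anchor text</a>
```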
Question 2
What happens to all my blogs? I spent a lot of money having posts written for them. Can any of the content be salvaged? I have over 1,000 pages written across 5 different blogs.
I can send you a list of the URLs so you can see what I am talking about.
-
There was an update today and it was pretty ugly. If your drop was in the last 24-48 hours, that might have been the culprit.
Did you use C-class IPs on the blogs? Or were they hosted on different hosts? Did you interlink the blogs?
There are some silly rankings going on - sites with 1 backlink ranking for a keyword with 25k monthly searches.
-
Hi,
-
Like Ash said, it's probably an algorithmic ranking penalty, especially if you do not have anything in the Manual Actions section of Webmaster Tools. Just fixing the problems will solve the ranking issues, but it will take a while. A lot of people have experience with these kinds of issues, and they regained their rankings. Don't worry, it can be done. Just read up on Moz or other SEO resources for concrete examples if you need some facts.
-
If it's a penalty caused only by the linking practices, then you should not worry about the content. But be careful, because we see a lot of penalties caused by the actual content. If you paid for the content, chances are even higher. If I were you, I would check:
-
if there are other linking issues besides the inter-blog links
-
if the content is really unique and of high quality.
And then start fixing the problems.
-
-
URLs would help.
Your articles are a major suspect, and the links are another - Panda and Penguin respectively. Yes, an algorithmic ranking drop rather than a manual penalty.
Related Questions
-
Can I create a new Website to promote just one set of services from a list of several services?
Hi, I have a 10-year-old website where I promote all my services - around 30 of them under 5 main categories. For example, my current website promotes these services:
A service - with a1, a2, a3 sub-services
B service - with b1, b2, b3 sub-services
C service - with c1, c2, c3 sub-services
D service - with d1, d2, d3 sub-services
E service - with e1, e2, e3 sub-services
Now I want to promote just "A service" with its sub-services on a separate website, as that service is in demand now, and those keywords should be my main keywords. I want to connect my old website with the new one to increase trust among users. Can I do this? I hope I am not violating any Google rules by doing this. Please help with suggestions. Thanks. Jessi.
White Hat / Black Hat SEO | Sudsat
Help finding website content scraping
Hi, I need a tool to help me review sites that are plagiarising / directly copying content from my site. The tools I'm aware of, such as Copyscape, appear to work with individual URLs and not a root domain. That's great if you have a particular post or page you want to check, but in this case, some sites are scraping thousands of product pages, so I need to submit the root domain rather than an individual URL. In some cases, other sites are being listed in SERPs above, or even instead of, our site for product search terms. So far I have stumbled across this rather than proactively researched offending sites. I want to enter my root domain and have the tool review all my internal site pages, then report the other domains where an individual page has a certain amount of duplicated copy - working in the same way Moz crawls the site for internal duplicate pages. I need a list of duplicate content by domain and URL, externally, so that I can contact the offending sites to request they remove the content, and send it to Google as evidence if they don't. Any help would be gratefully appreciated. Terry
White Hat / Black Hat SEO | MFCommunications
Hacked Websites (Doorways) Ranking First Page of Google
Hello Moz community! I could really use your suggestions on some recent changes I've noticed in the Google SERPs for terms I'm currently working on. One of the projects I am working on is for an online pharmacy, and I've noticed the SERPs are now being taken up by hacked websites that act as doorways, 301-redirecting to an online pharmacy the hacker wants the traffic to go to. They seem to be hacked WordPress sites whose content is unrelated to online pharmacies. We've submitted these issues as spam to Google, and within Chrome as well, but haven't heard back. We see this issue when searching terms like "Canadian Pharmacy Viagra" and other similar terms. Any recommendations on how we can fix this issue? Thanks for your time; attached is a screenshot of the results we are seeing for one of our searches.
White Hat / Black Hat SEO | monarkg
My site just dropped significantly!
Just noticed that my website onlinecasting.co.za dropped 50+ places on basically all the keywords I'm following.
I can also see that today there have been almost no new sign-ups, so something happened.
I didn't change anything. One issue which might have something to do with it is that I own several "copies" of the same site, just in different countries (domains). I host the websites myself, and they are all on the same server. The text and design are the same in some of the countries, except that "jobs" are unique to each country. I also have:
onlinecasting.ae (English)
onlinecasting.sg (English)
onlinecasting.mx
and more coming. So, could that be the reason - that Google somehow now decided it won't accept the "almost same site"?
White Hat / Black Hat SEO | KasperGJ
A Sitemap Web page & A Sitemap in htaccess - will a website be penalised for having both?
Hi, I have a sitemap URL already generated by Yoast SEO and referenced in the htaccess file, and I have submitted it to the search engines. I'd also already created a sitemap web page on the website as a helpful aid for users to see a list of all page URLs. Is this a problem, and could this scenario create duplicate issues or any problems with search engines? Thanks.
White Hat / Black Hat SEO | SEOguy1
Would it be a good idea to duplicate a website?
Hello, here is the situation: let's say we have a website, www.company1.com, which is 1 of 3 main online stores catering to a specific market. In an attempt to capture a larger market share, we are considering opening a second website, say www.company2.com. Both websites would have different URLs but offer the same products for sale to the same clientele. With this second website, the theory is that instead of operating 1 of 3 stores, we now operate 2 of 4. We see 2 ways of doing this: we launch www.company2.com as a copy of www.company1.com, or we launch www.company2.com as a completely different website. The problem I see with either of these approaches is duplicate content. I think the duplicate content issue would be even more of a problem with the first approach, where the entire site is mostly a duplicate. With the second approach, I think the duplicate content issue can be worked around by having completely different product pages and an overall different website structure. Do you think either of these approaches could result in penalties from the search engines? Furthermore, we all know that higher rankings and increased traffic can be achieved through high-quality unique content, social media presence, ongoing link building, and so on. Now, assuming we have a fixed amount of manpower for these tasks, do you think we have better odds of increasing our overall traffic by splitting the manpower across 2 websites, or putting it all behind a single one? Thanks for your help!
White Hat / Black Hat SEO | yacpro13
Switching an existing website to a WordPress site and afraid of losing top spot
I am going to be switching my current site from a standard HTML site to a WordPress site. I'm kind of paranoid about losing my top spot for my key terms. If I keep the content the same, keep the same image alt tags, the same anchor text, etc., nothing should change, right? Grateful for any advice. Thanks, Will
White Hat / Black Hat SEO | willie79
Opinions Wanted: Links Can Get Your Site Penalized?
I'm sure by now a lot of you have had a chance to read Let's Kill the "Bad Inbound Links Can Get Your Site Penalized" Myth over at Search Engine Journal. When I initially read this article, I was happy. It confirmed something that I believed and supported a stance that SEOmoz has taken time and time again. The idea that bad links can only hurt via loss of link juice when they get devalued, but not through any sort of penalization, is found in many articles across SEOmoz. Then I perused the comments section, and I was shocked and unsettled to see some industry names I recognized taking the opposite side of the issue. There seem to be a few different opinions:
- The SEOmoz opinion that bad links can't hurt except when they get devalued.
- The idea that you wouldn't be penalized algorithmically, but a manual penalty is within the realm of possibility.
- The idea that both manual and algorithmic penalties are a factor.
Now, I know that SEOmoz preaches a link-building strategy that targets high-quality backlinks, so if you completely subscribe to the Moz method, you've got nothing to worry about. I don't want to hear those answers here - they're right, but they're missing the point. It would still be prudent to have a correct stance on this issue, and I'm wondering if we have that. What do you guys think? Does anybody have an opinion one way or the other? Does anyone have evidence of it being one way or another? Can we set up some kind of test - rank a keyword for an arbitrary term, then go to town blasting low-quality links at it as a proof of concept? I'm curious to hear your responses.
White Hat / Black Hat SEO | AnthonyMangia