Devalued links or negative effect?
-
Hi there,
I'm looking into an issue with a site that was hit after Penguin was introduced.
The site lost 70% of its traffic overnight.
The site in question has a large number of backlinks with over-optimized anchor text, which seems to be the most likely reason for the drop in rankings.
Unfortunately, there are also some links from blog networks, so my real question is: does Google simply devalue these links and discount them from its ranking algorithm, or do the links still count, but with a negative effect in the SERPs instead of a positive one?
My reason for asking is that I'm trying to determine whether it's worth saving this website or just starting fresh on a new domain.
That brings me to another question: if I have to start fresh on a new domain, is it possible to reuse the content from the old site (provided I remove the old URLs from Google via Webmaster Tools)?
Any help/advice/answers here would be greatly appreciated.
Thanks in advance.
-
If you have a manual penalty you will have a warning in your Webmaster Tools.
_Honestly speaking, I did not know that. Thanks for the update._
-
If you have a manual penalty you will have a warning in your Webmaster Tools.
Now, if a website doesn't have Webmaster Tools set up, you can set it up and then file a reconsideration request. If there was no manual warning to start with, you will get a notice from Google telling you so.
However, when you file a reconsideration request you are opening yourself up to a potential manual review from Google. So if you're not squeaky clean, you could end up attracting a manual penalty on top of the algorithmic issues you already have.
As far as diagnosing Penguin goes, here is some information on how to diagnose it, but it's not always a simple diagnosis.
-
_But how do you know that your website has been hit by Penguin? As far as I can tell, there is no way to know whether a website was hit by Penguin or by a manual penalty._
-
Hi Marie,
That's a great response and in line with our thinking here. The links are not within our control, and we've decided to start afresh.
The site's content ranked really well before Penguin, so I'm hoping it will recover quickly.
Thanks and best regards,
Jason
-
A reconsideration request will not help if there is no manual warning in WMT. Penguin is algorithmic.
-
OK, here is the thing.
Did you send a reconsideration request?
_If not, please send a reconsideration request after getting rid of some of the spammy links. Make sure you list all the URLs where the link references are still live in a separate Google Spreadsheet and submit it along with the reconsideration request (a rough sketch of how to automate that check is below). If you get a response that no manual action was taken, we can be sure of one thing: your website was hit by an algorithmic shuffle, and that makes things murkier.
Now, if your website was hit by a manual penalty, you will get a response saying that the penalty was partially removed or not removed at all._
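To automate that "still live" check, here is a minimal sketch in Python. It is illustration only: the file names, the single-column export layout, and the example.com domain are all assumptions, not anything from this thread.

```python
import csv
import urllib.request

YOUR_DOMAIN = "example.com"  # hypothetical: the penalized site

def link_is_live(referring_url):
    """Return True if the referring page still contains a link to YOUR_DOMAIN."""
    try:
        req = urllib.request.Request(
            referring_url, headers={"User-Agent": "Mozilla/5.0"}
        )
        with urllib.request.urlopen(req, timeout=10) as resp:
            html = resp.read().decode("utf-8", errors="replace")
        return YOUR_DOMAIN in html
    except Exception:
        # Pages that can't be fetched can't be verified; flag them as not live.
        return False

# Assumes backlinks.csv lists one referring URL per row, in the first column.
with open("backlinks.csv") as src, open("still_live.csv", "w", newline="") as dst:
    writer = csv.writer(dst)
    writer.writerow(["referring_url", "link_still_live"])
    for row in csv.reader(src):
        writer.writerow([row[0], link_is_live(row[0])])
```

The resulting still_live.csv can then be pasted into the Google Spreadsheet that accompanies the reconsideration request.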
_Now, as some reputed online marketers say, if you did not build those links yourself, you should not have to worry about them at all; but if you did build them, you need to get them taken down._
-
I really do think that sites with bad links are penalized, as opposed to just losing the link juice from those links. I am working on a site right now that had been ranking well for years. Then they hired an SEO to try to rank even better. The SEO built a bunch of anchor-texted links, and on April 24 (Penguin) their rankings plummeted.
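As a rough way to quantify that kind of over-optimization, you can tally the anchor-text distribution from a backlink export. This is a hedged sketch, not a documented Penguin threshold: the CSV layout, the anchor_text column name, and the 30% flag are assumptions made for illustration.

```python
import csv
from collections import Counter

anchors = Counter()
with open("backlinks.csv") as f:
    # Assumes the export has a column literally named "anchor_text".
    for row in csv.DictReader(f):
        anchors[row["anchor_text"].strip().lower()] += 1

total = sum(anchors.values())
for anchor, count in anchors.most_common(10):
    share = count / total
    flag = "  <-- suspiciously concentrated?" if share > 0.30 else ""
    print(f"{share:6.1%}  {anchor}{flag}")
```

A profile where one exact-match phrase dominates the top of the list is the pattern people associate with Penguin; natural profiles tend to be spread across brand names, bare URLs, and generic anchors like "click here".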
No one knows exactly what is necessary for recovery from Penguin. I think a site can recover if the backlink issue is an easy one to fix. For example, the SEOmoz article on the WPMU recovery showed that they were able to remove a pile of footer anchor-texted links and regain their rankings with the Penguin refresh on May 25. But for most sites, if you've got anchor-texted links in a bunch of places, recovery is pretty much impossible.
In doing unnatural-link penalty removal work, I have found that maybe 15% of webmasters respond to my requests to remove links. For some niches that number is higher. But in order to recover from Penguin, I'm guessing you'll need 85-95% of the bad links removed, and that is probably not going to happen.
I'd start fresh. Definitely don't redirect the old domain to the new.
You can noindex all of the content on the old domain and reuse it on the new domain. It may also help to go into Webmaster Tools for the old domain and ask Google to remove the old URLs from the index.
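If you go that route, it's worth verifying that the old pages actually serve noindex before reusing their content elsewhere. Here is a minimal sketch in Python, with placeholder URLs (old-domain.example is not a real site); it checks both the X-Robots-Tag header and the robots meta tag:

```python
import re
import urllib.request

def has_noindex(url):
    """Return True if the page is noindexed via HTTP header or meta tag."""
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        # Check the X-Robots-Tag HTTP header first.
        if "noindex" in resp.headers.get("X-Robots-Tag", "").lower():
            return True
        html = resp.read().decode("utf-8", errors="replace")
    # Crude match for <meta name="robots" content="...noindex...">;
    # assumes the name attribute comes before content.
    pattern = r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex'
    return re.search(pattern, html, re.IGNORECASE) is not None

# Placeholder URLs for illustration only.
for url in ["http://old-domain.example/page-1", "http://old-domain.example/page-2"]:
    print(("noindex OK " if has_noindex(url) else "STILL INDEXABLE ") + url)
```

This is a sanity check rather than a full HTML parser; run it across the old domain's URL list before reusing any of the content.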
Of course, you'll be starting fresh and have to earn good quality backlinks. Good luck!
-
Hi Alison, thanks for your help with this.
We started contacting webmasters initially; however, this proved to be a waste of time for the most part, as the majority of webmasters didn't respond to our requests.
A new site is looking like the way to go, thanks again.
-
Thanks Deb Dulal Dey; unfortunately there are too many links to make this worthwhile. On the other hand, the content on the site is very good.
Thanks again for your thoughts.
-
Thanks Baptiste, you've given me a lot to think about there.
-
Well, @Jason Brooks,
Sorry to say, but you need to get rid of these crappy links; otherwise your website will never be able to recover from the Penguin update. And in the meantime, you need to make your website awesome by publishing great content that will help you earn quality links the natural way.
-
Hi Jason,
To answer the first question: low-quality links can have a negative effect on rankings, particularly those associated with link networks or links that look manipulative. That being said, most sites have some sort of spammy sites linking to them for reasons beyond their control, and Google don't seem likely to penalise a small number of these - they will probably just ignore the links and discount any value that they would have passed.
Have you tried to clean up your link profile by contacting the webmasters of the blog networks and asking them to remove the links?
Starting completely from scratch seems a little extreme, but if you feel that the links are very extensive and hard to rectify, and if the current domain isn't ranking and doesn't have much authority, then it might be the easiest way to "start fresh". Bear in mind that a new domain is likely to be sandboxed and will take a substantial amount of time to gain trust and authority. It would be fine to reuse the content provided that the original content is removed and deindexed.
Good luck.
-
Hi Jason,
Based on the latest SlideShare from Ian Howells, http://slideshare.net/ianhowells/life-after-penguin, I think some of these links are devalued, while others are actively penalizing the site. You could remove them and confess to Google, start on a new domain, or maybe use a new URL for every page, including the homepage.
This is a tough question; Penguin recovery is still an unknown process and nothing is guaranteed.
As for content reuse, Howells put the same content on another page, without a 301, and it worked. Maybe you can 404 the old pages, or remove the content and republish it on a fresh domain.