Do Panda/Penguin algorithm updates hit websites or just webpages?
-
If I have a website that has been affected by the Panda/Penguin updates, do bad links affect the entire site or just the page(s) the bad links point to?
If it is the latter, and Penguin/Panda actually affect webpages, not websites (as is the common conception), then wouldn't a common-sense approach simply be to create a new URL, target this new URL, shift the meta tags, and restart link-building efforts (this time using the right quality strategies), instead of the tedious disavow route that so many go down?
-
Is the penalty visible in your Google Webmaster Tools? What exactly does it say?
Most Penguin penalties I've seen are partial matches, affecting only some incoming links. Removing pages won't help, as you don't know which ones are affected.
Doing a link cleanup by going link after link asking for removal, and then requesting reconsideration, may remove the penalty (in conjunction with the disavow file). But there have been several polls online, and over 80% of the people who did this and got their penalty removed didn't see their rankings come back. That makes sense: those manipulative links were probably what caused your high rankings, and with them gone you need to get some REAL, EARNED links to regain positions.
Truth be told, no one can tell what will happen, and you can only try.
The ultimate question always becomes: is it really worth it? Or do I just start fresh (new website, unrelated to the old one)?
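For reference, the disavow file mentioned above is just a plain-text list uploaded through Google's disavow tool: one URL or `domain:` directive per line, with `#` lines as comments. A minimal sketch of assembling one in Python (the domains and URLs here are hypothetical examples, not real spam sources):

```python
# Build a Google disavow file: one URL or "domain:" directive per line,
# lines starting with "#" are comments. All domains/URLs below are hypothetical.
bad_domains = ["spammy-directory.example", "paid-links.example"]
bad_urls = ["http://blog.example/comment-spam-page.html"]

lines = ["# Disavow file - manipulative links we could not get removed"]
lines += [f"domain:{d}" for d in bad_domains]  # disavow entire domains
lines += bad_urls                              # disavow individual URLs
disavow_txt = "\n".join(lines) + "\n"

with open("disavow.txt", "w") as f:
    f.write(disavow_txt)
print(disavow_txt)
```

The `domain:` form disavows every link from that host, which is usually safer than listing individual URLs you may miss.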
-
Thanks Federico,
If a site has been hit with a "Partial Matches" penalty, can simply shifting the targeted webpage solve the problem (effectively allowing that targeted page to start again, without having to worry about disavow)?
-
Panda does not affect links, penguin does.
Panda is a "quality content" related update.
Penguin penalizes both pages and/or entire domains, even to the point of completely removing them from the index.
Under "Manual Actions" in your Google Webmaster Tools you can see the type of penalty the site has (if any) and what it is impacting.
"Sitewide" means that the entire site got the hit, while "Partial Matches" means some parts or some links.
Anyhow, if you have a manual penalty, you won't recover by just creating new pages. Although penalties expire, you cannot know which pages were hit.
You have to consider whether the website is worth saving or whether you should just start fresh. If your website is 3 years old or more, generating income, with a client base, then you should probably start working on saving it. If your website is new, with no real client base or value, you're better off starting fresh.
Don't make the same mistakes again!
Related Questions
-
COVID-19 algorithm update
So I've felt a rather large cliff dive on my major keywords, but minor keywords are doing well. I've been told about a COVID-19 update. Does anyone in here have any pointers or knowledge about this? I've dropped around 9 places and, more importantly, off the first page for my important keywords.
Algorithm Updates | | Libra_Photographic0
-
HREFLANG for multiple country/language combinations
We have a site set up with English, German, French, Spanish and Italian. We offer these languages for every European country (over 30). Thus, there are 150+ different URL combinations, as we use the /country/language/ subdirectory path. Should I list out every combination in hreflang? Or should I simply choose the most applicable combinations (/de/de and /fr/fr, etc.)? If we go the latter path, should I block Googlebot from crawling the atypical combinations? Best, Sam
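To make the scale of "every combination" concrete: hreflang uses language-REGION codes, so each page would need one link element per country/language URL, plus an x-default. A rough sketch of generating the full set for the /country/language/ path scheme described above (the country and language lists and the domain are illustrative placeholders, not the real 30+ countries):

```python
from itertools import product

countries = ["de", "fr", "es", "it", "gb"]  # truncated example list, not all 30+
languages = ["en", "de", "fr", "es", "it"]
base = "https://www.example.com"            # hypothetical domain

tags = []
for country, lang in product(countries, languages):
    # hreflang codes are language-REGION, e.g. "en-DE" = English page aimed at Germany
    code = f"{lang}-{country.upper()}"
    url = f"{base}/{country}/{lang}/"
    tags.append(f'<link rel="alternate" hreflang="{code}" href="{url}" />')

# Fallback for visitors outside the listed country/language pairs
tags.append(f'<link rel="alternate" hreflang="x-default" href="{base}/" />')
print(len(tags))
```

With the real 30+ countries and 5 languages that loop produces 150+ tags per page, which is why many sites generate the set programmatically (or via sitemaps) rather than maintaining it by hand.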
Algorithm Updates | | JohnnyECCO0
-
What does it mean to build a 'good' website.
Hi guys. I've heard a lot of SEO professionals, Google (and Rand in a couple of Whiteboard Fridays) say it's really important to build a 'good' website if you want to rank well. What does this mean in more practical terms? (Context... I've found some sites rank much better than they 'should' do based on the competition. However, when I built my own site (well-optimised on-page, based on thorough keyword research) it was nowhere to be found (not even top 50, after I'd 'matched' the backlink profile of others on page 1). I can only put this down to 'good quality website' signals lacking in the latter example. I'm not a web developer, so the website was a pretty basic WordPress site.)
Algorithm Updates | | isaac6630
-
Website dropping from Page 1 pos 5 to no ranking and then back again?
Hi all, We have a very odd occurrence with a client of ours. It should be noted that they had a penalty removed about 2 months ago after much work from our company. Recently they started appearing back on page 1 of Google for a semi-competitive keyword term. We were very happy with this and so was the client. Then, with our work, the ranking improved to position 5, which was excellent. Unfortunately they have been dropping out of the rankings completely for this semi-competitive keyword for a few days and then reappearing in the same position. The client is checking daily and has noticed. I thought this was just a 'hangover' from the Google penalty and perhaps a one-off occurrence, but it has happened about 3 or 4 times now and seems to be happening every couple of weeks. Can anyone shed some light on this behaviour? I have checked Webmaster Tools and everything is fine. Thanks Jon
Algorithm Updates | | Jon_bangonline0
-
Penguin 2.0 Confusion
I am getting seriously confused over our rankings and am hoping someone can give me a bit of clarity. When Penguin 2.0 hit last week we saw a drop of between 2 – 6 places on seven out of a set of around a dozen key phrases I monitor. We had been in the number 1/2 position for the majority of them over the last year – 18 months, and this drop has reduced our traffic by around 30 – 35%. I have looked at our backlink profile, http://www.opensiteexplorer.org/links?site=www.wombatwebdesign.com%2F, and I don't think it is that bad. Being a web design company, the vast majority (80%) of the links are from the footer of client sites. We have a small number of links (5%) from comments, a similar number of dead links, and the rest are made up of the usual profile, directory and article links. I was wondering if the way we link from client sites could be an issue: we link twice, once with the anchor text (Web Design Cumbria), and the second link is our company branding (Wombat Web Design, with the title tag – Web Design Cumbria). Could this be causing a problem? If anyone could point me in the right direction I would be most grateful. Thanks Fraser
Algorithm Updates | | fraserhannah1
-
Parallel mobile website
After researching and comparing desktop keywords vs mobile keywords, I came to the conclusion that they are pretty much the same, with minor variations. The new keywords for the mobile website do not have great potential for bringing in much traffic. If I don't want to go after those new mobile keywords, do I need to build separate links to our parallel mobile website (URL) to rank better, or would the link building for my desktop website be enough to rank the parallel mobile website as well?
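Separate from link building, Google's documented setup for a separate mobile URL is a pair of annotations: a rel="alternate" media tag on the desktop page pointing to the mobile URL, and a rel="canonical" on the mobile page pointing back, which consolidates ranking signals onto the desktop URL. A sketch of what those two tags look like (the URLs are hypothetical):

```python
# Annotations for a "parallel" mobile site on a separate URL.
# Both URLs below are hypothetical placeholders.
desktop_url = "https://www.example.com/page/"
mobile_url = "https://m.example.com/page/"

# Goes in the <head> of the desktop page, pointing to its mobile counterpart:
desktop_tag = (f'<link rel="alternate" '
               f'media="only screen and (max-width: 640px)" '
               f'href="{mobile_url}" />')

# Goes in the <head> of the mobile page, consolidating signals to the desktop URL:
mobile_tag = f'<link rel="canonical" href="{desktop_url}" />'

print(desktop_tag)
print(mobile_tag)
```

With this bidirectional annotation in place, links earned by the desktop page should benefit the mobile URL as well, since Google treats the pair as one document.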
Algorithm Updates | | echo10
-
Google has indexed a lot of test pages/junk from the development days.
With hindsight I understand that this could have been avoided if robots.txt had been configured properly. My website is www.clearvisas.com, and it is indexed both with the www subdomain and without. When I run site:clearvisas.com in Google I get 1,330 results - all junk from the development days. But when I run site:www.clearvisas.com in Google I get 66 - these results are all post-development and more in line with what I wanted to be indexed. Will 1,330 junk pages hurt my SEO? Is it possible to de-index them, and should I? If the answer is yes to any of the questions, how should I proceed? Kind regards, Fuad
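One caveat on the robots.txt route: Disallow only stops future crawling, it does not remove already-indexed pages, so the existing junk usually needs a noindex tag or the URL removal tool as well. For the crawling side, Python's standard-library robots parser is a quick way to sanity-check rules before deploying them. A sketch, with hypothetical dev-directory paths standing in for wherever the junk pages actually live:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules blocking development/test directories.
# The actual junk paths on the site would need to be identified first.
robots_txt = """\
User-agent: *
Disallow: /dev/
Disallow: /test-pages/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Check that a dev URL is blocked while a real page stays crawlable
dev_allowed = rp.can_fetch("Googlebot", "https://www.clearvisas.com/dev/old-draft.html")
public_allowed = rp.can_fetch("Googlebot", "https://www.clearvisas.com/apply/")
print(dev_allowed, public_allowed)  # False True
```

Remember to serve the same rules on both the www and non-www hostnames, since each is being indexed separately here.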
Algorithm Updates | | Fuad_YK0
-
Shouldn’t Google always rank a website for its own unique, exact 10+ word content, such as a whole sentence?
Hello fellow SEOs, I'm working with a new client who owns a property-related website in the UK.
Recently (May onwards) they have experienced significant drops in nearly all non-domain/brand related rankings, from page 1 to page 5+ or worse. Please see the attached Webmaster Tools traffic graph.
The 13th of June seemed to have the biggest drop (UK Panda update???) When we copy and paste individual 20+ word sentences from within top-level content, Google does bring up exact results - the content is indexed - but the client's site nearly always appears at the bottom of SERPs. Even very new or small, 3-4 page domains that have clearly copied all of their content are outranking the original content on the client's site. As I'm sure you know, this is very annoying for the client! And this even happens when Google's cache date (that appears next to the results) for the client's content is clearly older than the other results! The only major activity was the client utilising Google optimiser, which redirects traffic to various test pages. These tests finished in June.
Details about the client's website:
- The domain has been around for 4+ years
- The website doesn't have a huge amount of content, around 40 pages. I would consider 50% original, 20% thin and 30% duplicate (working on fixing this)
- There haven't been any significant sitewide or page changes
- Webmaster Tools shows nothing abnormal or any error messages (some duplicate meta/title tags that are being fixed)
- All the pages of the site are indexed by Google
- Domain/page authority is above average for the niche (around 45 for the domain in OSE)
- There are no ads of any kind on the site
- There are no special scripts or anything fancy that could cause problems
I can't seem to figure it out. I know the site can be improved, but such a severe drop, where even very weak domains are outranking it, suggests a penalty of some sort? Can anyone help me out here?
[Attached: hxuSn.jpg]
Algorithm Updates | | Qasim_IMG