Partial manual penalty on a URL
-
Hi Mozers,
I have a website that has received a partial manual penalty on a specific URL. That URL is of no use to the website now and will be taken off in three months' time, when the website is completely redesigned. Until then, I don't want to live with the partial manual penalty on this URL. I have a few things in mind to tackle this:
1. Take the URL off the website now (as the redesign will take three months).
2. Remove the internal links pointing to the URL in question.
3. File for reconsideration with Google, stating that we have taken the URL down, that we have not generated any backlinks ourselves, and that the existing backlinks are organic (no link-building activity has been done for this website or the URL).
Please let me know if this will work, or whether I will have to get the backlinks removed first, then file the disavow, then the reconsideration request.
Looking forward to your response.
-
I'm in agreement, Robert. Hitesh, it does feel like we're missing some part of the story. I have reviewed hundreds if not thousands of sites that were dealt unnatural-links penalties, and I have yet to see one that was given unfairly. I have seen the occasional example of a flagged link that truly was natural, but I've never seen a site get a penalty when all it had were natural links.
Again, if you'd like to share the URL, I'll take a look and give you my thoughts. But other than that, I think any answer you'll get here is going to be just speculation.
-
Hitesh,
I have looked at this and read your other comments like those to Marie. Unfortunately, a feeling remains that I am not seeing everything. From your reply to Marie you show a bit more of the Google message: "Some links may be outside of the webmaster’s control, so for this incident we are taking targeted action on the unnatural links instead of on the site’s ranking as a whole."
Then you name some of the sources of the links, and you also state that this page has some info regarding IPs in various countries and that people link to it because of that, "which is totally natural." Also, "There are no unnatural links to this url but for the fact most of them are coming from forums and spammy sites."
I really get the feeling you are trying to define or redefine what "natural" is instead of realizing the problem you have and that it may shortly involve much or all of your site. You have been warned by Google and the easiest thing to do is to read what Jane Copeland wrote on the 30th and follow that direction:
I'd do a combination of trying to remove the links, disavowing what I couldn't remove, removing the page with a 410 and filing for reconsideration explaining what I did and how I've tried to fix it. I'd also explain that the page was obsolete to begin with and was always destined for the scrap heap.
Failing to take this action very soon could really negatively impact your site. Defining what is or is not natural will not help you.
Good luck,
Robert
-
I think the best place to start would be to contact the site owner, and see if they would be willing to remove the link pointing your way. If not, then use the disavow tool in webmaster tools.
If you have a bad feeling about a link, there is probably a good reason for that feeling. Try using Blacklistalert.com to see if the domains your site is listed on are blacklisted with any DNS providers. You can also try MXtoolbox.com to see if their IP address has been compromised. If any of the sites in question fail the test, I would immediately remove the link by either of the methods mentioned at the start of this post.
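For the curious, the kind of blacklist lookup those tools perform can be sketched in a few lines. This is a minimal, illustrative sketch of a DNSBL-style check: the IP's octets are reversed and the blacklist zone is appended, and the resulting name resolving means the IP is listed. The zone name and IPs shown are examples, not a recommendation of a specific provider.

```python
import socket

def dnsbl_query(ip, zone):
    """Build the reversed-octet hostname a DNSBL lookup uses,
    e.g. 1.2.3.4 checked against zen.spamhaus.org -> 4.3.2.1.zen.spamhaus.org."""
    return ".".join(reversed(ip.split("."))) + "." + zone

def dnsbl_listed(ip, zone="zen.spamhaus.org"):
    """True if the blacklist zone resolves the query name, i.e. the IP is listed.
    This does a live DNS lookup, so treat it as illustrative only."""
    try:
        socket.gethostbyname(dnsbl_query(ip, zone))
        return True
    except socket.gaierror:
        return False

print(dnsbl_query("1.2.3.4", "zen.spamhaus.org"))  # 4.3.2.1.zen.spamhaus.org
```

The web tools mentioned above do essentially this across many zones at once, which is why they are the easier route for a one-off check.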
Best of luck, I really hope you get it figured out.
-
Interesting. That type of penalty, applied to just one URL, is uncommon. Can you tell that there are unnatural links there? You can PM me the URL if you'd like me to take a look. Can you tell why they were created?
I would probably still clean up the links to this page, which means making efforts to remove them and then disavowing what you can't get removed. While Google says they are no longer counting these links, we still don't know 100% whether they could affect you algorithmically, such as in the eyes of the Penguin update.
-
the screenshots
-
Hi Marie,
Thanks for the response!
Yes, the links were gained naturally. No link-building efforts were made in our case. It was a useful file that users linked to previously.
I have attached screenshots of the inbox message and the manual actions tab. Please have a look and let me know whether the link removal needs to be done for the whole site or just the URL.
In my opinion, it's just the URL, as the penalty applies only to that URL, and Google clearly mentions that in both messages:
"As a result, Google has applied a manual spam action to ixx.xxxxxxxxxxg.info/node/view/54. There may be other actions on your site or parts of your site."
and
"Some links may be outside of the webmaster’s control, so for this incident we are taking targeted action on the unnatural links instead of on the site’s ranking as a whole."
Looking forward to your response.
-
Hi Robert,
I agree and will do the clean-up, disavow, and reconsideration. But now the question is: do I have to clean the links pointing to the whole site, or just the URL? I have received a manual penalty just for the URL, which is a sub-domain on the site and not the whole site.
Have a look at the screenshot of the warnings received in both the inbox and the manual actions tab.
It clearly states it is just for the sub-domain URL.
Let me know your views.
-
Is it possible you could post a screenshot of what you are seeing in your manual actions viewer? Or, tell us what wording is in there? Does the message tell you that it is just one particular page on your site that is being affected? Is it an unnatural links warning?
"...have not generated any backlinks and the backlinks are organic. (no backlinking activity has been done on this website or the url)"
The vast majority of the time, when a site owner gets a penalty and says that there are no unnatural links to their site, they actually HAVE created links that are unnatural. A good example is a site that has done widespread guest posting for links. Many site owners have a hard time understanding that those links are actually unnatural. However, if you are certain that you have done no link building to this page (assuming it is a single page that has been targeted) and you have an unnatural links warning, then is it possible that someone else has been building links to it? An example would be if you wrote a story that put a particular company in a favorable light, and that company then built links to your site in order to boost their story higher in the SERPs.
If you'd like to PM me the url and the details of your penalty I'd be happy to take a look.
-
I would say that it depends on why the penalty happened in the first place, but if it's a manual penalty then removing the resource probably won't get rid of the penalty overnight. I'd do a combination of trying to remove the links, disavowing what I couldn't remove, removing the page with a 410 and filing for reconsideration explaining what I did and how I've tried to fix it. I'd also explain that the page was obsolete to begin with and was always destined for the scrap heap.
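To make the "disavowing what I couldn't remove" step concrete, here is a minimal sketch of assembling a disavow.txt in the format Google's disavow tool accepts: `#` comment lines, `domain:` entries for whole domains, and bare URLs for individual pages. The domains and note shown are placeholders, not ones from this thread.

```python
def build_disavow(domains, urls=(), note=""):
    """Emit disavow-file text: '#' comments, 'domain:' lines for whole
    domains, and bare URLs for individual pages."""
    lines = []
    if note:
        lines.append("# " + note)  # comments document your removal attempts
    lines.extend("domain:" + d for d in sorted(set(domains)))
    lines.extend(sorted(set(urls)))  # single pages can be listed as full URLs
    return "\n".join(lines) + "\n"

print(build_disavow(
    ["spammy-forum.example", "linkfarm.example"],
    note="Requested removal twice by email; no response",
))
```

Recording the removal attempts in the comments also gives you material to reuse in the reconsideration request itself.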
-
If you first remove the URL, even with the 410, I do not believe you will get any action on the penalty in terms of a reconsideration. Remember, with a reconsideration Google wants to see penance. Removing the issue is not penance; it is easy in their eyes.
Yes, these actions remove the issue, but I am not sure they will have an effect as far as reconsideration goes. I am certainly open to being wrong.
Best -
1. Make sure you have no internal links pointing to that page.
2. Put a rule in place returning a 410 (GONE) before filing the reconsideration request.
3. Do not redirect the page with a 301 or any other method. Remember, you want the page to disappear, not redirect.
Also, what is the message you received stating that only that one URL was penalized? It is very strange to hear that only one was affected. Run a link check to see what other sites or listings are pointing to that URL, and if possible, log in to the citation or platform and change the link to one you know is not affected.
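Before filing the reconsideration request, it is worth verifying that the page really does answer 410 and is not quietly redirecting. A minimal sketch (the example URL is a placeholder, not the poster's actual page): fetch the raw status without following redirects, then interpret it against the steps above.

```python
import urllib.request
import urllib.error

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from following redirects so the raw status is visible."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # makes the 3xx surface as an HTTPError instead

def raw_status(url):
    """Return the HTTP status code of the first response for url."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        return opener.open(url).getcode()
    except urllib.error.HTTPError as e:
        return e.code  # 3xx/4xx/5xx (including 410) arrive here

def verdict(status):
    """Interpret the status: 410 is what you want; any 3xx is not."""
    if status == 410:
        return "gone"         # ready for the reconsideration request
    if status in (301, 302, 303, 307, 308):
        return "redirecting"  # remove the redirect first
    return "unexpected"

# Example use (live request, placeholder URL):
# print(verdict(raw_status("https://example.com/node/view/54")))
```

Running this after deploying the 410 rule catches the common mistake of a CMS silently 301ing deleted pages to the homepage.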
-
Hitesh,
Just so I am clear, you got a partial manual penalty on a single URL? That seems odd to me; most who come to us have partial or full penalties affecting their entire sites. My concern with not making the effort to clean it up, file a disavow.txt covering any remaining links, and request reconsideration is that it might leave you open to penalties on further URLs and even affect the new site. This is assuming you are going to 301 the old URLs to the new site. Even without the "bad" URL, there is the potential, IMO, for carryover from the site having been assessed a penalty and never having addressed it.
So, if you have the time, clean it up and then file for reconsideration.
Best
-
Unfortunately, taking the URL out and removing the internal links will not get the penalty removed. You need to work on getting the external links removed, as that's where the penalty has come from. You can disavow them (I also recommend dropping the site owners an email) if you don't want the page. There are some great guides here on Moz if you do a quick search.