Interesting case of IP-wide Google Penalty, what is the most likely cause?
-
Dear SEOmoz Community,
Our portfolio of around 15 internationalized websites has received a significant, seemingly IP-wide, Google penalty starting in November 2010, and we have yet to recover from it. We have undertaken many measures to lift the penalty, including reconsideration requests without luck, and we are now hoping the SEOmoz community can give us some further tips.
We are very interested in the community's help and judgement on what else we can try to lift the penalty.
As quick background information:
- The sites in question offer sports results data and are translated into several languages.
- Each market (i.e. each language) has its own TLD domain built around the central keyword, e.g. <keyword_spanish>.es, <keyword_german>.de, <keyword_us>.com.
- The content is highly targeted to each market, which means there are no duplicate content pages across the domains: all copy is translated, content is reprioritized, etc. However, the core results content in the body of the pages obviously needs to stay about 80% the same.
- An SEO agency of ours used semi-automated link building tools in mid-2010 to acquire link partnerships.
- There are some promotional one-way links to sports-betting and casino sites positioned on the pages.
- The external linking structure of the pages is very keyword- and main-page-focused, i.e. 90% of the external links point to the front page using one particular keyword.
- All sites have strong domain authority and have been running under the same owner for over 5 years.
As mentioned, we have experienced dramatic ranking losses across all our properties starting in November 2010. The applied penalties are indisputable given that rankings for the main keywords in local Google search engines dropped from position 3 to position 350 after the sites had been ranked in the top 10 for over 5 years. A screenshot of the ranking history for one particular domain is attached. The same behavior can be observed across domains.
Our questions are:
- Is there something like an IP-specific Google penalty that can apply to web properties across an IP, or can we assume Google just picked all the sites registered in Google Webmaster Tools?
- What is the most likely cause of our penalty given the background information? Given that the drops started already in November 2010, we doubt that the Panda updates had any correlation to this issue.
- What are the best ways to resolve our issues at this point? We have significant historical data available, such as tracking records. Our actions so far have been reducing external links, on-page links, and C-class internal links.
- Are there any other factors/metrics we should look at to help troubleshoot the penalties?
- After all this time without resolution, should we move on to new domains and forward all content via 301s to the new pages? Are there things we need to try first?
Any help is greatly appreciated.
SEOMoz rocks. /T
-
-
Thanks tomypro.
-
Generally speaking, you will achieve the best results by consolidating your sites under one domain with a dedicated folder for each country as you described. I would recommend delaying the move until you are sure your sites are not under any penalty.
The advantage you will receive with a single root domain is the consolidation of your Domain Authority. It sounds like your sites were doing well before the penalty. The higher DA can help even further.
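If you do make the move later, the migration is usually implemented with page-level 301 redirects from each country domain to the matching country folder. As a rough illustration, a small script along these lines could spot-check that a sample of old URLs returns a 301 to the expected new location (the domains, paths, and mapping are hypothetical placeholders):

```python
# Rough sketch: spot-check that old country-domain URLs 301-redirect to the
# matching country folder on a consolidated domain. The domains, paths, and
# mapping below are hypothetical placeholders, not the actual sites.
import requests

REDIRECT_MAP = {
    "http://keyword1.es/resultados/liga": "http://example.com/es/resultados/liga",
    "http://keyword2.de/ergebnisse/bundesliga": "http://example.com/de/ergebnisse/bundesliga",
}

def check_redirect(old_url, expected_url):
    # Don't follow the redirect, so the status code and Location header can be inspected.
    resp = requests.get(old_url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    ok = resp.status_code == 301 and location == expected_url
    print(f"{old_url} -> {resp.status_code} {location or '(no Location)'} [{'OK' if ok else 'CHECK'}]")
    return ok

if __name__ == "__main__":
    for old, new in REDIRECT_MAP.items():
        check_redirect(old, new)
```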
Thanks again for your thoughts. This is actually a topic I am very involved with. I work as a Technical Director in a large digital agency, and our SEO team just recommended that a large Fortune 100 customer break their web property out of market folders on a single root domain (.com/de, .com/es, etc.) into country-code TLDs: .com, .es, .de. According to our SEO team, DA is to some extent shared if the same root domain is used; however, local ccTLDs will obviously give you better rankings in the local Google engines.
My thought is that the right approach probably depends on the size of your brand. If it's easy for you to quickly build up DA, local ccTLDs are preferable. If you are a smaller player, you might do better consolidating everything under one umbrella to share DA.
I am actually running an experiment for one of my projects where I am doing the ccTLD breakout for one domain to compare organic search traffic. The benefit of local ccTLDs is that eventually you can tie them to market-local servers, which again boosts SEO in the local markets. This isn't possible with directories.
Do you share my thoughts, Ryan? As I said, this is a very hot topic for me at the moment.
P.S. I will definitely reach out for recommendations - thank you.
-
Atul,
What I mean by it is that all the domains hosted under the same server IP (i.e. on our dedicated root server) have experienced significant ranking drops that seem tied to a global penalty.
However, it is questionable whether Google would consider this a valid approach, given the probability that other domains not associated with the to-be-penalized URL could be hosted under the same IP.
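For what it's worth, it is easy to confirm which of the properties actually share a server IP. A rough sketch along these lines (the domain names are placeholders, not our actual sites):

```python
# Rough sketch: group a list of domains by the server IP they resolve to,
# to confirm which properties actually share an IP. Domain names are placeholders.
import socket
from collections import defaultdict

DOMAINS = ["keyword1.es", "keyword2.de", "keyword3.com"]

def group_by_ip(domains):
    groups = defaultdict(list)
    for domain in domains:
        try:
            ip = socket.gethostbyname(domain)
        except socket.gaierror:
            ip = "unresolved"
        groups[ip].append(domain)
    return groups

for ip, hosts in group_by_ip(DOMAINS).items():
    print(ip, "->", ", ".join(hosts))
```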
-
Ryan, I would like to know what is meant by an IP-specific penalty?
-
Due to the penalties we have been considering moving everything under one umbrella and managing the local sites in directories, e.g. .com/es/keyword1, .com/de/keyword2 - however, until the penalties hit, the separate-domain approach had worked very well for us. Any thoughts?
Generally speaking, you will achieve the best results by consolidating your sites under one domain with a dedicated folder for each country as you described. I would recommend delaying the move until you are sure your sites are not under any penalty.
The advantage you will receive with a single root domain is the consolidation of your Domain Authority. It sounds like your sites were doing well before the penalty. The higher DA can help even further.
The internationalized sites are each hosted on a different root domain, e.g. keyword1.es, keyword2.de - are you still confirming that this should not be causing duplicate content penalties?
Correct, as long as the sites are properly set up to target their respective countries. Sites which are dedicated to a specific locale and language would not normally compete in SERPs with other sites that offer similar content in another country and language.
Does your company have that experience and do you provide such services?
While I appreciate the inquiry, my resources have already been dedicated for the remainder of this month. You could take a look at the SEOmoz directory. Please note that anyone can list their company in the directory; a listing is not an endorsement.
If you desire a further recommendation you can send me a message on SEOmoz and I will respond. I can share a few names of SEOs whom I have confidence in based on their Q&A responses, blogs and reputation if that would be helpful.
-
Ryan,
Thank you for your thoughtful answers. A couple of clarifications:
The internationalized sites are each hosted on a different root domain, e.g. keyword1.es, keyword2.de - are you still confirming that this should not be causing duplicate content penalties? Due to the penalties we have been considering moving everything under one umbrella and managing the local sites in directories, e.g. .com/es/keyword1, .com/de/keyword2 - however, until the penalties hit, the separate-domain approach had worked very well for us. Any thoughts?
I should clarify the comment on automated link building. The company used LinkAssistant to research potential partners, i.e. a lot of link solicitation emails were sent, but the actual link building was still performed manually and only with legitimate, content-relevant partners.
We are not working with our old SEO agency any longer and have been reaching out to a couple of external SEO resources/experts, but we have not been presented with a conclusive, convincing concept to resolve the issues. I guess it takes a resource with experience in handling Google penalties to do the job. Does your company have that experience, and do you provide such services?
-
Is there something like an IP-specific Google penalty that can apply to web properties across an IP, or can we assume Google just picked all the sites registered in Google Webmaster Tools?
Think of Google as an intelligent business. They have processes which algorithmically penalize websites. They also have systems which flag sites for manual review. When a penalty is deemed appropriate, it can be applied based on any number of factors, such as an IP address, a Google account, a domain, etc. It depends on how widespread the violation is.
What is the most likely cause of our penalty given the background information? Given that the drops started already in November 2010, we doubt that the Panda updates had any correlation to this issue.
You mentioned a few points which can potentially lead to a penalty. I am not clear from your post, but it sounds like you may be linking to casino and gambling sites. While those sites may be legitimate, many have a reputation for using black hat SEO techniques.
If you want to remove a penalty, be certain that you do not provide a followed link to any questionable site. When you provide a followed link to a site, you are basically saying "I trust this site. It is a good site and I endorse it". If you are found to offer a link to a "bad" site, your site can be penalized.
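As a starting point for that audit, something along these lines could list every followed external link on a page so questionable destinations can be reviewed by hand (it assumes the requests and beautifulsoup4 packages are installed, and the URL is a placeholder):

```python
# Rough sketch: list every followed external link on a page so questionable
# destinations (e.g. casino/betting domains) can be reviewed by hand.
# Assumes the requests and beautifulsoup4 packages; the URL is a placeholder.
import requests
from urllib.parse import urljoin, urlparse
from bs4 import BeautifulSoup

PAGE = "http://keyword1.es/"

def followed_external_links(page_url):
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    own_host = urlparse(page_url).netloc
    links = []
    for a in soup.find_all("a", href=True):
        rel = [value.lower() for value in (a.get("rel") or [])]
        href = urljoin(page_url, a["href"])
        host = urlparse(href).netloc
        # An external link without rel="nofollow" passes your endorsement.
        if host and host != own_host and "nofollow" not in rel:
            links.append((href, a.get_text(strip=True)))
    return links

if __name__ == "__main__":
    for href, anchor in followed_external_links(PAGE):
        print(href, "|", anchor)
```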
What are the best ways to resolve our issues at this point? We have significant historical data available, such as tracking records. Our actions so far have been reducing external links, on-page links, and C-class internal links.
Hire a professional SEO to review your site. You want to review every page to ensure your site is within Google's guidelines. I am highly concerned about your site's links to external sites. I am also concerned about the automated link building that your current SEO has been doing. A professional SEO company should not lead your site to incur a penalty. I am having difficulty understanding how this happened in the first place, how it has not been fixed in almost a year, and how this SEO company is building links for you. Frankly, it's time to consider a new SEO company.
Translating content into other languages is fine. You can take the exact same article and offer a translated version for each language, and even each country. For example, you can offer a Spanish version for your Spain site and a different Spanish version for your Mexico site. As long as these sites are targeting specific countries, there are no duplicate content issues.
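One way to make that country and language targeting explicit is to annotate equivalent pages with hreflang alternate tags. A rough sketch for generating them (the URLs and locale codes are hypothetical placeholders):

```python
# Rough sketch: generate hreflang alternate tags tying each translated page
# to its locale, so equivalent pages do not compete with each other.
# URLs and locale codes are hypothetical placeholders.
ALTERNATES = {
    "es-ES": "http://keyword1.es/resultados/liga",
    "de-DE": "http://keyword2.de/ergebnisse/bundesliga",
    "en-US": "http://keyword3.com/results/mls",
}

def hreflang_tags(alternates):
    return "\n".join(
        f'<link rel="alternate" hreflang="{locale}" href="{url}" />'
        for locale, url in sorted(alternates.items())
    )

# The same block of tags would go in the <head> of every language variant.
print(hreflang_tags(ALTERNATES))
```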
After all this time without resolution, should we move on to new domains and forward all content via 301s to the new pages? Are there things we need to try first?
The penalty would follow you to your new domains.
The external linking structure of the pages is very keyword- and main-page-focused, i.e. 90% of the external links point to the front page using one particular keyword.
Not good at all.
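To see just how concentrated the profile is, something along these lines could be run against a CSV export from your backlink tool (the column names are placeholders and depend on the tool):

```python
# Rough sketch: measure how concentrated the external link profile is, using a
# CSV export from a backlink tool. The column names "anchor" and "target_url"
# are placeholders and depend on the tool that produced the export.
import csv
from collections import Counter
from urllib.parse import urlparse

def anchor_profile(csv_path):
    anchors, target_paths = Counter(), Counter()
    with open(csv_path, newline="", encoding="utf-8") as fh:
        for row in csv.DictReader(fh):
            anchors[row["anchor"].strip().lower()] += 1
            target_paths[urlparse(row["target_url"]).path or "/"] += 1
    total = sum(anchors.values())
    print(f"Total external links: {total}")
    for anchor, count in anchors.most_common(5):
        print(f"  anchor '{anchor}': {count} ({count / total:.0%})")
    for path, count in target_paths.most_common(5):
        print(f"  target path '{path}': {count} ({count / total:.0%})")

# anchor_profile("backlinks_export.csv")
```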
Summary: your site needs a careful review by an SEO professional who adheres to white hat techniques. Every day your site is penalized, you are losing traffic and money. The cost you pay to fix this issue may be extremely small in comparison to the amount of revenue you have already lost.