Penguin & Panda: Geographic Penalties?
-
Has anyone ever come across information about a website appearing strongly in SERPs in one region but poorly in another (e.g., great in Europe, not so great in North America)?
If so, could it be a Panda or Penguin issue?
-
There are many factors that influence a site's ranking in one region or another.
Google is heading toward personalized results based on user location, but that's not a Panda or Penguin issue.
Some things to consider are:
- the location of your hosting server
- the TLD of your site
- the location and TLD of your linking sites
- the location you set in Google Webmaster Tools or on your business page
- the content of your site; if it's in English, it may be more British- or American-focused
- other ranking factors that only Google knows
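If the same content must serve several regions, hreflang annotations are one concrete signal you control for telling Google which version targets which audience. A minimal sketch, assuming hypothetical regional URLs:

```html
<!-- In the <head> of each regional version (URLs are hypothetical) -->
<link rel="alternate" hreflang="en-gb" href="http://example.com/uk/" />
<link rel="alternate" hreflang="en-us" href="http://example.com/us/" />
<link rel="alternate" hreflang="x-default" href="http://example.com/" />
```

Each regional page should list all alternates, including itself, so the annotations are reciprocal.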
Related Questions
-
URL Parameter Being Improperly Crawled & Indexed by Google
Hi All, We just discovered that Google is indexing a subset of our URLs embedded with our analytics tracking parameter. For the search "dresses" we are appearing in position 11 (page 2, rank 1) with the following URL: www.anthropologie.com/anthro/category/dresses/clothes-dresses.jsp?cm_mmc=Email--Anthro_12--070612_Dress_Anthro-_-shop You'll note that "cm_mmc=Email" is appended. This is causing our analytics (CoreMetrics) to misattribute this traffic and revenue to Email vs. SEO. A few questions: 1) Why is this happening? This is an email from June 2012, and we don't have an email-specific landing page embedded with this parameter. Somehow Google found and indexed this page with these tracking parameters. Has anyone else seen something similar happening?
Intermediate & Advanced SEO | kevin_reyes
2) What is the recommended method of "politely" telling Google to index the version without the tracking parameters? Some thoughts on this:
a. Implement a self-referencing canonical on the page.
- This is done, but we have some technical issues with the canonical due to our ecommerce platform (ATG). Even though the page source code looks correct, Googlebot is seeing the canonical with a JSESSIONID.
b. Resubmit both URLs in the WMT Fetch feature, hoping that Google recognizes the canonical.
- We did this, but given the canonical issue it won't be effective until we can fix it.
c. URL handling change in WMT
- We made this change, but it didn't seem to fix the problem.
d. 301 or noindex the version with the email tracking parameters
- This seems drastic, and I'm concerned that we'd lose ranking on this very strategic keyword.
Thoughts? Thanks in advance, Kevin
Have You Ever Used Tools Such as; TribePro, Onlywire & SocialAdr?
Hi all, We create approximately 8-10 articles of around 800 words every week, with the majority being added to our blog. I do tend to internally link from the blog to relevant pages on our website... I have been introduced to some 'syndication tools' such as TribePro, Onlywire, and SocialAdr, and wondered whether these are deemed 'gaming the system' and so potentially 'grey hat'? I would appreciate comments from anyone who has a view, and particularly from anyone who has used any of these tools. Many thanks, Andy
Intermediate & Advanced SEO | TomKing
Penalized because of Pharma WordPress Hack, Fixed. When can we expect to get out?
Hey guys, so one of our clients hired a web designer to redo his site. Unfortunately, in the process the client got a nasty pharma hack, and we had to completely rebuild his site in Drupal from scratch because the hack was so difficult to remove. In the process he lost all his rankings (sub-100), and the hack produced super-low-quality links from drug-related sites pointing to his pages. We're 100% certain the hack is gone; we've disavowed every link and used WMT to de-index all the drug pages the hack had created. Still, two weeks later he is sub-100. Does anyone know of any way to push this along faster? I wish there were some way to get Google to recognize it's fixed faster, as his business is destroyed.
Intermediate & Advanced SEO | iAnalyst.com
Why Would This Old Page Be Penalized?
Here's an old page on a trustworthy domain with no apparent negative SEO activity according to OSE and ahrefs: http://www.gptours.com/Monaco-Grand-Prix It went from page 1 to page 13 for "monaco grand prix" within about four weeks. In week 2 we pulled all the duplicate content out of the history section. When rank slipped further, we put it back. Yet it's still moving down, while other pages on the website are holding strong. Next steps will be to add some schema.org/Event markup, but beyond that, do you have any ideas?
Intermediate & Advanced SEO | stevewiideman
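For reference, schema.org Event markup of the kind mentioned above might look like the following microdata sketch; the date and venue values here are purely illustrative placeholders, not taken from the page:

```html
<!-- Hypothetical schema.org/Event microdata for an event page -->
<div itemscope itemtype="https://schema.org/Event">
  <span itemprop="name">Monaco Grand Prix</span>
  <meta itemprop="startDate" content="2013-05-26">
  <div itemprop="location" itemscope itemtype="https://schema.org/Place">
    <span itemprop="name">Circuit de Monaco</span>
  </div>
</div>
```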
Can Linking Between Your Own Sites Excessively Be a Penguin No-No?
I have a bunch of travel-related sites that for a long time dominated google.com.au without any intensive SEO whatsoever. Aside from solid on-page content and meta tags, I did no link building. However, all of my sites are heavily interlinked, and I think they are linked with dofollow links and lots of anchor text. Here are a few of them: www.beautifulpacific.com www.beautifulfiji.com www.beautifulcooklands.com My idea in interlinking them was to create a kind of branded "Beautiful" nexus of sites. However, when Penguin hit -- which I believe was on April 27th -- search traffic crashed, and has crashed over and over again. I've read that Penguin penalizes over-optimization of anchor-text links. I don't have a lot of inbound links like these, but they are everywhere among my sites. Is it possible that all of my text links have hurt me with Penguin? Thanks to everyone in advance for your time and attention. I really appreciate it. -Mike
Intermediate & Advanced SEO | RCNOnlineMarketing
Has there been a 'Panda' update in the UK?
My site in the UK suddenly dropped from page 1 and out of the top 50 for all KWs using 'recliner' or a derivative. We are a recliner manufacturer and have gained rank over 15 years, of course using all white-hat tactics. Did Google make an algo update in the UK last week?
Intermediate & Advanced SEO | KnutDSvendsen
Robots.txt & URL removal vs. noindex, follow?
When de-indexing pages from Google, what are the pros & cons of each of the below two options:
1. robots.txt & requesting URL removal from Google Webmaster Tools
2. Use the noindex, follow meta tag on all doctor profile pages; keep the URLs in the Sitemap file so that Google will recrawl them and find the noindex meta tag; make sure they're not disallowed by the robots.txt file
Intermediate & Advanced SEO | nicole.healthline
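The second option above boils down to one tag on each profile page; a minimal sketch:

```html
<!-- On each doctor profile page to be de-indexed -->
<meta name="robots" content="noindex, follow">
```

Note the dependency the question already hints at: the pages must stay crawlable (not disallowed in robots.txt), because Googlebot can only honor the noindex directive on pages it is allowed to fetch.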
Should we block URLs like this - domainname/shop/leather-chairs.html?brand=244&cat=16&dir=asc&order=price&price=1 - within the robots.txt?
I've recently added a campaign within the SEOmoz interface and received an alarming number of errors: ~9,000 on our eCommerce website. This site was built in Magento, and we are using search-friendly URLs; however, most of our errors were duplicate content / titles due to URLs like: domainname/shop/leather-chairs.html?brand=244&cat=16&dir=asc&order=price&price=1 and domainname/shop/leather-chairs.html?brand=244&cat=16&dir=asc&order=price&price=4. Is this hurting us in the search engines? Is rogerbot too good? What can we do to cut off bots after the ".html?"? Any help would be much appreciated 🙂
Intermediate & Advanced SEO | MonsterWeb28
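One way to "cut off bots after the .html?" is a wildcard Disallow rule; Google's crawler (and rogerbot) support `*` patterns in robots.txt. A sketch, with the `/shop/` path taken from the example URLs above:

```
# Block crawling of any /shop/ URL that carries a query string
User-agent: *
Disallow: /shop/*.html?
```

Worth noting as a trade-off: robots.txt only stops crawling, it doesn't consolidate duplicates the way a canonical tag or WMT parameter handling does, so for faceted-navigation duplicates those are often the gentler first choice.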