Are there SEO implications to blocking foreign IP addresses?
-
We are dealing with a foreign company that has completely ripped off our entire site template, design, and branding. This is a real headache, and we've had similar things happen often enough in the past that we're considering blocking large ranges of IP addresses from accessing our site via .htaccess.
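To be concrete, here is a rough sketch of the kind of .htaccess rule we have in mind (Apache 2.2 syntax; the CIDR ranges below are documentation placeholders, not ranges we would actually block):

    # Allow everyone, then deny the listed ranges (placeholders)
    order allow,deny
    allow from all
    deny from 203.0.113.0/24
    deny from 198.51.100.0/24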
Is this something that could cause problems with search engine bots crawling or indexing our site? We are in the US and our site is hosted in the US, but I'm not sure whether the major search engines might be using foreign-based bots.
Looking for any insight on this, or on any other potential SEO problems we should consider.
Thanks
-
Zee, did you implement this? Outcomes?
-
If the bot is in another country and you have blocked its range, it's pretty obvious what happens... What kind of "backup" are you looking for?
If you are asking whether I have a geographical list of bots for each search engine, then no, I don't. But this might be of some use to you: http://productforums.google.com/forum/#!topic/webmasters/TbpNyFiJvjs
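If you do block ranges and want to be sure you aren't catching a real search engine bot, Google recommends verifying Googlebot with a reverse DNS lookup followed by a forward confirmation, rather than relying on published IP lists. A rough Python sketch of that check (illustrative only):

    import socket

    def is_verified_googlebot(ip):
        """Reverse-DNS the IP, check the hostname, then forward-confirm it."""
        try:
            host = socket.gethostbyaddr(ip)[0]  # reverse lookup
        except socket.herror:
            return False
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        try:
            # forward confirmation: the hostname must resolve back to the same IP
            return ip in socket.gethostbyname_ex(host)[2]
        except socket.gaierror:
            return False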
Good luck with the whole site design / copyright issue. Any chance you could PM me a link? I would like to see what they have done... (just curious).
-
Thanks for the reply, SEOKeith, but focusing on "making our site more authoritative" does not solve the problem.
The problem we have is not an SEO problem, it's a design, copyright, trademark and ethical problem. When you spend months developing and designing a site only to have it ripped off, it's not something we want to just ignore.
The damage has been done in this particular instance. However, we've had enough problems in the past with foreign visitors, and our business doesn't come from foreign countries. Because of that, blocking human visitors from the countries we've had problems with is a potential solution.
The solution we're considering could potentially impact the way search engines view our site, and that's the question. Do you have anything to back up your comment that "by blocking large ranges of IP addresses you could end up restricting access to legitimate... bots"?
-
By blocking large ranges of IP addresses you could end up restricting access to legitimate users, bots, etc.
For a start, how do you even know that the site harvesting your data is actually in that country? Sure, they might be hosting there, but the boxes ripping your content could be in the US, with web heads in other random countries serving up the content.
People copying / stealing / cloning your content is pretty common; it happens to a lot of my sites. It's just the way it is, and you're not going to be able to stop it, so you might as well focus on making your site more authoritative.
Related Questions
-
One-Pager and SEO
We're building a page that is going to feature over 31 people as difference makers in their field. We're unveiling one a day for an entire month. The very early mockup of the page has a name, pic, some bio info, and a link that opens a new window with the full bio. I would love to have all of the bio content for all of the people on the page (and indexable), but I'm not sure how to do that while still being able to hide the full bios until they are expanded. Does anybody have any tips that are SEO-friendly, and/or examples of a page that is built like this and ranks well? Thanks!
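One pattern that keeps the full bios in the initial HTML (and therefore crawlable) while hiding them until expanded is the details/summary element; a minimal sketch with made-up names:

    <!-- the full bio is in the DOM on page load, so it can be indexed,
         but stays collapsed until the visitor expands it -->
    <details>
      <summary>Jane Doe (January 1)</summary>
      <p>Jane's full biography text goes here...</p>
    </details>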
Technical SEO | spackle -
Recovering from Blocked Pages Debacle
Hi, per this thread: http://www.seomoz.org/q/800-000-pages-blocked-by-robots We had a huge number of pages blocked by robots.txt by some dynamic file that must have integrated with our CMS somehow. In just a few weeks hundreds of thousands of pages were "blocked." This number is now going down, but instead of by the hundreds of thousands, it is going down by the hundreds, and very slowly. So, we really need to speed up this process. We have our sitemap, which we will re-submit, but I have a few questions related to it: Previously the sitemap had the <lastmod> tag set to the original date of the page, and all of these pages have been changed since then. Any harm in doing a mass change of the <lastmod> field? It would be an accurate reflection, but I don't want it to be caught by some spam catcher. The easy thing to do would be to just set that date to now, but then they would all have the same date. Any other tips on how to get these pages "unblocked" faster? Thanks! Craig
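For reference, <lastmod> is a per-URL field in the XML sitemap, so each page can carry its own real modification date rather than one mass date. A minimal sketch with a placeholder URL and date:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/some-page/</loc>
        <lastmod>2012-06-15</lastmod> <!-- the date this page actually changed -->
      </url>
    </urlset>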
Technical SEO | TheCraig -
How can I do SEO for an eCommerce site?
I am doing SEO for my WP blog, but now I'm starting on a recently launched eCommerce site where I sell electronics products. I want to know how I can do the SEO so that I can reach at least a top-10 position on Google India. Second, how can I avoid duplicate content when copying manufacturer content? Please help.
Technical SEO | chandubaba -
Questionable SEO
Chess Telecom appears first when you search for 'business phone lines' in the UK, so I set up a campaign to check them out. It seems they've got tons of unrelated links and are using comment spamming to increase their ranking, along with fake Twitter accounts and other things. Search for 'jewel jubic chess' and you'll see what I mean. I assumed this wasn't a good idea and have been trying to get my link on relevant websites only. Any comments or suggestions? Should I simply trust that Google will punish them eventually, or should I be fighting fire with fire? Thanks Dan
Technical SEO | DanFromUK -
Duplicate Content on SEO Pages
I'm trying to create a bunch of content pages, and I want to know if the shortcut I took is going to penalize me for duplicate content. Some background: we are an airport ground transportation search engine (www.mozio.com), and we constructed several airport transportation pages listing the providers in a particular area. The problem is that in a given region, several pages can share the same providers. For instance, NYAS serves both JFK and LGA, and SuperShuttle serves ~200 airports, which means every one of those airports' pages has the SuperShuttle box. All the provider info is stored in a database with tags for the airports each provider serves, and then we dynamically create the page. Good examples: http://www.mozio.com/lga_airport_transportation/ http://www.mozio.com/jfk_airport_transportation/ http://www.mozio.com/ewr_airport_transportation/ All three of those pages have a lot in common. They started out ranking decently, but as I added more and more pages their overall efficacy went down. Does what I've done qualify as "duplicate content", and would I be better off getting rid of some of the pages or somehow consolidating the info into a master page? Thanks!
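If you did decide to consolidate into a master page, one option is to pick a preferred version and add a rel=canonical hint in the head of the near-duplicates. A sketch only: which page should be the canonical target is a judgment call, and canonicalized pages generally stop ranking on their own.

    <!-- placed in the <head> of a near-duplicate airport page,
         pointing at the preferred version -->
    <link rel="canonical" href="http://www.mozio.com/jfk_airport_transportation/" />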
Technical SEO | moziodavid -
Differences in Sitemaps, SEO-wise?
I'm a bit confused about sitemaps. I'm just learning SEO, so forgive me if this is a basic question. I've submitted my site to Google Webmaster Tools using http://pro-sitemaps.com and the sitemap generator it creates (option 1). I've also seen sites publish an HTML site-map page, like http://www.johnlewis.com/Shopping/ProductList.aspx and http://www.thesafestcandles.com/site-map.html, so I did something similar for my site (www.ldnwicklesscandles.com) (option 2). You figure if you see everyone do it, you might as well try it too and hope it works. 😉 So I've done both 1 and 2. Which sitemap is best for SEO purposes, or should I do both? Is there any format that should or shouldn't be used for option 2? Any site examples of good practice would be helpful.
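For what it's worth, the two formats serve different audiences: the XML sitemap (option 1) is for crawlers and can be advertised in robots.txt, while the HTML site map (option 2) is just an ordinary page of links for humans. Sketches below; the file path and the link targets are placeholders:

    # robots.txt -- advertises the XML sitemap to crawlers
    Sitemap: http://www.ldnwicklesscandles.com/sitemap.xml

    <!-- an HTML site map is simply a normal page of links -->
    <ul>
      <li><a href="/candles/">Candles</a></li>
      <li><a href="/accessories/">Accessories</a></li>
    </ul>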
Technical SEO | cmjolley -
URL content format - Any impact on SEO
I understand that there is a suggested maximum length for a URL so as not to be penalized by search engines. I'm wondering whether I should optimize our ecommerce category URLs to be descriptive, or use abbreviations to help keep the URL length to a minimum. Our products are segmented into many categories, so many product URLs get pretty long if we go the descriptive route. I've also heard that removing the category component entirely from a product URL can be considered. I'm fairly new to all this SEO stuff, so I'm hoping the community can share their knowledge on the impact of these options. Cheers, Steve
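To illustrate the three options with a made-up product (all URLs hypothetical):

    http://example.com/kitchen-appliances/blenders/acme-pro-500   (descriptive categories)
    http://example.com/ka/bl/acme-pro-500                         (abbreviated categories)
    http://example.com/acme-pro-500                               (category removed)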
Technical SEO | SteveMaguire -
HTML5 in SEO
What is the benefit of using HTML5 for SEO? From what I've read, using many h1 tags on a page is not good (it can trigger crawler alerts), but using HTML5 itself is good. We have more or less followed these guidelines on www.tumanitas.com. What do you think about that?
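As a rough illustration of HTML5 sectioning markup with a single h1 and semantic elements (a sketch, not a prescription):

    <!-- one h1 for the page; lower-level headings inside sections -->
    <header>
      <h1>Page title</h1>
      <nav>...</nav>
    </header>
    <article>
      <h2>Article heading</h2>
      <section>
        <h3>Sub-section heading</h3>
        <p>Content...</p>
      </section>
    </article>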
Technical SEO | ofuente