Are there SEO implications to blocking foreign IP addresses?
-
We are dealing with a foreign company that has completely ripped off our entire site template, design, and branding. We've had similar things happen often enough in the past that we're considering blocking large ranges of IP addresses from accessing our site via .htaccess.
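Concretely, what we have in mind is something along these lines in .htaccess (Apache 2.2 mod_authz_host syntax; the CIDR ranges below are documentation placeholders, not the actual ranges we'd block):

```apache
# Deny the listed CIDR ranges, allow everyone else.
# 203.0.113.0/24 and 198.51.100.0/24 are example/placeholder ranges.
Order Allow,Deny
Allow from all
Deny from 203.0.113.0/24
Deny from 198.51.100.0/24
```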
Is this something that could cause problems with search engine bots crawling or indexing our site? We are in the US and our site is hosted in the US, but I'm not sure whether the major search engines might be using foreign-based bots.
Looking for any insight on this, or on any other potential SEO problems to consider.
Thanks
-
Zee, did you implement this? Outcomes?
-
If a bot is in another country and you have blocked that range, then it's pretty obvious it will be blocked too... What kind of "backup" are you looking for?
If you're asking whether I have a geographical list of bots for each search engine, then no, I don't. But this might be of some use to you: http://productforums.google.com/forum/#!topic/webmasters/TbpNyFiJvjs
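One thing you can do regardless of where a bot's IP happens to sit is verify claimed search-engine crawlers with a reverse DNS lookup plus a forward confirmation, rather than maintaining IP lists by country. A rough sketch in Python (the hostname suffixes are assumptions based on the engines' published crawler domains; verify them before relying on this):

```python
import socket

# Assumed hostname suffixes for legitimate crawlers (Google, Bing).
GOOD_SUFFIXES = (".googlebot.com", ".google.com", ".search.msn.com")

def is_search_engine_hostname(hostname):
    """Check whether a hostname ends with a known crawler domain.
    Trailing dots from PTR answers are stripped before comparison."""
    return hostname.rstrip(".").lower().endswith(GOOD_SUFFIXES)

def verify_bot(ip):
    """Reverse-DNS the IP, check the hostname suffix, then forward-resolve
    the hostname and confirm it maps back to the same IP. The forward
    check guards against spoofed PTR records."""
    try:
        hostname = socket.gethostbyaddr(ip)[0]
    except socket.herror:
        return False
    if not is_search_engine_hostname(hostname):
        return False
    try:
        forward_ips = {info[4][0] for info in socket.getaddrinfo(hostname, None)}
    except socket.gaierror:
        return False
    return ip in forward_ips
```

Requests that claim a Googlebot user agent but fail `verify_bot` are fair game to block; everything that passes gets through no matter which country its IP range is registered in.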
Good luck with the whole site design / copyright issue. Any chance you could PM me a link? I'd like to see what they have done (just curious).
-
Thanks for the reply SEOKeith, but focusing "on making our site more authoritative" does not solve the problem.
The problem we have is not an SEO problem, it's a design, copyright, trademark and ethical problem. When you spend months developing and designing a site only to have it ripped off, it's not something we want to just ignore.
The damage has been done in this particular instance. However, we've had enough problems in the past with foreign visitors, and our business doesn't come from foreign countries. Because of that, blocking human visitors from the countries where we've had problems is a potential solution.
The solution we're considering could potentially impact the way search engines view our site, and that's the question. Do you have anything to back up your comment that "blocking large ranges of IP addresses you could end up restricting access to legitimate...bots"?
-
By blocking large ranges of IP addresses, you could end up restricting access to legitimate users, bots, etc.
For a start, how do you even know the site harvesting your data is actually in that country? Sure, they might be hosting there, but the boxes ripping your content might be in the US, and they could then have web heads in other random countries serving up the content.
People copying / stealing / cloning your content is pretty common; it happens to a lot of my sites. It's just the way it is, and you're not going to be able to stop it, so you might as well focus on making your site more authoritative.