Website 'stolen', no contact details
-
Hi all,
Wondering if anyone could help out here, got a very strange issue...
Went into Google Webmaster Tools and looked at the incoming links to a client's site (new client, only just gained access to WMT) and noticed 2,563 links coming from one domain. Upon viewing said domain, it is a 100% copy of the client's site, I mean 100%; the phone numbers, email addresses etc. still point to the client.
Everything is the same, the pages, the navigation etc. When I click a link on the copy site it loads the same pages, but at their domain; the internal links point to the versions of the pages on their site. It seems to be an ongoing thing, because the client last updated their blog last week and that post is already on the copy site.
Obviously this cannot be helping with regard to SEO. The client knows nothing about it, so it hasn't come from them. The copy site is indexed in Google!
The first thing to do is to contact these people and ask what they are doing. This is proving to be easier said than done: the contact details (as mentioned above) on the pages still point back to the client, and the whois gives no details.
What would be the first step to take here? Obviously there is the whole legal area around stolen content, but that can wait until we have the site down and out of Google. Is there somewhere in Google to report things like this?
I will speak to the client and, if they are happy, I will share both the domains in question; they know I am seeking alternative opinions.
Many thanks
Carl
-
Thanks for the reply. We went down the route of blocking the other domains from accessing the server in the end. Hopefully the duplicate versions of the website won't cause too much trouble.
One thing I am considering is adding the domains to Webmaster Tools and removing them from Google; that should help with the duplicate content issues. If they are pointing to our server and accessing our files, then we may as well exploit that for a Webmaster Tools verification action.
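For anyone hitting this thread later: since the box in question is a Windows server, one way to do the blocking is a host-header rule in web.config. This is only a sketch, it assumes the IIS URL Rewrite module is installed, and the domain names are placeholders, but the idea is that any request arriving under a hostname other than the client's own gets 301'd to the real site, which also covers the SEO side:

```xml
<!-- web.config fragment (assumes IIS URL Rewrite module; domains are placeholders) -->
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <rule name="Canonical host" stopProcessing="true">
          <match url="(.*)" />
          <conditions>
            <!-- Match every hostname EXCEPT the client's own -->
            <add input="{HTTP_HOST}" pattern="^(www\.)?client-site\.com$" negate="true" />
          </conditions>
          <!-- 301 foreign hostnames to the matching page on the real site -->
          <action type="Redirect" url="http://www.client-site.com/{R:1}" redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```

Swapping the action for `<action type="AbortRequest" />` would hard-block the mirrors instead of redirecting them.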
-
Grumpy C,
Though this is new to you I can assure you that it is a VERY common issue we find in the SEO world. In other words, get used to it.
A cross-domain rel canonical tag should fix you right up, but in the long run I'd look into 301 redirecting or just removing those other domains:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=139394
http://www.seomoz.org/blog/rel-confused-answers-to-your-rel-canonical-questions
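For reference, the cross-domain canonical is just a tag in the `<head>` of each duplicate page pointing at the matching page on the original site (domain and path here are placeholders):

```
<!-- On every page of the mirror, point at the matching page on the real site -->
<link rel="canonical" href="http://www.client-site.com/the-matching-page/" />
```

Since an absolute URL is used, it works even when the same files are served under several hostnames.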
We used to call these "mirrored domains", which may help you when searching for more information on them. One thing I find useful in locating mirrors is called a "Reverse IP Lookup". You can find free tools all over the place for this. Here's one: http://www.yougetsignal.com/tools/web-sites-on-web-server/ . Before you get all freaked out when using this tool, remember that shared hosting is very common, so having multiple sites on the same IP is fine. What is not fine is several copies of the SAME site out there, as you are now dealing with.
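If it helps anyone doing the same detective work, a first-pass check of whether a set of suspect domains resolve to the same IP is easy to script. A minimal sketch, with hypothetical domain names; remember that a shared IP proves nothing by itself (shared hosting), so treat this as a starting point, not proof of mirroring:

```python
import socket

def group_by_ip(domains, resolve=socket.gethostbyname):
    """Group domain names by the IP address they resolve to.

    Domains sharing an IP *might* be on the same server, or might just
    be on shared hosting. The resolver is injectable for testing.
    """
    groups = {}
    for domain in domains:
        try:
            ip = resolve(domain)
        except OSError:  # socket.gaierror subclasses OSError
            ip = None  # domain did not resolve
        groups.setdefault(ip, []).append(domain)
    return groups

# Example usage (hypothetical domains):
# group_by_ip(["client-site.com", "suspected-mirror.com"])
```

Any group with more than one domain under the same IP is worth a closer look with a proper reverse IP tool.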
Good luck!
-
Thanks, will look into that. It would be so much easier if this client owned all the domains; they all seem to be owned by different people and not linked to each other in any way.
-
We had exactly the same issue with a client. They had simply re-hosted their site under at least 10 other domain names they owned, in the misguided belief that this would improve their SEO.
They were all hosted on the same server, as your latest response seems to state, and we tracked them all down with a reverse IP lookup using the tool mentioned above.
We then had to amend the client's DNS records for the duplicate sites and 301 them to the true site. The same site later received a link penalty, and at that point we parked those domains rather than redirecting them.
-
Thanks for the responses, everyone. The situation gets even more random. It would appear that the content is not stolen; rather, the 'copy' domain (and indeed at least two more) are not only pointing at the client's server but at the same directory as their site. They are all loading the same files! Must admit, this is a new one to me.
The client's IT dept claim, so far, that they purchased a new IP for their server and are using that. The previous IP apparently belonged to another host, and there were sites pointed at it. When the IP was moved to the client's server, the sites pointing at it started resolving to the new server. That much is just about understandable; how these domains are accessing the files on the server is a mystery. It's a Windows server, so not my area of expertise.
Oddly enough, the two domains which seem to have had their server moved are still registered for another three years, so one would assume whoever owns them wants to keep them.
It's definitely been an interesting Tuesday morning so far! Still the afternoon to come, so I wonder how many more websites we'll find sharing the client's files!
Not sure where this leaves me now with regard to calling it spam/theft. All the domains appear to resolve to exactly the same place, yet only one of them is owned by the client.
-
You could also contact the hosting company of the site; they might take down the site for you, or at least contact the fake site's owner.
-
Hi Carl,
Ouch, this is probably not a use case you ever wanted to fix for your clients. However, I would suggest filing a DMCA request based on the copyright of the text/images used on your client's site. This will, hopefully, at least remove the copied site from Google's index.
Filing such a request can be done here: http://support.google.com/bin/static.py?hl=en&ts=1114905&page=ts.cs
Good luck!