How to deal with DMCA takedown notices
-
How do you deal with DMCA takedown notices related to product descriptions? With Google, it is simple enough for anyone to submit a DMCA takedown notice, regardless of whether the submitter actually holds rights to the content.
One such example is this: http://www.chillingeffects.org/notice.cgi?sID=1012391. Although Google handled that particular case properly (and did not remove the content), we find that more and more competitors now use DMCA takedowns as an easy way to de-index competing content.
Since the person filing the DMCA takedown is not required to provide any proof of copyright ownership, de-indexing happens quite quickly.
Try this URL: http://www.google.com/transparencyreport/removals/copyright/domains/mydomain.com/ (replace mydomain.com with your own domain) to see if you have been affected.
I would like your opinion, and to hear whether you have been affected by takedowns on product descriptions. In my mind, if product descriptions are informative and describe the characteristics of the product, then such takedowns should be denied.
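If you want to check several domains at once, you can script the transparency report lookup. A minimal sketch - the URL format is the one quoted above and could change on Google's side at any time:

```javascript
// Build Google transparency report lookup URLs for a list of domains.
// The path format is taken from the URL in this post; treat it as
// illustrative only, since Google may change it.
function transparencyReportUrl(domain) {
  return "http://www.google.com/transparencyreport/removals/copyright/domains/" + domain + "/";
}

const domains = ["mydomain.com", "example.com"];
const urls = domains.map(transparencyReportUrl);
urls.forEach((u) => console.log(u));
```

You could then open each URL in a browser (or fetch it) to see whether any takedown requests list your domain.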
-
Luckily, I have never had that problem. It also seems that you will be alerted in your Webmaster Tools account if someone requests a takedown.
Still, it is an interesting "black hat" technique for pushing competitors out of the SERPs - especially those who might not have Webmaster Tools accounts.
Related Questions
-
I added an SSL certificate this morning and now I noticed duplicate content
Ok, so I'm a newbie, therefore I make mistakes! Lots of them. I added an SSL certificate this morning because it was free and I read it can help my rankings. Now I just checked the site in Screaming Frog and saw two duplicate content pages due to the https. So I'm panicking! What's the easiest way to fix this? Can I undo an SSL certificate? I guess I want whatever is easiest that will also be best for ranking. Thank you!! Rena
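You don't need to "undo" the certificate - the usual fix is a 301 redirect from the http version to the https version, so search engines consolidate everything onto one URL. A sketch for Apache, assuming mod_rewrite is enabled and your host honors .htaccess files (other servers need different rules):

```apache
RewriteEngine On
# Send every http request to its https equivalent with a 301,
# so only one version of each page gets indexed.
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```

After adding this, re-crawl in Screaming Frog to confirm the http URLs now return 301s.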
Intermediate & Advanced SEO | palila0
-
How to deal with canonicals on dup product pages in Opencart?
So I have a seriously large amount of duplicate content problems on my Opencart site, and I've been trying to figure out the best way to fix them one by one. But is there a common, easy way of doing this? Because frankly, it is a nightmare otherwise. I bought an extension which doesn't appear to work (http://www.opencart.com/index.php?route=extension/extension/info&extension_id=20468), so now I'm at a loss.
Intermediate & Advanced SEO | moon-boots0
-
[Advice] Dealing with an immense URl structure full of canonicals with Budget & Time constraint
Good day to you Mozers, I have a website that sells a certain product online which, once bought, is delivered to a point of sale (PoS) where the client's car gets serviced. This website has a shop, products, and informational pages that are duplicated by the number of physical PoS. The organizational decision was that every PoS should have its own little site that could be managed and modified. For example: every PoS could have a different price on its product, and some have more services available than others, but the content on these service pages doesn't change. I end up with over a million URLs that are, supposedly, all treated with canonical tags to their respective main page. I say "supposedly" because verifying the logic they used behind the canonicals is proving to be a headache, but I know and have seen a lot of these pages using the tag. For example:
https://mysite.com/shop/ <-- https://mysite.com/pointofsale-b/shop
https://mysite.com/shop/productA <-- https://mysite.com/pointofsale-b/shop/productA
The problem is that over a million URLs are crawled, when really less than a tenth of them have any organic traffic potential. Question is:
For products, I know I should tell them to put the URL as close to the root as possible and dynamically change the price according to the PoS the end-user chooses - or even redirect all shops to the main one and only use that one. But I need a short-term solution to test/show whether it is worth investing in development to correct all these useless duplicate pages. Should I use robots.txt to block off the parts of the site I do not want Google to waste its time on? I am worried about indexation, accessibility, and crawl budget being wasted. Thank you in advance!
Intermediate & Advanced SEO | Charles-O
-
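For the crawl-budget question above, a robots.txt pattern along these lines could keep crawlers out of the per-PoS duplicates as a short-term stopgap. The `/pointofsale-` path prefix is assumed from the example URLs in the question - adjust it to the real URL pattern - and note that robots.txt only blocks crawling, not indexing, so it does not replace proper canonicals:

```text
# robots.txt (sketch) - stop crawling of per-point-of-sale duplicates
User-agent: *
Disallow: /pointofsale-
```

Because `Disallow` is a prefix match, this covers /pointofsale-b/shop, /pointofsale-b/shop/productA, and so on.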
Scraped Content on Foreign Language Site. Big deal or not?
Hi All, I've been lurking and learning from this awesome Q&A forum, and I finally have a question. I am working on SEO for an entertainment site that tends to get scraped from time to time. Often, the scraped content is then translated into a foreign language and posted along with whatever pictures were in the article. Sometimes a backlink to our site is given, sometimes not. Is scraped content that is translated into a foreign language still considered duplicate content? Should I just let it go, provided a backlink is given? Thanks!
Jamie
Intermediate & Advanced SEO | MKGraphiques
-
Dealing with Redirects and iFrames - getting "product login" pages to rank
One of our most popular products has a very authoritative product page, which is great for marketing purposes, but not so much for current users. When current users search for "product x login" or "product x sign in", instead of getting to the login page, they see the product page - it adds a couple of clicks to their experience, which is not what we want. One of the problems is that the actual login page has barely any content, and the content that it does carry is wrapped in iframes. Due to political and security reasons, the web team is reluctant to make any changes to the page, and one of their arguments is that the login page actually ranks #1 for a few other products (at our company, the majority of logins originate from the same domain). To add to the challenge - queries that do return the login page as the #1 result (for some of our other products) actually do not reference the sign-in domain, but our old domain, which is now a 301 redirect to the sign-in domain. To make that clear - **Google is displaying the origin domain in SERPs, instead of displaying the destination domain.** The question is - how do we get this popular product's login page to rank higher than the product page for "login" / "sign in" queries? I'm not even sure where we should point links to at this point - the actual sign-in domain or the origin domain? I have the redirect chains and domain authority for all of the pages involved, including a few of our major competitors (who follow the same login format), and will be happy to share it privately with a Moz expert. I'd prefer not to make any more information publicly available, so please reach out via private message if you think you can help.
Intermediate & Advanced SEO | leosaraceni0
-
Canonical tag: how to deal with product variations in the music industry?
Hello here. I own a music publishing company: http://www.virtualsheetmusic.com/ And we have several similar items whose only difference is the instrument they have been written for. For example, look at the two item pages below: http://www.virtualsheetmusic.com/score/Canon2Vl.html http://www.virtualsheetmusic.com/score/Canon2Vla.html They are the exact same piece of music, but written differently to target 2 different instrumental combinations. If it weren't for the user reviews that can make those two similar pages different, Google could see them as duplicate content. Am I correct? And if so, how do you suggest tackling such a possible problem? Via canonical tags? How? To have a better idea of the magnitude of the problem, have a look at these search results on our site, which give you product variations of basically the same piece of music, where the only difference is in the targeted instruments: www.virtualsheetmusic.com/s.php?k=Canon+in+D www.virtualsheetmusic.com/s.php?k=Meditation www.virtualsheetmusic.com/s.php?k=Flight And, similarly, we have collections of pieces targeting different instruments: www.virtualsheetmusic.com/s.php?k=Wedding+Collection www.virtualsheetmusic.com/s.php?k=Christmas+Collection www.virtualsheetmusic.com/s.php?k=Halloween+Collection Any thoughts and suggestions to tackle this potential page duplication issue are very welcome! Thank you to anyone in advance.
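If you decide one arrangement is the primary version, the secondary page can point a canonical at it. A sketch using the two URLs from the question - which of the two should be primary is your call, and note that Google would then consolidate ranking signals onto the primary page only:

```html
<!-- In the <head> of http://www.virtualsheetmusic.com/score/Canon2Vla.html,
     pointing at the version chosen as primary (assumption for illustration) -->
<link rel="canonical" href="http://www.virtualsheetmusic.com/score/Canon2Vl.html" />
```

If each variation genuinely targets different searches (e.g. "Canon in D viola"), you may prefer to keep them all indexable and differentiate the on-page content instead.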
Intermediate & Advanced SEO | fablau0
-
How to deal with old, indexed hashbang URLs?
I inherited a site that used to be in Flash and used hashbang URLs (i.e. www.example.com/#!page-name-here). We're now off of Flash and have a "normal" URL structure that looks something like this: www.example.com/page-name-here Here's the problem: Google still has thousands of the old hashbang (#!) URLs in its index. These URLs still work because the web server doesn't actually read anything that comes after the hash. So, when the web server sees this URL www.example.com/#!page-name-here, it basically renders this page www.example.com/# while keeping the full URL structure intact (www.example.com/#!page-name-here). Hopefully, that makes sense. So, in Google you'll see this URL indexed (www.example.com/#!page-name-here), but if you click it you essentially are taken to our homepage content (even though the URL isn't exactly the canonical homepage URL... which should be www.example.com/). My big fear here is a duplicate content penalty for our homepage. Essentially, I'm afraid that Google is seeing thousands of versions of our homepage. Even though the hashbang URLs are different, the content (i.e. title, meta description, page content) is exactly the same for all of them. Obviously, this is a typical SEO no-no. And, I've recently seen the homepage drop like a rock for a search of our brand name, which has ranked #1 for months. Now, admittedly, we've made a bunch of changes during this whole site migration, but this #! URL problem just bothers me. I think it could be a major cause of our homepage tanking for brand queries. So, why not just 301 redirect all of the #! URLs? Well, the server won't accept traditional 301s for the #! URLs because the # seems to screw everything up (the server doesn't acknowledge what comes after the #). I think our only option here is to try and add some 301-style redirects via JavaScript. Yeah, I know that spiders have a love/hate (well, mostly hate) relationship with JavaScript, but I think that's our only resort... unless someone here has a better way?
If you've dealt with hashbang URLs before, I'd LOVE to hear your advice on how to deal w/ this issue. Best, -G
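Since the server never sees the fragment, the redirect does have to happen client-side. A minimal sketch - the `#!page-name` → `/page-name` mapping is assumed from the examples in the question, and `location.replace` avoids adding a history entry, though note a JavaScript redirect is not a true 301:

```javascript
// Map a hashbang fragment like "#!page-name-here" to the clean path
// "/page-name-here"; return null when there is nothing to redirect.
function hashbangToPath(hash) {
  if (hash && hash.indexOf("#!") === 0) {
    return "/" + hash.slice(2);
  }
  return null;
}

// In the browser, run as early as possible in the page <head>:
//   const target = hashbangToPath(window.location.hash);
//   if (target) window.location.replace(target);
console.log(hashbangToPath("#!page-name-here"));
```

Keeping the mapping in a pure function makes the path logic easy to test outside the browser.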
Intermediate & Advanced SEO | Celts180
-
Some things I have noticed in SEO which showed a positive effect
Hi, Just a few small things I have noticed which have made a difference in ranking. Thought I would share - small things, but every little helps. It looks like the more content on a page, the better the ranking. In a test I have done, this is what I found:
page with 0 words of content - Google page 11
page with 250 words of content - Google page 5
page with 500 words of content - Google page 2
page with 1000 words of content - Google page 1, position 3
It looks like Google might count the words on a page (like you can do in Microsoft Word) and then apply a score against it. Tweets and likes (78) showed a jump from page 1, position 8, to page 1, position 1 (this was the only form of link building, and no changes were made apart from adding the share plugin). Kind of makes me think that if you're starting a new site or page, adding 750-1000 words of good content (a tutorial or whatever) with a social sharing tool, and letting it run, will give you a better chance of ranking when you start to introduce products and services.
Intermediate & Advanced SEO | activitysuper0