How do I report multiple duplicated websites designed to manipulate SERPs?
-
Ok, so within one of my client's sectors it has become clear that someone is trying to manipulate the SERPs by registering tons of keyword-targeted domains.
All of the websites are simply duplications of one another and are merely set up to dominate the SERP listings - which, at the moment, they are beginning to do.
None of the sites have any real authority (in some cases a PA and DA of 1), yet they're ranking above much more established websites. The only backlinks they have are dodgy-looking forum links. It's all a bit crazy and it shouldn't be happening.
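If you want hard evidence of the duplication when you report it, a quick text-similarity check puts a number on it. A minimal sketch using made-up page snippets (the text and domains are placeholders, not the actual sites):

```python
from difflib import SequenceMatcher

# Hypothetical homepage text grabbed from two of the suspect domains.
page_a = "Cheap wedding bands in London. Call today for the best wedding bands in London."
page_b = "Cheap wedding bands in Leeds. Call today for the best wedding bands in Leeds."

# Ratio approaches 1.0 as the two pages converge on identical text.
ratio = SequenceMatcher(None, page_a, page_b).ratio()
print(round(ratio, 2))  # near-duplicate pages score well above 0.8
```

Running this across each pair of homepages gives you something concrete to cite in the report rather than just asserting the sites are copies.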
Anyway, all of the domains have been registered by the same person, within two months of each other.
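The registration clustering can be quantified too. Assuming you've already pulled the WHOIS creation dates for each domain (the dates and domain names below are invented for illustration), a few lines show how tight the window is:

```python
from datetime import date

# Hypothetical WHOIS creation dates collected for each suspect domain.
created = {
    "keyword-site-a.example": date(2013, 1, 14),
    "keyword-site-b.example": date(2013, 2, 3),
    "keyword-site-c.example": date(2013, 2, 27),
}

earliest = min(created.values())
latest = max(created.values())
window_days = (latest - earliest).days

# A window of under ~60 days supports the bulk-registration claim.
print(window_days)
```

A tight window across dozens of domains, all under one registrant, is exactly the kind of pattern worth spelling out in a spam report.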
What do you guys think is the best step to take to report these particular websites to Google?
-
Hey Sha,
Thanks for that.
It would seem that would be the best bet.
-
Hi Matthew,
As Mark suggested, create a spreadsheet in Google Drive and list all of the domains/URLs in the spreadsheet.
Add a single URL in the first field of the Spam Report form (the required field), then in the third field provide a link to the spreadsheet (300-character max).
Something like: "I believe the sites listed in the googledoc at URL are using manipulative linking practices to influence search engine rankings"
Do be confident though, as Mark warned, that your client's site is able to withstand any scrutiny that might come with a review of sites in the niche.
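As a sketch of that first step, you could dump the domain list to a CSV and import it straight into Google Drive (the domain names here are placeholders for the real list):

```python
import csv

# Hypothetical list of the duplicated domains to report; swap in the real ones.
domains = [
    "keyword-town-a.example",
    "keyword-town-b.example",
    "keyword-town-c.example",
]

# One domain per row imports cleanly as a Google Drive spreadsheet.
with open("spam-domains.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["domain"])
    writer.writerows([d] for d in domains)
```

Once it's imported, set the sheet's sharing to "anyone with the link can view" so the reviewer can actually open it from the spam report.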
Hope that helps,
Sha
-
Sorry, I don't quite follow. How would this work?
-
I would try creating an open Google doc and then listing all of the sites in the network. Kind of like the reconsideration request method, where you link to an open Google doc with all of the details of the webmasters you contacted, their responses, success rate, etc.
-
Hey Mark,
Thanks for your response.
I'm aware of that link, but it only allows you to submit one URL at a time. In this case, there are tons of domains I'd have to submit. So, is there any way to submit multiple sites?
-
Hi Matthew,
Google has a specific form for reporting webspam - you can find the spam report here - https://www.google.com/webmasters/tools/spamreportform?hl=en
Before you submit a competitor, make sure your own site/s are clean - you don't want them looking too closely into your sector and finding a problem with you as well.
Mark