Why are these blackhat sites so successful?
-
Here's an interesting conundrum. Here are three sites with their respective rankings for "dental implants [city]":
http://dentalimplantsvaughan.ca - 9 (on google.ca)
http://dentalimplantsinhonoluluhi.com - 2 (on google.com)
http://dentalimplantssurreybc.ca - 7 (on google.ca)
These markets are not particularly competitive; however, all of these sites suffer from:
- Duplicate content, both internally and across sites (all of this company's implant sites have the exact same content, minus the bio pages and the local modifier).
- Average speed score.
- No structured data.
- No links.
And these sites are ranking relatively quickly. The Vaughan site went live 3 months ago.
But what's boggling my mind is that they rank on the first page at all. They seem to be doing the exact opposite of what you're supposed to do, yet they rank relatively well.
-
Not exactly. When it comes to different countries, like the example domains you listed above (.com and .ca), Google allows for mirrored or duplicate sites by country.
When it comes to multiple sites in the same country, Google will give value to the first use of the content, then no value to the second use. In the example you gave of San Diego and Atlanta, it is important to create unique content, citations, and backlinks that are localized to that site's location.
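One concrete way to localize two sites that share general content is location-specific structured data. A minimal sketch; the business name and address here are made up, not taken from the sites above:

```html
<!-- Hypothetical LocalBusiness markup for one of two sites sharing
     general content; only the location details differ per site. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Dentist",
  "name": "Vaughan Dental Implant Centre",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example Ave",
    "addressLocality": "Vaughan",
    "addressRegion": "ON",
    "addressCountry": "CA"
  },
  "areaServed": "Vaughan"
}
</script>
```

The sister site would carry the same template with its own name, address, and areaServed, giving Google an unambiguous locality signal even where the body copy overlaps.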
I have a client that has two separate appliance companies in the same area and two separate websites. I've used some of the same general content on both, but each has content that is unique to it, along with unique links, and they both rank really well.
-
I guess what baffles me is that there's duplicate, spammy content. Exactly what Google tells you to stay away from.
-
So, suppose a site is #1. It's for a bakery in Atlanta, Georgia. The content is doing really well. You're telling me that, as a bakery in San Diego CA, I can take that content, slap it on my site, replace the business name and location information, and it'd be okay?
-
Yes, if they are different businesses, they should be treated differently.
-
Even if those sites are for different practices/businesses?
-
I have reviewed similar sites. It comes down to the exact-match URL. Plus, the space has to be not overly competitive. No doubt there are 600 other factors, but the dominant standout is the URL. Ironically, once they're there, they're tough to dislodge; the site doing the dislodging has to be 10x better, not just twice as good.
-
Not a bad-looking site. Google does allow for duplicate content across multi-regional sites. And sometimes a new site will get an initial boost, then drop back down. Also, if there is not a lot of localized website competition, Google will rank them as the most relevant for this category in Hawaii.
-
Interesting: view-source:http://dentalimplantssurreybc.ca/faqs/
Related Questions
-
URL dynamic structure issue for a new global site to which I will redirect multiple well-performing sites.
Dear all, We are working on a new platform called https://www.piktalent.com, where basically we aim to redirect many smaller sites we have with quite a lot of SEO traffic related to internships. Our previous sites include www.spain-internship.com, www.europe-internship.com, and other similar sites we have (around 9). Our idea is to smoothly redirect the sites bit by bit to this new platform, which is a custom-made site in Python and Node, much more scalable, with plans to develop an app, etc., to become a bigger platform.

For the new site, we decided to create 3 areas for the main content: piktalent.com/opportunities (all the vacancies), piktalent.com/internships, and piktalent.com/jobs, so we can categorize the different types of pages we have; under /opportunities we have all the vacancies. The problem comes when the site generates the different static landing pages and dynamic searches. We have static landing pages like www.piktalent.com/internships/madrid, but the site also dynamically generates www.piktalent.com/opportunities?search=madrid. Most searches will generate that type of URL, not following the structure of domain name / type of vacancy / city / name of the vacancy.

I have been thinking of 2 potential solutions for this: either applying canonicals, or setting the suffix as non-indexed in Webmaster Tools... but what do you think is the right approach? I am worried about potential duplicate content and conflicts between the static content and the dynamic content. My CTO insists that the dynamic URLs have to be like that, but I am not 100% sure. Can someone provide input on this? Is there a way to block the dynamic URLs that are generated? Anyone with a similar experience? Regards,
Technical SEO | Jose_jimenez
-
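Of the two approaches weighed in the question above, a canonical tag is the lighter-touch option. A hedged sketch using the URLs from the question; which tag fits depends on whether the dynamic pages should be consolidated or kept out of the index entirely:

```html
<!-- On the dynamically generated page
     https://www.piktalent.com/opportunities?search=madrid -->

<!-- Option 1: point the dynamic URL at its static equivalent -->
<link rel="canonical" href="https://www.piktalent.com/internships/madrid" />

<!-- Option 2: keep dynamic search pages out of the index entirely
     (use instead of Option 1, not alongside it) -->
<meta name="robots" content="noindex, follow" />
```

A canonical has the advantage of consolidating any link equity the dynamic URL picks up, whereas noindex simply drops the page from the index.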
PWA for Desktop Site (Ecommerce)
Hi Folks, I need guidance about using a PWA on a desktop site. As I understand it, a PWA is basically used on mobile sites to engage visitors more and let them browse your site like an app. Would it be good SEO practice to use a PWA on a desktop site (e-commerce) by calling everything through JavaScript, letting the Google crawler cache only the site logo, and hiding everything else?
Technical SEO | Rajesh.Prajapati
-
Why is this site ranking higher?
We've put a fair bit of effort into delivering value here: https://lewescountycars.co.uk/ But a search for "Lewes taxis" or "taxis Lewes" puts this site above us: https://www.lewestowntaxis.co.uk/ As you can see, this is a tiny site that we outperform in most ways. What can we do to rank above it that we haven't already done? Thanks in advance - Gerard.
Technical SEO | Paul730
-
Discontinued Product on a Ecommerce site
To create a better customer experience, rather than removing a discontinued product from the site, we remove many links from the page and remove it from the site's navigation, but we keep the URL and show that the product can no longer be purchased. This keeps the links, keeps the content, and gives customers the opportunity to find other products we have. But I often wonder if we should just let these items 404 and be done with them. Here is an example: http://www.americanmusical.com/Item--i-dyn-bm5a-list. Any advice?
Technical SEO | dianeb152
-
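If the URLs are kept as described above, the "no longer available" state can also be expressed in structured data so search engines understand why the page persists. A hedged sketch; the product details are invented, not taken from the linked page:

```html
<!-- Hypothetical markup for a retired product page that stays live -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Discontinued Microphone",
  "offers": {
    "@type": "Offer",
    "availability": "https://schema.org/Discontinued",
    "priceCurrency": "USD",
    "price": "0"
  }
}
</script>
```

The alternative raised in the question, letting the page go, is usually done with a 410 (Gone) rather than a 404, since 410 tells crawlers the removal is deliberate and permanent.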
Use 302 redirect when site crashes
My company has switched to a new ecommerce platform that we are not totally familiar with yet. As we've worked with it, we've had a couple of situations where both the front and back ends of our site crashed simultaneously (always after installing a third-party module). The platform's built-in backup solution hasn't been an option in those situations, so we've been coming up with alternatives. We now have a duplicate of the site on our server for such emergencies. The plan is to have pages on the broken site point to the backup site using 302 redirects until the broken site is fixed. Is this correct usage of the 302 redirect? I often see people recommend never using 302 redirects, but I thought this might be the kind of situation where they'd be appropriate. If so, are there other SEO considerations we should keep in mind? For example, I'm wondering if we should put canonical tags on the temporary site that point to the broken site, so the broken site stays in the search engine indexes.
Technical SEO | Kyle_M
-
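On an Apache server, the temporary redirect described above might be sketched as follows; the hostnames are placeholders, not the company's real domains:

```apache
# .htaccess on the broken primary site: forward every request to the
# backup copy with a 302, so search engines treat the move as temporary
# and keep the primary URLs indexed.
RewriteEngine On
RewriteRule ^(.*)$ https://backup.example.com/$1 [R=302,L]
```

And as the question suggests, canonical tags on the temporary site pointing back to the primary URLs would reinforce which version should stay in the index while the outage lasts.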
Site removed from Google Index
Hi mozzers, Two months ago we published http://aquacion.com We registered it in Google Webmaster Tools, and after a few days the website was in the index, no problem. But now Webmaster Tools tells us the URLs were manually removed. I've looked everywhere in Webmaster Tools in search of more clues but haven't found anything that would help me. I sent the access to the client, who might have been stupid enough to remove his own site from the Google index, but now, even though I delete and add the sitemap again, the website won't show in Google SERPs. What's weird is that Google Webmaster Tools tells us all the pages are indexed. I'm totally clueless here... P.S.: Added screenshots from Google Webmaster Tools. Update: Turns out it was my mistake after all. When my client developed his website a few months ago, he published it, and I removed the website from the Google index. When the website was finished, I submitted the sitemap, thinking it would void the removal request, but it doesn't. How to solve: In Webmaster Tools, on the [Google Index => Remove URLs] page, you can re-include pages there.
Technical SEO | RichardPicard
-
Basic Multi-Site Question
Newb question. We run a site in multiple cities under the same domain. Oftentimes one city will provide content that is "syndicated" to other cities. For example, here is the master post: http://www.styleblueprint.com/food-and-entertaining/kale-salad-quick-healthy/ The content will also show up on the following subdomains: http://atlanta.styleblueprint.com/food-and-entertaining/kale-salad-quick-healthy/ http://birmingham.styleblueprint.com/food-and-entertaining/recipes/kale-salad-quick-healthy/ Should I be marking the posts in Atlanta and Birmingham as "noindex, nofollow" for SEO purposes? Thanks in advance, Jay
Technical SEO | SSBCI
-
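The tag asked about above is a meta robots directive placed on the syndicated copies; "noindex, follow" (rather than "noindex, nofollow") is the more common variant since it still lets link equity flow, and a cross-domain canonical is the other usual option. A sketch using the URLs from the question:

```html
<!-- On the syndicated copy at
     atlanta.styleblueprint.com/food-and-entertaining/kale-salad-quick-healthy/ -->

<!-- Option 1: keep the copy out of the index but let crawlers follow its links -->
<meta name="robots" content="noindex, follow" />

<!-- Option 2: attribute the content to the master post instead
     (use instead of Option 1, not alongside it) -->
<link rel="canonical" href="http://www.styleblueprint.com/food-and-entertaining/kale-salad-quick-healthy/" />
```

The canonical route is often preferred for syndication, since it consolidates ranking signals onto the master post rather than discarding the copies outright.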
On-Site Sitemaps - Guidance Required
Hi, I am looking to find good examples of on-site sitemaps. We already submit our XML sitemap regularly through GWMT, but I now wonder if we still need an on-site sitemap, as we have about 30 static pages and 300+ WordPress blog posts, which in a sense makes it a spammy page, as it has too many links and a higher-than-average keyword density. The reason I am looking for good examples is that I want to create a basic on-site sitemap that aids navigation but is styled to look OK as well. The solution I have in mind:

mydomain.com/link-example-one.php
mydomain.com/link-example-two.php
mydomain.com/link-example-ten.php

mydomain.com/blog then links to my 300 WP blog posts, broken down into chunks navigated by using breadcrumbs. Will Google crawl this OK, or should I stick to the current format, listing ALL posts on one page? Thanks
Technical SEO | tdsnet
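A minimal on-site sitemap along the lines sketched in the question might look like this, using the placeholder URLs above; the link labels are invented:

```html
<!-- Simple crawlable HTML sitemap: plain anchor links, no script-driven
     navigation, so both visitors and crawlers can follow every entry. -->
<nav aria-label="Sitemap">
  <h2>Site Map</h2>
  <ul>
    <li><a href="/link-example-one.php">Link example one</a></li>
    <li><a href="/link-example-two.php">Link example two</a></li>
    <li><a href="/link-example-ten.php">Link example ten</a></li>
    <li><a href="/blog/">Blog archive (posts grouped into smaller pages)</a></li>
  </ul>
</nav>
```

Breaking the 300 posts into category or date chunks under /blog/, as proposed, keeps each page's link count modest, which crawlers handle comfortably.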