How are these sites ranking!?!
-
One of our clients is in the insurance industry, and over the last 12 months we have seen an increasing number of low-quality, newly registered, spammy sites achieve top-5 rankings for major keywords, which in turn is hurting our client's rankings.
Does anyone have any idea how the following sites have managed to do this:
http://www.multiquotetaxi.co.uk/ - 2nd for taxi insurance
http://www.motortradefast.co.uk/ - 1st for motor trade insurance
http://www.traders-insurance.com/ - 3rd for motor trade insurance
http://www.multiquotefleet.co.uk/ - 1st for fleet insurance
We have tried reporting the above sites, tried holding out to see if they get penalised, and tried figuring out for ourselves what they have done, but we cannot see how they have managed it.
Any ideas at all?
-
David's right, unfortunately - even in the insurance industry, you see sites pop in and out, beating the big aggregators and private companies for short periods. It doesn't happen very often anymore with the blue-chip terms ([car insurance], [health insurance], etc.) but for the second-tier you can still see examples like this. Google isn't perfect at killing spam yet.
Keep in mind that some of the big algo updates regarding links in the past two years have not really been targeting spammers at all: they have been targeting otherwise legitimate businesses (from large corporations to "mom & pop shop" organisations) who've sought to use manipulative link dev tactics. The tactics that a lot of recent manual and algorithmic penalties have focused on are buying into link networks and schemes rather than outright automated, old-fashioned spam. You can still get rankings with blackhat / greyhat tactics and some of those rankings last a surprisingly long time.
I am not saying that the sites above are outright spam at all - they are affiliates who'll get burned in the end if their strategies aren't watertight, and they all seem to be owned by QuoteSearcher Ltd. Even the big guys in insurance get burned regularly - I'll stay away from citing actual examples due to past experience with agency-based client work, but big brands would rise and fall all the time during the five years I worked in insurance SEO.
-
Black hat still works and always will. You can wait and hope they lose rankings or decide how much risk you're willing to take and try to beat them at their own game.
-
If your domain authority or link profile is weaker than the competition's, you would probably rank lower. However, I do agree that there are websites which look ugly, are poorly optimised, and look spammy, yet still rank better. I tried reporting this to Google, but it is a slow process with no guarantees.
-
Hi Syed
If you look into it properly, you will see that pretty much all of the links these sites have come from completely spammy, unrelated sites, and they use over-optimised anchor text.
As well as this, all of the above domains are less than two years old, which I would class as relatively new, especially in the insurance industry, which is extremely competitive.
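For what it's worth, the "over-optimised anchor text" pattern is easy to quantify if you pull a backlink export from a tool like Open Site Explorer or Majestic. A minimal sketch (the CSV column name and the sample data here are hypothetical; real exports differ by tool):

```python
import csv
from collections import Counter
from io import StringIO

def anchor_text_distribution(csv_text, anchor_column="anchor_text"):
    """Count how often each anchor text appears in a backlink export.

    Assumes a CSV with one row per inbound link and a column holding
    that link's anchor text -- the column name is a placeholder.
    """
    counts = Counter()
    for row in csv.DictReader(StringIO(csv_text)):
        counts[row[anchor_column].strip().lower()] += 1
    return counts

def exact_match_share(counts, target_phrase):
    """Fraction of all links whose anchor is exactly the money keyword."""
    total = sum(counts.values())
    return counts[target_phrase.lower()] / total if total else 0.0

# Made-up data: a profile this heavily skewed toward one commercial
# phrase is the over-optimised pattern described above.
sample = "anchor_text\ntaxi insurance\ntaxi insurance\ntaxi insurance\nclick here\n"
counts = anchor_text_distribution(sample)
print(exact_match_share(counts, "taxi insurance"))  # 0.75
```

In a natural link profile the top anchors tend to be branded terms and the bare URL, so a very high exact-match share for a commercial keyword is the red flag.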
-
All of these websites have 40+ page authority, 30+ domain authority and 200+ external links ... I think these are not new websites.
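Domain age, at least, doesn't need to be argued about: a `whois` lookup on the registrar's record shows the registration date, and the age calculation is trivial. A small sketch (the example dates are made up, not the actual registration dates of the sites above):

```python
from datetime import date

def domain_age_years(registered_on, today=None):
    """Age of a domain in whole years, given its registration date
    (e.g. copied from the 'Registered on' field of a whois lookup)."""
    today = today or date.today()
    years = today.year - registered_on.year
    # Subtract one if this year's anniversary hasn't passed yet.
    if (today.month, today.day) < (registered_on.month, registered_on.day):
        years -= 1
    return years

# Hypothetical example: registered mid-2011, checked in early 2013.
print(domain_age_years(date(2011, 6, 1), today=date(2013, 2, 1)))  # 1
```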