Cloaking - does it still work? And how?
-
Hello,
Recently I've been reading all about the cloaking world.
I searched for information about it online and found this service: http://justcloakit.com/.
Since I'm pretty new to this whole "cloaking world," I have a few questions for the experts in this field. Does cloaking still work for SEO after all of Google's recent updates?
How easy is it for someone who doesn't have much experience with PHP and server administration?
Are there more sites like the example above? In general, I have the budget, and I don't think the technical side would be very hard to learn, but I just want to know whether this still works. Is it a good investment in your opinion? (It's not exactly cheap.)
Cheers, and thank you for your help
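For readers unfamiliar with the mechanics: "cloaking" usually means serving one page to search engine crawlers and a different page to human visitors, most often by sniffing the User-Agent header server-side. Here is a minimal Python sketch of the idea (the function names and bot list are hypothetical, shown only to illustrate the technique being discussed, not any particular service's code; serving different content to crawlers violates Google's guidelines):

```python
# Illustrative sketch of user-agent cloaking -- NOT an endorsement.
# Substrings commonly associated with search engine crawlers
# (hypothetical, minimal list for illustration only).
BOT_SIGNATURES = ("googlebot", "bingbot", "duckduckbot")

def is_search_bot(user_agent: str) -> bool:
    """Naive user-agent sniffing: does the UA string mention a known bot?"""
    ua = user_agent.lower()
    return any(sig in ua for sig in BOT_SIGNATURES)

def choose_page(user_agent: str) -> str:
    """Return the page variant a cloaking setup would serve."""
    if is_search_bot(user_agent):
        return "keyword-optimized page for crawlers"
    return "sales page for human visitors"

print(choose_page("Mozilla/5.0 (compatible; Googlebot/2.1)"))
print(choose_page("Mozilla/5.0 (Windows NT 10.0) Chrome/120.0"))
```

Part of why this is risky: Google can fetch the same URL with and without a crawler user agent (or from undisclosed IP ranges) and compare the responses, so naive sniffing like the above is straightforward to detect.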
-
There is no harm in testing stuff, but as Ruben says, don't put it on your main business website or host it on the same c-block.
I was spammier in my 'youth' and totally get where you're coming from - it can be a real buzz getting one over on Google. BUT I now work in a purely white hat way, and the buzz you get from doing it 'right' is much better - AND the results last longer, so it's a win-win for me!
-
I hear ya. It's always fun to run experiments and learn something new. Whatever your reasons for pursuing it, just make sure it doesn't link to any of your other sites and isn't hosted on the same c-block...at the minimum.
Best,
Ruben
-
Thank you for your response.
I know Google is smart these days, and I will never use it on one of my main sites.
I do have some domains lying around that I'd be happy to play with to see if I can get results.
I'm not talking about using it as my main strategy; it's more for experimenting. And don't you think you can learn a lot from this?
Where is Google's red line?
In which niches does it work, and in which doesn't it? I see it more as a research tool, and if I make some money from it, why not?
In the end, I believe Google is a crazy machine, but machines have bugs. I know most people here believe more in white hat, and so do I, but I hear a lot of crazy things about this.
Thank you
-
Don't do it. Ruben is right - it's risky and potentially dangerous. Use your budget for good.
-
The generic response is:
1. Your budget is better spent developing high quality, unique content.
2. This is a violation of Google's guidelines and will most likely get you penalized now or in the near future.
My response is:
1. If you want to commit yourself to trying to stay ahead of everyone at Google, by all means, have fun. However, by the time most black hat techniques have garnered as much attention as cloaking has, they have pretty much outlived their usefulness.
2. Admittedly, you're new to cloaking but confident you could figure it out...I just find that dangerous. I wouldn't implement something on my site that I didn't fully understand, especially if it's something Google doesn't like.
My two cents.
Best,
Ruben