Cloaking - is this still working? And how?
-
Hello,
Recently I've been reading about the whole cloaking world.
I searched for information about it online and found this service: http://justcloakit.com/.
Since I'm pretty new to this whole "cloaking world", I have a few questions for the experts in this field. Does this still work for SEO, given all the recent Google updates?
How easy is it for someone who doesn't have much experience with PHP and server-side stuff?
Are there more sites like the example above? In general, I have the budget, and I don't think it's very hard to learn the technical part, but I just want to know whether this is something
that still works. Is it a good investment in your opinion? (It's not exactly cheap.) Cheers, and thank you for your help.
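For context on what the "technical part" involves: cloaking boils down to serving one page to search-engine crawlers and a different page to human visitors, usually by inspecting the request's user-agent and IP address. Below is a minimal Python sketch of the user-agent half; the bot list and function name are illustrative assumptions, not taken from justcloakit or any other product.

```python
import re

# Patterns matching common search-engine crawler user-agents.
# Real cloaking services layer IP-range checks and reverse-DNS
# lookups on top of this; this list is illustrative only.
BOT_PATTERNS = re.compile(r"googlebot|bingbot|baiduspider", re.IGNORECASE)

def pick_content(user_agent: str) -> str:
    """Return which version of a page a cloaker would serve."""
    if BOT_PATTERNS.search(user_agent or ""):
        return "crawler-optimized page"  # shown only to search engines
    return "visitor page"                # shown to real users

print(pick_content("Mozilla/5.0 (compatible; Googlebot/2.1)"))  # crawler-optimized page
print(pick_content("Mozilla/5.0 (Windows NT 10.0)"))            # visitor page
```

Note that user-agent strings are trivial to fake, which is why serious services also verify Googlebot's published IP ranges via reverse DNS — and also exactly why Google can catch cloakers by crawling from unadvertised IPs with a normal browser user-agent.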
-
There is no harm in testing stuff, but as Ruben says, don't put it on your main business website or host it on the same C-block.
I've been spammier in my 'youth' and totally get where you're coming from - it can be a real buzz getting one over on Google. BUT I now work in a purely white hat way, and the buzz you get from doing it 'right' is much better - AND the results last longer, so it's a win-win for me!
-
I hear ya. It's always fun to run experiments and learn something new. Whatever your reasons for pursuing it, just make sure it doesn't link to any of your other sites and isn't hosted on the same C-block... at a minimum.
Best,
Ruben
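Since "same C-block" comes up twice in this thread: it refers to two IPv4 addresses sharing their first three octets (a /24 range), a signal Google can use to tie sites to a common owner. A quick way to check is sketched below; the addresses used are reserved documentation IPs, not real hosts.

```python
def same_c_block(ip_a: str, ip_b: str) -> bool:
    """Two IPv4 addresses share a C-block if their first three octets match."""
    return ip_a.split(".")[:3] == ip_b.split(".")[:3]

print(same_c_block("203.0.113.10", "203.0.113.99"))  # True  - same /24
print(same_c_block("203.0.113.10", "198.51.100.7"))  # False - different /24
```

In practice this is why the advice above says to host the experiment elsewhere: cheap hosts often put many customer sites on adjacent IPs in one /24, making the footprint easy to spot.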
-
Thank you for your response.
I know Google is smart these days, and I will never use it on one of my main sites.
I do have some domains lying around that I'd be happy to play with, to see if I can get results.
I'm not talking about using it as my main strategy; it's more for experimenting. And don't you think you can learn a lot from this?
Where is Google's red line?
Which niches does it work in, and which not? I see it more as a research tool, and if I make some money from it, why not.
In the end, I believe Google is a crazy machine, and machines have bugs. I know most people here believe more in white hat, and so do I, but I hear a lot of crazy things about this.
Thank you
-
Don't do it. Ruben is right: it's risky and potentially dangerous. Use your budget for good.
-
The generic response is:
1. Your budget is better spent developing high-quality, unique content.
2. This is a violation of Google's guidelines and will most likely get you penalized, now or in the near future.
My response is:
1. If you want to commit yourself to trying to stay ahead of everyone at Google, by all means, have fun. However, by the time most black hat techniques have garnered as much attention as cloaking has, they have pretty much outlived their usefulness.
2. Admittedly, you're new to cloaking but confident you could figure it out... I just find that dangerous. I wouldn't implement something on my site that I didn't fully understand, especially if it's something Google doesn't like.
My two cents.
Best,
Ruben