Client wants to show 2 different types of content based on cookie usage - potential cloaking issue?
-
Hi,
A client of mine has compliance issues in their industry and has to show two different types of content to visitors, each currently in its own section of the site.
Next year, they have to increase that to three different types of customer. Rather than creating a third section (/customer-c/), their web development agency is suggesting changing the content based on cookies, because the new type is very similar to one of the existing types (customer-b). If a user has identified themselves as customer-b, they'll be shown /customer-b/, but if they've identified themselves as customer-c, they'll see a different version of /customer-b/. In other words, the URL won't change, but the content on the page will change based on their cookie selection.
I'm uneasy about this from an SEO POV because:
- Google will only be able to see one version (/customer-b/ presumably), so it might miss out on indexing valuable /customer-c/ content,
- It makes sense to separate them into three URL paths so that Google can index them all,
- It feels like a form of cloaking - i.e. Google only sees one version, when two versions are actually available.
I've done some research but everything I'm seeing is saying that it's fine, that it's not a form of cloaking. I can't find any examples specific to this situation though. Any input/advice would be appreciated.
Note: The content isn't shown differently based on geography - i.e. these three customers would be within one country (e.g. the UK), which means that hreflang/geo-targeting won't be a workaround unfortunately.
-
Thanks Peter - I didn't know you could do that. I'll pass it on to the developers (who might already know, but it wouldn't hurt to reinforce its importance).
-
Thanks Russ. I think the differences in content between the two versions will only be minor/superficial, so I guess the approach makes sense and shouldn't affect the SEO side of things too much.
-
You can safely return the same page with different content based on a cookie. Just don't forget to add "Vary: Cookie" to the response headers. This tells browsers, caches, and bots that the content at this URL varies depending on the cookie.
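For anyone wiring this up, a minimal sketch of that approach in Node/Express might look like the following (the stack, the route path /customer-b/ and the cookie name customerType are assumptions for illustration, not something confirmed in this thread):

    import express from "express";
    import cookieParser from "cookie-parser";

    const app = express();
    app.use(cookieParser()); // makes req.cookies available

    // Hypothetical page content for the two variants served at the same URL.
    const customerBContent = "<h1>Customer B content</h1>";
    const customerCContent = "<h1>Customer C content (variant of /customer-b/)</h1>";

    app.get("/customer-b/", (req, res) => {
      // Tell caches and bots that this URL's response depends on the Cookie header.
      res.set("Vary", "Cookie");

      // "customerType" is a hypothetical cookie set when a visitor self-identifies.
      // A request with no cookies falls through to the default version.
      if (req.cookies.customerType === "customer-c") {
        res.send(customerCContent);
      } else {
        res.send(customerBContent);
      }
    });

    app.listen(3000);

Note that a cookieless crawler will always receive the default customer-b version, which is the behaviour described in the original question.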
-
I think this sounds perfectly fine. It is highly unlikely that you will see any problems from this; just don't expect to rank for content that is hidden behind cookie-based authentication. It might not be best practice in Google's eyes, but it isn't going to trigger any kind of penalty.
Related Questions
-
Defining duplicate content
If you have the same sentences or paragraphs on multiple pages of your website, is this considered duplicate content and will it hurt SEO?
Intermediate & Advanced SEO | mnapier12
-
Different content on pages with the same URL--except one is at www and the other at www2
Hi! I have two pages with unique content on each. However, they have virtually the same URL--except one is a www and the other is a www2. As far as I know, both pages were meant to gain organic traction. How should this situation be handled for SEO purposes? Thanks for any help! ---Ivey
Intermediate & Advanced SEO | Nichiha
-
Building a product clients will integrate into their sites: What is the best way to utilize my clients' unique domain names?
I'm designing a hosted product my clients will integrate into their websites; their end users would access it via my clients' customer-facing websites. It is a product my clients pay for which provides a service to their end users, who would have to log in to my product via a link provided by my clients. Most clients would choose to incorporate this link prominently on their home page and site nav. All clients will be in the same vertical market, so their sites will be keyword-rich and related to my site. Many may even be .orgs and .edus. The way I see it, there are three main ways I could set this up within the product. I want to know which is most beneficial, or if I'm missing anything.
1: They set up a subdomain at their domain that serves content from my domain. product.theirdomain.com would render content from mydomain.com's database. product.theirdomain.com could have footer and/or other nofollow links to mydomain.com with target keywords. The risk I see here is having hundreds of sites with the same target keyword linking back to my domain. This may be the worst option, as I'm not sure whether the nofollow will help, because I know Google considers this kind of link to be a link scheme: https://support.google.com/webmasters/answer/66356?hl=en
2: They link to a subdomain on mydomain.com from their nav/site. Their nav would include an actual link to product.mydomain.com/theircompanyname. Each client would have a different "theircompanyname" link. They would decide and/or create their link method (graphic, presence of alt tag, text, what text, etc.). I would have no control aside from requiring them to link to that URL on my server.
3: They link to a subdirectory on mydomain.com from their nav/site. Their nav would include an actual link to mydomain.com/product/theircompanyname. Each client would have a different "theircompanyname" link. They would decide and/or create their link method (graphic, presence of alt tag, text, what text, etc.). I would have no control aside from requiring them to link to that URL on my server.
In all scenarios, my marketing content would be set up around mydomain.com, both as static content and a blog directory, all with SEO-attractive URL slugs. I'm leaning towards option 3, but would like input!
Intermediate & Advanced SEO | emzeegee
-
Have a Robots.txt Issue
I have a robots.txt file error that is causing me loads of headaches and is making my website fall off the SE grid. Moz and other sites are saying that I have blocked all search engines from finding it. Could it be as simple as this: I created a new website and forgot to re-create a robots.txt file for the new site, so crawlers were still trying to find the old one? I have just created a new one. Google Search Console still shows severe health issues found in the property and says that robots.txt is blocking important pages. Does this take time to refresh? Is there something I'm missing that someone here in the Moz community could help me with?
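For reference, the difference between a robots.txt that blocks everything and one that blocks nothing is a single character, so it's worth checking the live file directly. A generic sketch (not your actual file):

    # Blocks every crawler from the entire site - this is what triggers those warnings
    User-agent: *
    Disallow: /

    # Blocks nothing - an empty Disallow allows all crawling
    User-agent: *
    Disallow:

Search Console re-reads the live robots.txt on its own schedule, so warnings can linger for a while after the file is fixed.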
Intermediate & Advanced SEO | primemediaconsultants
-
Wordpress uploads folder issues
Hi, I have recently moved my WordPress blog to a new server. Previously it had the URL website.com/blog; my blog site is now running on the domain website.com. Most of my images are in the correct folder path, wp-content/uploads. However, some of my images are pointing to the folder /blog/wp-content/uploads, so I am getting many missing images on the front end. How do I get /blog/wp-content/uploads to point to the new wp-content/uploads path? Thanks guys. Taiger
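If the site runs on Apache, one line in the root .htaccess can map the old /blog/ upload URLs onto the new path (a sketch, assuming the files now physically live under /wp-content/uploads/):

    # Permanently redirect old /blog/ upload URLs to the new uploads location
    RedirectMatch 301 ^/blog/wp-content/uploads/(.*)$ /wp-content/uploads/$1

The more permanent fix is usually a search-and-replace on the old /blog/ URLs in the WordPress database so posts reference the new path directly, with the redirect kept as a safety net for links from elsewhere.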
Intermediate & Advanced SEO | Taiger
-
Using the same picture, but 2 differents pages and Alt descriptions
Hi Moz experts, I have a quick technical question for you. We would like to use the same picture in more than one place on the website, but with a different alt description each time, and customise each use so it doesn't look identical to the other. This strategy is to save the cost of new pictures. Do you think Google will see this as identical content? Thank you for your input on this question.
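As a quick illustration, reusing one image file with different alt text is simply a matter of changing the alt attribute at each placement (the file name here is hypothetical):

    <!-- Product page: reuses the image file, alt text written for this context -->
    <img src="/images/red-widget.jpg" alt="Red widget viewed from the front">

    <!-- Blog post: identical src, different alt text describing its role on this page -->
    <img src="/images/red-widget.jpg" alt="Red widget next to a coin for scale">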
Intermediate & Advanced SEO | johncurlee
-
301 or 404 Question for thin content Location Pages we want to remove
Hello All, I have a hire website with many categories and individual location pages for each of the 70 depots we operate. However, being dynamic pages, we have thousands of thin-content pages. We have decided to concentrate only on our best-performing locations and get rid of the rest, as it's physically impossible to write unique content for every location page in every category. My question is: would it cause me problems to have too many 301s for the location pages I am going to redirect (I was only going to send these back to the parent category page), or should I just 404 all those location pages and, at some point in the future when we are in a position to concentrate on these locations, redo them with new content? In terms of URL numbers, it would affect a few thousand 301s or 404s, depending on which route people recommend. Also, does anyone know what percentage of thin content on a site is acceptable? I know none is best in an ideal world, but it would be easier if we could get away with a small percentage. We have been affected by Panda, so we are trying to tidy things up as best as possible. Any advice greatly appreciated. Thanks, Peter
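If you do go the 301 route on Apache, a single pattern rule can sweep up all the retired location pages rather than listing thousands of individual redirects (a sketch only; the /hire/<category>/locations/<depot>/ structure is hypothetical, so adjust the pattern to your real URLs):

    # Send every retired depot/location page back to its parent category page
    RedirectMatch 301 ^/hire/([^/]+)/locations/[^/]+/?$ /hire/$1/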
Intermediate & Advanced SEO | PeteC12
-
Site Structured Navigated by Cookies
Is it advisable to have a site structure that is navigated via URLs rather than cookies, on a website that has several location-based pages, each with its own functions and information? Is this an SEO priority? Will it help to combat duplicate content? Any help would be greatly appreciated!
Intermediate & Advanced SEO | J_Sinclair