Client wants to show 2 different types of content based on cookie usage - potential cloaking issue?
-
Hi,
A client of mine has compliance issues in their industry and has to show two different types of content to visitors.
Next year, they have to increase that to three different types of customer. Rather than creating a third section (/customer-c/), because it's very similar to one of the existing customer types (customer-b), their web development agency is suggesting changing the content based on cookies: if a user has identified themselves as customer-b, they'll be shown /customer-b/, but if they've identified themselves as customer-c, they'll see a different version of /customer-b/. In other words, the URL won't change, but the content on the page will, based on their cookie selection.
I'm uneasy about this from an SEO POV because:
- Google will only be able to see one version (/customer-b/ presumably), so it might miss out on indexing valuable /customer-c/ content,
- It makes sense to separate them into three URL paths so that Google can index them all,
- It feels like a form of cloaking - i.e. Google only sees one version, when two versions are actually available.
I've done some research but everything I'm seeing is saying that it's fine, that it's not a form of cloaking. I can't find any examples specific to this situation though. Any input/advice would be appreciated.
Note: The content isn't shown differently based on geography - i.e. these three customers would be within one country (e.g. the UK), which means that hreflang/geo-targeting won't be a workaround unfortunately.
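To make the setup concrete, here is a minimal sketch of what the agency seems to be proposing - one URL whose body is chosen by a customer_type cookie. All names here are illustrative, not the client's actual code. Note that a crawler sends no cookie, so it only ever receives the default version, which is exactly the indexing concern above:

```python
# Hypothetical sketch of the proposed setup: one URL (/customer-b/),
# content picked by a "customer_type" cookie. Names are made up.

CONTENT = {
    "customer-b": "Customer B version of the page",
    "customer-c": "Customer C version of the page",
}

def page_for(cookie_header: str) -> str:
    """Return the /customer-b/ body for a given Cookie request header."""
    cookies = dict(
        pair.strip().split("=", 1)
        for pair in cookie_header.split(";")
        if "=" in pair
    )
    # A crawler sends no Cookie header, so it only ever sees the default.
    return CONTENT.get(cookies.get("customer_type"), CONTENT["customer-b"])
```

With no cookie, `page_for("")` returns the customer-b default - which is all Googlebot would ever index.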
-
Thanks Peter - I didn't know you could do that. I'll pass it on to the developers (who might already know, but wouldn't hurt to reinforce its importance).
-
Thanks Russ. I think the differences in content between the two versions will only be minor/superficial, so I guess the approach makes sense and shouldn't affect the SEO side of things too much.
-
You can safely return the same page with different content based on a cookie. Just don't forget to add a "Vary: Cookie" response header. This tells browsers and bots that the content differs based on the cookie.
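A minimal sketch of the header setup described above, as a framework-free handler (the function shape and names are my own, not any real library's API):

```python
def build_response(cookie_header: str):
    """Return (status, headers, body) for a cookie-varied page,
    advertising the variation with a Vary: Cookie header."""
    body = ("Customer C content"
            if "customer_type=customer-c" in cookie_header
            else "Customer B content")
    headers = {
        "Content-Type": "text/html; charset=utf-8",
        # Tell caches, browsers, and bots the body depends on the cookie.
        "Vary": "Cookie",
    }
    return "200 OK", headers, body
```

The important part is that every response for the URL carries `Vary: Cookie`, regardless of which body variant is served.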
-
I think this sounds perfectly fine. It is highly unlikely that you will see any problems from this; just don't expect to rank for content that is hidden behind cookie-based authentication. It might not be best practice in Google's eyes, but it isn't going to trigger any kind of penalty.
Related Questions
-
The client wants to close the current e-commerce website and open a new one.
The client wants to close the current e-commerce website and open a new one on a completely different engine without losing income. I have no idea how to approach this topic. The old site has over 100,000 pages and in terms of SEO is quite great - we hit almost every important keyword in our niche - but thanks to heavy modifications of the source code the site became unmaintainable. Content on the new shop will be almost 1:1 with the old page, but: the domain will be different (I can't explain to the client that this will damage our core brand). Because of that I'm pushing the idea of going with a brandname.com/shop domain instead of newshop.com, because our main brand is well known to our customers - not as much as the old shop, but still better than the new shop brand. The engine and design will be different, and we will lose almost 30,000 backlinks. Budget: only IT. No content and SEO tools budget. BONUS: the client hired some "SEO magician" before me - now the SEO audit score with tools like Ahrefs etc. is around 6-12% for the 100,000 pages on the new shop. Great. Does anyone have an idea how to approach such a task with minimal losses?
Intermediate & Advanced SEO | | meliegree0 -
Scraped content ranking above the original source content in Google.
I need insights on how "scraped" content (an exact copy-pasted version) can rank above the original content in Google. Four original, in-depth articles published by my client (an online publisher) were republished by another company (which happens to be briefly mentioned in all four of those articles). We reckon the articles were re-published at least a day or two after the originals (the exact gap is not known). We find that all four of the "copied" articles rank at the top of Google search results, whereas the original content, i.e. my client's website, does not show up even in the top 50 or 60 results. We have looked at numerous factors such as Domain Authority, Page Authority, in-bound links to both the original source and the URLs of the copied pages, social metrics, etc. All of the metrics, as shown by tools like Moz, are better for the source website than for the re-publisher. We have also compared results in different geographies to see if any geographical bias was affecting results, the reason being that our client's website is hosted in the UK and the 're-publisher' is from another country - but we found the same results. We are also not aware of any manual actions taken against our client's website (at least based on messages in Search Console). Are there any other factors that can explain this serious anomaly - which seems to be a disincentive for somebody creating highly relevant original content? We recognize that our client has the option to submit a 'Scraper Content' form to Google, but we are less keen to go down that route and more keen to understand why this problem could arise in the first place. Please suggest.
Intermediate & Advanced SEO | | ontarget-media0 -
Duplicate Content Issues :(
I am wondering how we can solve our duplicate content issues. Here is the thing: there are so many ways you can write a description of a used watch. http://beckertime.com/product/mens-rolex-air-king-no-date-stainless-steel-watch-wsilver-dial-5500/ http://beckertime.com/product/mens-rolex-air-king-stainless-steel-date-watch-wblue-dial-5500/ What's different between these two? The dial color. We have a lot of the same model numbers but with different conditions, dial colors, and bands. What ideas do you have?
Intermediate & Advanced SEO | | KingRosales0 -
Anchor Text Usage
Hi, What is the best way to use anchor text during link building after the recent updates from Google? I'm thinking of doing the following:
60% brand keyword (my site name)
20% "click here", "visit this site", etc.
20% myurl.com
10% a mix of both broad & phrase match of my targeted keyword
What do you suggest? Does anyone have a working strategy? Will be waiting for your replies...
Intermediate & Advanced SEO | | Vegitss0 -
Duplicate Page Title/Content Issues on Product Review Submission Pages
Hi Everyone, I'm very green to SEO. I have a Volusion-based storefront and recently decided to dedicate more time and effort into improving my online presence. Admittedly, I'm mostly a lurker in the Q&A forum but I couldn't find any pre-existing info regarding my situation. It could be out there. But again, I'm a noob... So, in my recent SEOmoz report I noticed that over 1,000 Duplicate Content Errors and Duplicate Page Title Errors have been found since my last crawl. I can see that every error is tied to a product in my inventory - specifically each product page has an option to write a review. It looks like the subsequent page where a visitor can fill out their review is the stem of the problem. All of my products are shown to have the same issue: Duplicate Page Title - Review:New Duplicate Page Content - the form is already partially filled out with the corresponding product My first question - It makes sense that a page containing a submission form would have the same title and content. But why is it being indexed, or crawled (or both for that matter) under every parameter in which it could be accessed (product A, B, C, etc)? My second question (an obvious one) - What can I do to begin to resolve this? As far as I know, I haven't touched this option included in Volusion other than to simply implement it. If I'm missing any key information, please point me in the right direction and I'll respond with any additional relevant information on my end. Many thanks in advance!
Intermediate & Advanced SEO | | DakotahW0 -
Virtual Domains and Duplicate Content
So I work for an organization that uses virtual domains. Basically, we have all our sites on one domain, and these sites can also be shown at a different URL. Example: sub.agencysite.com/store sub.brandsite.com/store Now the problem comes up often when we move the site to a brand's URL versus hosting the site on our URL: we end up with duplicate content. Now for god knows what damn reason, I currently cannot get my dev team to implement 301s, but they will implement 302s. (Don't ask) I am also left with not being able to change the robots.txt file for our site. They say if we allowed people to go in and change this stuff it would be too messy and somebody would accidentally block a site that was not supposed to be blocked on our domain. (We are apparently incapable toddlers) Now I have an old site, sub.agencysite.com/store, ranking for my terms while the new site is not showing up. So I am left with this question: if I want to get the new site ranking, what is the best methodology? I am thinking of doing a 1:1 mapping of all pages, setting up 302 redirects from the old to the new, and then making the canonical tags on the old reflect the new. My only thing here is how will Google actually view this setup? I mean on one hand I am saying "Hey, Googs, this is just a temp thing." and on the other I am saying "Hey, Googs, give all the weight to this page, got it? Graci!" So with my limited abilities, can anybody provide me a best case scenario?
Intermediate & Advanced SEO | | DRSearchEngOpt0 -
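For what it's worth, the setup described in that question (a 1:1 URL mapping, 302 redirects from old to new, and canonicals on the old pages pointing at the new) could be sketched like this - all domains and paths here are illustrative, taken loosely from the example hostnames in the question:

```python
# Illustrative sketch only: old URLs 302 to their new counterparts,
# while the old pages' canonical tags point at the new URLs.

URL_MAP = {  # 1:1 mapping, old -> new (made-up paths)
    "sub.agencysite.com/store": "sub.brandsite.com/store",
    "sub.agencysite.com/store/faq": "sub.brandsite.com/store/faq",
}

def redirect_headers(old_url: str) -> dict:
    """302 response headers for an old URL (the 'temp thing')."""
    return {"Status": "302 Found",
            "Location": "https://" + URL_MAP[old_url]}

def canonical_tag(old_url: str) -> str:
    """Canonical tag served on the old page, pointing at the new URL."""
    return f'<link rel="canonical" href="https://{URL_MAP[old_url]}">'
```

The asker's worry is visible right in the sketch: the 302 says "temporary", while the canonical says "this other URL is the real one", which are mixed signals for the same pair of pages.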
2 sites or one site: 2 locations
Hello, I have a dog training client who is offering services in 2 separate locations. We're looking to be first in the non-local search results and also rank well in Google Places. Would it be better to go for 2 separate sites, or one site and try to rank for 2 different locations with it? There are both local and standard search results when we type in our keywords. Thanks!
Intermediate & Advanced SEO | | BobGW0 -
SEOMoz Internal Dupe. Content & Possible Coding Issues
SEOmoz Community! I have a relatively complicated SEO issue that has me pretty stumped... First and foremost, I'd appreciate any suggestions that you all may have. I'll be the first to admit that I am not an SEO expert (though I am trying to be). Most of my expertise is with PPC. But that's beside the point. Now, the issues I am having: I have two sites: http://www.federalautoloan.com/Default.aspx and http://www.federalmortgageservices.com/Default.aspx A lot of our SEO efforts thus far have done good things for Federal Auto Loan, and we are seeing positive impacts from them. However, we recently did a server transfer (which may or may not be related), and since that time a significant number of INTERNAL duplicate content pages have appeared through the SEOmoz crawler. The number is around 20+ for both Federal Auto Loan and Federal Mortgage Services (see attachments). I've tried to include as much as I can via the attachments. What you will see is all of the content pages (articles) with duplicate content issues, along with a screen capture of the articles being listed as duplicates for the pages: Car Financing How It Works, A Home Loan is Possible with Bad Credit (Please let me know if you could use more examples). At first I assumed it was simply an issue with SEOmoz... however, I am now worried it is impacting my sites (I wasn't originally, because Federal Auto Loan has great quality scores and is climbing in organic presence daily). That being said, we recently launched Federal Mortgage Services for PPC... and my quality scores are relatively poor. In fact, we are not even ranking (scratch that, not even showing that we have content) for "mortgage refinance", even though we have content (unique, good, and original content) specifically around "mortgage refinance" keywords. All things considered, Federal Mortgage Services should be tighter in the SEO department than Federal Auto Loan... but it is clearly not! I could really use some significant help here... Both of our sites have a number of access points: http://www.federalautoloan.com/Default.aspx and http://www.federalmortgageservices.com/Default.aspx are the designated home pages, and I have rel=canonical tags stating such. However, my sites can also be reached via the following: http://www.federalautoloan.com http://www.federalautoloan.com/default.aspx http://www.federalmortgageservices.com http://www.federalmortgageservics.com/default.aspx Should I incorporate code that redirects traffic as well? Or is it fine with just the canonical tags? I apologize for such a long post, but I wanted to include as much as possible up-front. If you have any further questions, I'll be happy to include more details. Thank you all in advance for the help! I greatly appreciate it!
Intermediate & Advanced SEO | | WPColt0