Client wants to show 2 different types of content based on cookie usage - potential cloaking issue?
-
Hi,
A client of mine has compliance issues in their industry and has to show two different types of content to visitors, currently handled as two separate sections of the site (/customer-a/ and /customer-b/).
Next year, they have to increase that to three different types of customer. Because the third type (customer-c) is very similar to one of the existing types (customer-b), their web development agency is suggesting changing the content based on cookies rather than creating a third section. If a user has identified themselves as customer-b, they'll be shown /customer-b/; if they've identified themselves as customer-c, they'll see a different version of /customer-b/. In other words, the URL won't change, but the content on the page will change based on their cookie selection.
I'm uneasy about this from an SEO POV because:
- Google will only be able to see one version (/customer-b/ presumably), so it might miss out on indexing valuable /customer-c/ content,
- It makes sense to separate them into three URL paths so that Google can index them all,
- It feels like a form of cloaking - i.e. Google only sees one version, when two versions are actually available.
I've done some research, but everything I've found says it's fine and that it's not a form of cloaking. I can't find any examples specific to this situation though. Any input/advice would be appreciated.
Note: The content isn't shown differently based on geography - i.e. all three customer types would be within one country (e.g. the UK), which means hreflang/geo-targeting won't be a workaround, unfortunately.
-
Thanks Peter - I didn't know you could do that. I'll pass it on to the developers (who might already know, but it wouldn't hurt to reinforce its importance).
-
Thanks Russ. I think the differences in content between the two versions will only be minor/superficial, so I guess the approach makes sense and shouldn't affect the SEO side of things too much.
-
You can safely return the same page with different content based on a cookie. Just don't forget to add a "Vary: Cookie" response header. This tells browsers and bots that the content served at this URL differs depending on the cookie.
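For the developers' reference, here's a minimal sketch of what that could look like in a Node.js/Express handler, written in TypeScript. The route (/customer-b/), the cookie name (customerType), and the value (customer-c) are hypothetical placeholders for illustration, not the client's actual implementation:

```typescript
import express from "express";

const app = express();

// Hypothetical cookie name/value; the client's real setup may differ.
const CUSTOMER_COOKIE = "customerType";

app.get("/customer-b/", (req, res) => {
  // Declare that this URL's response depends on the Cookie request header,
  // so caches, browsers, and bots know more than one variant exists.
  res.vary("Cookie");

  // Parse the raw Cookie header and check whether the visitor has
  // self-identified as customer-c.
  const cookieHeader = req.headers.cookie ?? "";
  const isCustomerC = cookieHeader
    .split(";")
    .map((part) => part.trim())
    .includes(`${CUSTOMER_COOKIE}=customer-c`);

  // Same URL, different body, depending on the cookie.
  res.send(
    isCustomerC
      ? "<h1>Customer C content</h1>"
      : "<h1>Customer B content</h1>"
  );
});

app.listen(3000);
```

You can confirm the header is being returned with curl -I https://example.com/customer-b/ and looking for Vary: Cookie in the response. One caveat: Vary: Cookie stops caches and intermediaries from serving the wrong variant, but it won't make Googlebot crawl the hidden version - Googlebot generally crawls without cookies, so it will still only see the default /customer-b/ content.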
-
I think this sounds perfectly fine. It is highly unlikely that you will see any problems from this; just don't expect to rank for content that is hidden behind cookie-based authentication. It might not be best practice in Google's eyes, but it isn't going to trigger any kind of penalty.