Using a lot of "Read More" Hidden text
-
My site has a LOT of "read more" links, and when a user clicks one they see a lot of text. "Read more" is dark blue, bold, and clear to the user. It is perfect for the user experience, since right below I have pictures and videos, which are what most users want.
Question: I expect few users will click "Read more" (though some will appreciate the chance to read and learn more), and I wonder whether search engines may think I am hiding text, making this a risky approach, or whether they will simply discount the text as having zero value from an SEO perspective.
Or, equally important: if the text was NOT hidden behind a "Read more", would it actually carry more SEO value than if it is hidden, even though users will not read it anyway? If so, the reason may be that when the text is not hidden, search engines cannot see that users are not reading it, so it carries more weight from an SEO perspective than text on pages where it is hidden under a "Read more" that users rarely click.
-
Hi khi5
I analyzed your page. You are doing just fine. You are using CSS display: none, and you are not doing any cloaking.
You are doing the right thing:
1. Not fooling Google.
2. Not fooling the user.
3. Giving the user a better user experience.
Don't worry, you are not applying any "black hat" technique. You will not get penalized.
-
Thanks, Anirban. I am not a programmer, so would you be able to tell me if this approach seems right: http://www.honoluluhi5.com/oahu/honolulu-condos/ - I don't know whether it uses CSS or display: none.
I can't think of a better layout for that page, and hiding the text the way I have done it is ideal for users. If I showed more text, surely the bounce rate would go up!
-
A technique used by many huge sites is to pre-load code, navigation, or content in the background so that it can be displayed dynamically as needed. The most common way of accomplishing this is the CSS display: none property.
Unfortunately, display: none can also be used simply to hide text, and this is where the perceived problem comes in. People worry that using display: none to hide content (and show it only when the user asks for it), or to hold content meant for screen readers, can get them into trouble. The legitimate use of this technique is so prevalent that I would rarely expect search engines to penalize a site just for using display: none. It is very difficult to implement an algorithm that could truly ferret out whether a particular use of display: none is meant to deceive search engines or not.
I usually use this tactic to make the page more user friendly, and it is useful for the user too. The user doesn't get bombarded by a large block of content, and I am not fooling the user or Google; I am giving the user the option to read more if they want to.
"display: none"
What it does: the functionality stays the same - when the user clicks "read more" the section opens, and when the user clicks "less" it closes.
How it avoids the "cloaking" issue: when Google crawls your page (as a text-based client, without JavaScript), the full content is there, and when a user views the page there is a "read more" link that shows the full content when clicked. You are not showing two different things to Google and to the user, so there shouldn't be a cloaking problem. It's tested.
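To make the mechanics concrete, here is a minimal, hypothetical sketch of such a "read more" toggle built on display: none. The markup, class names, and handler are illustrative only (they are not taken from the poster's site); the point is that the full text is always present in the HTML that crawlers fetch, and the CSS merely controls whether it is shown to the visitor.

```html
<style>
  .read-more-body { display: none; }          /* collapsed until requested */
  .read-more-body.is-open { display: block; } /* revealed after the click */
  .read-more-toggle { color: darkblue; font-weight: bold; cursor: pointer; }
</style>

<p>Short intro paragraph that every visitor sees.</p>
<a class="read-more-toggle" onclick="toggleReadMore(this)">Read more</a>
<div class="read-more-body">
  Longer descriptive text. It is present in the page source for both users and
  search engines; it is simply not displayed until the visitor asks for it.
</div>

<script>
  // Flip the hidden block open/closed and update the link label.
  function toggleReadMore(link) {
    var body = link.nextElementSibling;
    var open = body.classList.toggle('is-open');
    link.textContent = open ? 'Less' : 'Read more';
  }
</script>
```

Clicking the link only flips a class; the same HTML is served to search engines and to users, which is why this pattern is generally treated as expandable content rather than cloaking.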
Hope this helps...
Also refer to: http://moz.com/community/q/would-using-display-none-to-hide-a-section-of-text-effect-seo-negatively
-
Wow -- thanks for the links! I learn something new every day.
I'll defer to others on your specific question since I haven't ever worked with sites that specifically do what you do. I hope someone will give you a good answer!
-
http://searchengineland.com/googles-matt-cutts-on-hidden-text-using-expandable-sections-youll-be-in-good-shape-167753 - this is another, more relevant Matt Cutts piece, which again mentions that it is OK to use those "read more" sections.
Again, my bigger concern is whether it really is OK, or whether I am safer off showing all the text if possible.
-
Thanks, Sam. Here is a video from Matt Cutts: https://www.youtube.com/watch?v=UpK1VGJN4XY - it appears Google is OK with hidden text that makes sense for the user.
For my site I have a lot of "read more" sections like these:
http://www.honoluluhi5.com/oahu-condos/
http://www.honoluluhi5.com/oahu/honolulu-city-real-estate/
As you can see from those 2 links, I have created them with only the user in mind and nothing else. To play it safe, maybe I should just show all the text somehow, even though it compromises the user experience.
-
The answer to your question lies in another question: Do search engines see one thing and users see another? If the answer is "yes," then you are using "cloaking" -- which is a very bad black-hat SEO technique. It can get you penalized and possibly banned.
Users don't see the text unless they click "read more", but search engines will see the text either way? That's cloaking. I'd stop doing this right away.
Related Questions
-
Does anyone know how to fix this structured data error on search console? Invalid value in field "itemtype"
I'm getting the same structured data error in Search Console for most of my websites: Invalid value in field "itemtype". I took off all the structured data but am still having this problem; according to Search Console it is a syntax problem, but I can't find what is causing it. Any guess, suggestion, or solution for this?
Intermediate & Advanced SEO | | Alexanders0 -
[Very Urgent] More than 100 "/search/adult-site-keywords" Crawl errors under Search Console
I just opened my G Search Console and was shocked to see more than 150 Not Found errors under Crawl errors. Mine is a Wordpress site (it's consistently updated too): Here's how they show up: Example 1: URL: www.example.com/search/adult-site-keyword/page2.html/feed/rss2 Linked From: http://an-adult-image-hosting.com/search/adult-site-keyword/page2.html Example 2 (this surprised me the most when I looked at the linked from data): URL: www.example.com/search/adult-site-keyword-2.html/page/3/ Linked From: www.example.com/search/adult-site-keyword-2.html/page/2/ (this is showing as if it's from our own site) http://a-spammy-adult-site.com/search/adult-site-keyword-2.html Example 3: URL: www.example.com/search/adult-site-keyword-3.html Linked From: http://an-adult-image-hosting.com/search/adult-site-keyword-3.html How do I address this issue?
Intermediate & Advanced SEO | | rmehta10 -
Are HTML Sitemaps Still Effective With "Noindex, Follow"?
A site we're working on has hundreds of thousands of inventory pages that are generally "orphaned" pages. To reach them, you need to do a lot of faceting on the search results page. They appear in our XML sitemaps as well, but I'd still consider these orphan pages. To assist with crawling and indexation, we'd like to create HTML sitemaps to link to these pages. Due to the nature (and categorization) of these products, this would mean we'll be creating thousands of individual HTML sitemap pages, which we're hesitant to put into the index. Would the sitemaps still be effective if we add a noindex, follow meta tag? Does this indicate lower quality content in some way, or will it make no difference in how search engines will handle the links therein?
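(As a point of reference, the "noindex, follow" directive discussed above is normally expressed as a robots meta tag in the head of each sitemap page. The snippet below is a generic illustration with a made-up title, not markup from the site in question.)

```html
<!-- Hypothetical HTML sitemap page: kept out of the index itself,
     while the links it lists remain crawlable and followable. -->
<head>
  <meta name="robots" content="noindex, follow">
  <title>Inventory sitemap - section A, page 1 (example)</title>
</head>
```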
Intermediate & Advanced SEO | | mothner0 -
Using hreflang for international pages - is this how you do it?
My client is trying to achieve a global presence in select countries, and then track traffic from their international pages in Google Analytics. The content for the international pages is pretty much the same as for the USA pages, but the form and a few other details are different due to how product licensing has to be set up. I don't want to risk losing ranking for existing USA pages due to issues like duplicate content etc. What is the best way to approach this? This is my first foray into this and I've been scanning the Moz topics, but a number of the conversations are going over my head, so suggestions will need to be pretty simple 🙂 Is it a case of adding hreflang code to each page and creating different URLs for tracking? For example:
URL for USA: https://company.com/en-US/products/product-name/
URL for Canada: https://company.com/en-ca/products/product-name/
URL for German Language Content: https://company.com/de/products/product-name/
URL for rest of the world: https://company.com/en/products/product-name/
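A brief sketch of what that hreflang markup could look like for the example URLs above (each variant would carry the same block in its head and reference itself as well; mapping the rest-of-the-world URL to x-default is one common convention, shown here only as an illustration):

```html
<!-- Illustrative hreflang annotations for the example URLs; the exact
     values must match the final URL structure chosen for the client. -->
<link rel="alternate" hreflang="en-us" href="https://company.com/en-US/products/product-name/" />
<link rel="alternate" hreflang="en-ca" href="https://company.com/en-ca/products/product-name/" />
<link rel="alternate" hreflang="de" href="https://company.com/de/products/product-name/" />
<link rel="alternate" hreflang="x-default" href="https://company.com/en/products/product-name/" />
```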
Intermediate & Advanced SEO | | Caro-O1 -
Using a US CDN (Cloudflare) for a UK Site. Should I use a UK Based CDN as it says my server is based in USA
Hi All, We are a UK company with UK customers only and we use the Cloudflare CDN. Our site is hosted by a UK company with servers here, but from looking online and checking where my site is hosted, some tools are telling me the name of our UK hosting company and others are telling me my site is hosted in San Francisco (USA), where I presume Cloudflare is based. I know Cloudflare has a couple of servers in the UK it uses, but given that all my customers are UK based, I don't want this to affect rankings, as I thought it was a ranking benefit to be hosted in the country you are based in. Is there any issue with this and should I change, or is Google clever enough to know, so I shouldn't worry? Thanks, Pete
Intermediate & Advanced SEO | | PeteC120 -
De-indexing product "quick view" pages
Hi there, The e-commerce website I am working on seems to index all of the "quick view" pages (which normally occur as iframes on the category page) as their own unique pages, creating thousands of duplicate pages / overly-dynamic URLs. Each indexed "quick view" page has the following URL structure: www.mydomain.com/catalog/includes/inc_productquickview.jsp?prodId=89514&catgId=cat140142&KeepThis=true&TB_iframe=true&height=475&width=700 where the only thing that changes is the product ID and category number. Would using "disallow" in robots.txt be the best way to de-index all of these URLs? If so, could someone help me identify how to best structure this disallow statement? Would it be: Disallow: /catalog/includes/inc_productquickview.jsp?prodID=* Thanks for your help.
Intermediate & Advanced SEO | | FPD_NYC0 -
Meta Keywords: Should we use them or not?
I am working through our site and see that meta keywords are being used heavily and unnecessarily. Each of our info pages will have 2 or 3 keyword phrases built into them. Should we just duplicate the keyword phrases into the meta keywords field, should we put in additional keywords beyond those, or should we not use the field at all? Thoughts and opinions appreciated.
Intermediate & Advanced SEO | | Towelsrus1 -
Should I Use City Name in URL?
Having a website designed for a car dealership and deciding what attributes to use in the URL. Should I include the city name in the URL? Does that help for SEO purposes? Other ideas of what to research or try are appreciated too. Thanks 🙂
Intermediate & Advanced SEO | | kylesuss0