Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Does collapsing content impact Google SEO signals?
-
Recently I have been promoting custom long-form content development for major brand clients. For UX reasons we collapse the content so only 2-3 sentences of the first paragraph are visible. However, there is a "read more" link that expands the entire content piece.
I have believed that the search bots would have no problem crawling, indexing, and applying a positive SEO signal to this content. However, I'm starting to wonder. Is there any evidence that the Google search algorithm could possibly discount or even ignore collapsed content?
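For context, the pattern we use is roughly the following (a minimal sketch, not our production code; the element IDs and class names are hypothetical). The full content is in the server-rendered HTML and is only hidden with CSS until the link is clicked:

```typescript
// Minimal sketch of the "read more" pattern described above. The full text is
// server-rendered into the page and hidden with a CSS class; the button only
// toggles that class. IDs and class names are hypothetical.
const copyBlock = document.getElementById("long-form-copy");
const readMoreBtn = document.getElementById("long-form-toggle");

if (copyBlock && readMoreBtn) {
  readMoreBtn.addEventListener("click", () => {
    const collapsed = copyBlock.classList.toggle("is-collapsed");
    readMoreBtn.textContent = collapsed ? "Read more" : "Read less";
  });
}
```
-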
Thanks EGOL. Still looking for additional evidence about this.
-
Well... yup. I know many SEOs who think that the collapsible area is just not important enough for Google to consider it.
Good luck.
-
If I see a study, I'll post a link here.
-
Yep, I completely agree with your response. Unfortunately I'm in a position where I manage major enterprise accounts with multiple stakeholders (including some people who are not educated in SEO). Every major change we propose needs to be documented, cited, and reviewed. When making an argument for content expansion I would need a thorough research example (a Moz study, documentation on Search Engine Land, etc.).
Anyway, thank you for taking the time to share your feedback and advice on this thread. This is not the answer I wanted to hear (i.e. that Google doesn't respect collapsed content), but it's very likely accurate. This is a serious SEO issue that needs to be addressed.
-
Are there any case studies about this issue?
Just the one that I published above. The conclusion is... be prepared to sacrifice 80% of your traffic if you hide your valuable content behind a preview.
I would be asking the UX people to furnish studies that hiding content produces better sales.
We have lots of people raving about the abundance of content on our site, the detailed product descriptions, how much help we give them to decide what to purchase. All of this content is why we dominate the SERPs in our niche and that, in many people's eyes, is a sign of credibility. Lots of people say... "we bought from you because your website is so helpful". However, if we didn't have all of this content in the open these same people would have never even found us.
Nobody has to read this stuff. I would rather land on a website and see my options than land on a website and assume that there was no information because I didn't notice that the links to open it were in a faded microfont because the UX guys wanted things to be tidy. I believe that it is a bigger sin to have fantastic content behind a click-through than it is to put valuable information in the open and allow people the opportunity to read it.
Putting our content out in the open is what makes our reputation.
I sure am glad that I am the boss here. I can make the decisions and be paid on the basis of my performance.
-
We are applying 500 to 800+ word custom content blocks to our client landing pages (local landing pages) that show a preview of the first paragraph and a "read more" expansion link. We know that most website visitors only care about the location info on these particular landing pages. We also know that our client UX teams would certainly not approve an entirely visible content block on these pages.
Are there any case studies about this issue? I'm trying to find a bona fide research project to help back up our argument.
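One sanity check we can at least run ourselves (a sketch only, assuming a runtime with a global fetch such as Node 18+; the URL and sample phrase are placeholders, not real client data) is to confirm that a sentence from the collapsed block is present in the raw HTML the server returns, i.e. the content is not injected only after the "read more" click:

```typescript
// Sketch: fetch the raw HTML a crawler receives before any JavaScript runs and
// check that a sentence from the collapsed block is already present in it.
// The URL and phrase are placeholders.
async function collapsedTextIsInRawHtml(url: string, phrase: string): Promise<boolean> {
  const response = await fetch(url, { headers: { "User-Agent": "content-check/1.0" } });
  const html = await response.text();
  return html.toLowerCase().includes(phrase.toLowerCase());
}

collapsedTextIsInRawHtml(
  "https://www.example.com/locations/springfield",
  "a sentence copied from deep inside the collapsed block"
).then((found) => console.log(found ? "Found in raw HTML" : "Missing from raw HTML"));
```
-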
It was similar to a Q&A. There was a single-sentence question and a paragraph of hidden answer. This page had a LOT of questions and a tremendous amount of keywords in the hidden content. Thousands of words.
The long-tail traffic tanked. Then, when we opened the content again, the traffic took months to start coming back. The main keywords held in the SERPs. The long tail accounted for the 80% loss.
-
How collapsed was your content? Did you hide the entire block, or only show a few sentences? I'm trying to find a research article about this. This is a MAJOR issue to consider for our SEO campaigns.
-
Yes, that is a very legitimate concern of mine. We have invested significant resources into custom long-form content for our clients, and we are very concerned that this was all for nothing... or possibly worse (the content being discounted).
-
Recently I had a related issue with a top-ranking website for very competitive queries.
Unfortunately the product department made some changes to the content (UI only) without consulting the SEO department. The only change worth mentioning was moving the first two paragraphs into a collapsible DIV showing only the first 3 lines + a "read more" button. The text in the collapsible div was crawlable and visible to search engines. (It's also worth mentioning that these paragrap
But the site lost its major keyword positions 2-3 days later. Of course we reverted the changes, but even now, two months later, the keywords are only very slowly moving back to their "original" positions.
For years I believed what Google stated: that you can use collapsible content if you are not trying to inject keywords or inflate the amount of content, etc. Not anymore.
I believe that by placing content under a collapsible div element, we are actually signaling to Google that this piece of content is not that important (that's why it is hidden, right? Otherwise it would be in plain sight). So why should we expect Google to treat this content as a major part of our content's ranking weight?
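If stakeholders still insist on some collapsing, one compromise worth testing (purely a sketch on my part, not something I can claim Google endorses) is to serve the block fully expanded in the HTML and apply the collapse only with client-side script after the page loads, so the no-JavaScript version of the page is completely open:

```typescript
// Hedged sketch of a progressive-enhancement collapse: the HTML ships fully expanded,
// and the collapse plus the "Read more" button are added only by this client-side
// script. The data attribute and class name are hypothetical.
document.querySelectorAll<HTMLElement>("[data-collapsible]").forEach((block) => {
  block.classList.add("is-collapsed"); // a CSS rule (not shown) clamps the visible height

  const toggle = document.createElement("button");
  toggle.type = "button";
  toggle.textContent = "Read more";
  toggle.addEventListener("click", () => {
    const collapsed = block.classList.toggle("is-collapsed");
    toggle.textContent = collapsed ? "Read more" : "Read less";
  });

  block.insertAdjacentElement("afterend", toggle);
});
```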
-
About two years ago I had collapsed content on some important pages. Their long-tail traffic went into a steady slide, but the head traffic held. I take this as a sign that the collapsed content was being discounted, removing or lowering its ability to count in the rankings for long-tail queries.
I expanded the pages, making all content visible. A few months later, long-tail traffic started to slowly rise. It took many months to climb back to previous levels.
Since then, every word of my content has been out in the open.
Related Questions
-
How will changing my website's page content affect SEO?
Our company is looking to update the content on our existing web pages and I am curious what the best way to roll out these changes is in order to maintain good SEO rankings for certain pages. The infrastructure of the site will not be modified except for maybe adding a couple of new pages, and existing domains will stay the same. If the domains are staying the same, does it really matter if I just update one page every week or so, versus updating them all at once? Just looking for some insight into how freshening up the content on the back-end pages could potentially hurt SEO rankings initially. Thanks!
Intermediate & Advanced SEO | | Bankable1 -
Directory with duplicate content? What to do?
Moz keeps finding loads of pages with duplicate content on my website. The problem is it's a directory of pages for different locations. E.g. if we were a clothes shop we would be listing our locations:
www.sitename.com/locations/london
www.sitename.com/locations/rome
www.sitename.com/locations/germany
The content on these pages is all the same, except for an embedded Google map that shows the location of the place. The problem is that Google thinks all these pages are duplicate content. Should I set a canonical link on every single page saying that www.sitename.com/locations/london is the main page? I don't know if I can use canonical links, because the page content isn't identical due to the embedded map. Help would be appreciated. Thanks.
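For reference on the canonical mechanics, the tag itself is just one link element in the head of each page; a minimal sketch follows (whether each location should self-canonicalise or point at a single "main" page depends on how distinct the pages really are):

```typescript
// Sketch: build the rel="canonical" tag for a location page. The domain matches the
// example paths in the question above; which URL to canonicalise to is the open question.
function canonicalTag(path: string): string {
  const href = new URL(path, "https://www.sitename.com").toString();
  return `<link rel="canonical" href="${href}" />`;
}

console.log(canonicalTag("/locations/london"));
// -> <link rel="canonical" href="https://www.sitename.com/locations/london" />
```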
Intermediate & Advanced SEO | | nchlondon0 -
Having 2 brands with the same content - will this work from an SEO perspective
Hi All, I would love it if someone could help and provide some insights on this. We're a financial institution and have a set of products that we offer. We have recently joined with another brand and will now be offering all our products to their customers. What we are looking to do is have 1 site that masks the content for both sites so it appears as if there are 2 separate brands with different content - in fact we have a main site and then a sister brand that offers the same products. Is there any way to do this so when someone searches for Credit Card from Brand A it is indexed under Brand A, and the same when someone searches for Credit Card from Brand B it is indexed under Brand B? The one thing is we would not want to rel:can the pages nor be penalised by Google's latest PR algorithm. Hope someone can help! Thanks Dave
Intermediate & Advanced SEO | | CFCU1 -
How do search engines look at collapsed content on mobile when on desktop it is open by default?
Hello everyone! To have a mobile-friendly UX we chose to collapse some of the page content. On desktop it is open by default and the user can see the whole content. Do the search engines see the content even if it's collapsed? And can collapsing it on mobile only hurt us in the SERP rankings?
Intermediate & Advanced SEO | | Roi_Bar
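One common way to get that desktop-open, mobile-collapsed behaviour (a sketch only; the selector and breakpoint are assumptions, and the full text stays in the server-rendered HTML either way) is to toggle the collapse class from a media-query listener:

```typescript
// Sketch: collapse the block only on narrow viewports; on desktop it stays fully open.
// The full text is in the server-rendered HTML in both cases. The selector and the
// 767px breakpoint are hypothetical.
const expandable = document.querySelector<HTMLElement>(".expandable-copy");
const mobileQuery = window.matchMedia("(max-width: 767px)");

function applyMobileCollapse(isMobile: boolean): void {
  expandable?.classList.toggle("is-collapsed", isMobile);
}

applyMobileCollapse(mobileQuery.matches);
mobileQuery.addEventListener("change", (event) => applyMobileCollapse(event.matches));
```
-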
Medical / Health Content Authority - Content Mix Question
Greetings, I have an interesting challenge for you. Well, I suppose "interesting" is an understatement, but here goes. Our company is a women's health site. However, over the years our content mix has grown to nearly 50/50 between unique health / medical content and general lifestyle/DIY/well being content (non-health). Basically, there is a "great divide" between health and non-health content. As you can imagine, this has put a serious damper on gaining ground with our medical / health organic traffic. It's my understanding that Google does not see us as an authority site with regard to medical / health content since we "have two faces" in the eyes of Google. My recommendation is to create a new domain and separate the content entirely so that one domain is focused exclusively on health / medical while the other focuses on general lifestyle/DIY/well being. Because health / medical pages undergo an additional level of scrutiny per Google - YMYL pages - it seems to me the only way to make serious ground in this hyper-competitive vertical is to be laser targeted with our health/medical content. I see no other way. Am I thinking clearly here, or have I totally gone insane? Thanks in advance for any reply. Kind regards, Eric
Intermediate & Advanced SEO | | Eric_Lifescript0 -
SEO Impact of High Volume Vertical and Horizontal Internal Linking
Hello Everyone - I maintain a site with over a million distinct pages of content. Each piece of content can be thought of as a node in a graph database, or an entity. While there is a bit of natural hierarchy, every single entity can be related to one or more other entities. The conceptual structure of the entities is like so:
Agency - A top-level business unit (~100 pages/urls)
Office - A lower-level business unit, part of an Agency (~5,000 pages/urls)
Person - Someone who works in one or more Offices (~80,000 pages/urls)
Project - A thing one or more People are managing (~750,000 pages/urls)
Vendor - A company that is working on one or more Projects (~250,000 pages/urls)
Category - A descriptive entity, defining one or more Projects (~1,000 pages/urls)
Each of these six entities has a unique URL and content. For each page/url, there are internal links to each of the related entity pages. For example, if a user is looking at a Project page/url, there will be internal links to one or more Agencies, Offices, People, Vendors, and Categories. Also, a Project will have links to similar Projects. The same theory holds true for all other entities as well: People pages link to their related Agencies, Offices, Projects, Vendors, etc. If you start to do the math, there are tons of internal links leading to pages with tons of internal links leading to pages with tons of internal links. While our users enjoy the ability to navigate this world according to these relationships, I am curious if we should force a stricter hierarchy for SEO purposes. Essentially, does it make sense to "nofollow" all of the horizontal internal links for a given entity page/url? For search engine indexing purposes, we have legit sitemaps that give a simple vertical hierarchy... but I am curious if all of this internal linking should be hidden via nofollow...? Thanks in advance!
Intermediate & Advanced SEO | | jhariani2 -
Is tabbed content bad for SEO?
I work for a theater show listings and ticketing website. On our show listings pages (e.g. http://www.theatermania.com/broadway/this-is-our-youth_302998/) we split our content into separate tabs (overview, pricing and show dates, cast, and video). Are we shooting ourselves in the foot by separating the content? Would we be better served by keeping it all on a single page? Thanks so much!
Intermediate & Advanced SEO | | TheaterMania0 -
International SEO - cannibalisation and duplicate content
Hello all, I look after (in house) 3 domains for one niche travel business across three TLDs: .com, .com.au and .co.uk, and a fourth domain on a .co.nz TLD which was recently removed from Google's index.
Symptoms: For the past 12 months we have been experiencing cannibalisation in the SERPs (namely .com.au being rendered in .com) and Panda-related ranking devaluations between our .com site and .com.au site. Around 12 months ago the .com TLD was hit hard (80% drop in target KWs) by Panda (probably) and we began to action the below changes. Around 6 weeks ago our .com TLD saw big overnight increases in rankings (to date a 70% averaged increase). However, by almost the same percentage the .com TLD gained, we suffered significant drops in our .com.au rankings. Basically Google seemed to switch its attention from the .com TLD to the .com.au TLD. Note: each TLD is over 6 years old, we've never proactively gone after links (Penguin) and have always aimed for quality in an often spammy industry.
**Have done:**
Adding hreflang markup to all pages on all domains
Each TLD uses local vernacular, e.g. for the .com site it is American English
Each TLD has pricing in the regional currency
Each TLD has details of the respective local offices, the copy references the location, and we have significant press coverage in each country, like The Guardian for our .co.uk site and the Sydney Morning Herald for our Australian site
Targeting each site to its respective market in WMT
Each TLD's core pages (within 3 clicks of the primary nav) are 100% unique
We're continuing to re-write and publish unique content to each TLD on a weekly basis
As the .co.nz site drove so little traffic, rather than re-writing we added noindex, and the TLD has almost completely disappeared (16% of pages remain) from the SERPs
XML sitemaps
Google+ profile for each TLD
**Have not done:**
Hosted each TLD on a local server
Around 600 pages per TLD are duplicated across all TLDs (roughly 50% of all content). These are way down the IA but still duplicated.
Images/video sourced from local servers
Added address and contact details using SCHEMA markup
Any help, advice or just validation on this subject would be appreciated! Kian
Intermediate & Advanced SEO | | team_tic1
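For reference, the hreflang markup mentioned above boils down to a set of alternate link tags on each page; a minimal sketch (the example.com hostnames and the en-US x-default choice are assumptions, not the poster's actual setup):

```typescript
// Sketch: emit hreflang alternates for one URL path across regional TLDs like those
// described in the question. Hostnames and the x-default choice are placeholders.
const regionalHosts: Record<string, string> = {
  "en-us": "https://www.example.com",
  "en-au": "https://www.example.com.au",
  "en-gb": "https://www.example.co.uk",
};

function hreflangTags(path: string): string[] {
  const tags = Object.entries(regionalHosts).map(
    ([lang, host]) => `<link rel="alternate" hreflang="${lang}" href="${host}${path}" />`
  );
  tags.push(
    `<link rel="alternate" hreflang="x-default" href="${regionalHosts["en-us"]}${path}" />`
  );
  return tags;
}

hreflangTags("/destinations/peru").forEach((tag) => console.log(tag));
```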