Moving content into tabs
-
Hi,
I'm kind of an SEO newbie, so please bear with me.
On one of the sites I'm working on, I got a request to move large blocks of content, currently just placed flat on the page, into tabs.
This makes sense. We tried it and it makes navigating through the information much easier for visitors.
My question is: Will Google consider this hiding information? It's not loaded dynamically. It's all there when the page loads, in the source, but not displayed until the visitor clicks the tab.
Will this cause SEO issues?
Thank you!
-
It works with AJAX (some JavaScript and some CSS). The content is rendered inside the page, but is only displayed (via JavaScript) when the visitor clicks a tab.
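For reference, a minimal sketch of the CSS/JavaScript approach described above - all of the tab content is present in the HTML source when the page loads, and the script only toggles which panel is displayed (the class and id names here are made up for illustration):

```html
<!-- All panel content is in the source; only visibility changes on click -->
<ul class="tabs">
  <li><a href="#panel-1">Overview</a></li>
  <li><a href="#panel-2">Details</a></li>
</ul>

<div id="panel-1" class="panel">All of this text is in the source...</div>
<div id="panel-2" class="panel" style="display:none">...and so is this.</div>

<script>
  document.querySelectorAll('.tabs a').forEach(function (link) {
    link.addEventListener('click', function (event) {
      event.preventDefault();
      // Hide every panel, then show the one this tab points at.
      document.querySelectorAll('.panel').forEach(function (panel) {
        panel.style.display = 'none';
      });
      document.querySelector(link.getAttribute('href')).style.display = 'block';
    });
  });
</script>
```

Because the text is in the initial HTML response, crawlers see it without executing the click handler.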
-
What are you using to accomplish this? CSS? HTML5? AJAX, or something else? All of these technologies have a search-engine-friendly way of doing it. It really depends on how you're doing it. It sounds like AJAX to me, but it could also be CSS... so, it depends.
Related Questions
-
I'm Pulling Hairs! - Duplicate Content Issue on 3 Sites
Hi, I'm an SEO intern trying to solve a duplicate content issue on three wine retailer sites. I have read the Moz blog posts and other helpful articles that were flooded with information on how to fix duplicate content. However, I have tried using canonical tags for duplicates and redirects for expiring pages on these sites, and it hasn't fixed the duplicate content problem. My Moz report indicated that we have thousands of duplicate content pages.

I understand that it's a common problem among e-commerce sites, and the way we create landing pages and apply dynamic search results pages kind of conflicts with our SEO progress. Sometimes we'll create landing pages with the same URLs as an older landing page that expired. Unfortunately, I can't get around this problem, since this is how customer marketing and recruitment manage their offers and landing pages. Would it be best to nofollow these expired pages or redirect them?

I also tried self-referencing canonical tags, and canonical tags that point to the higher-authority page, on search results pages; even though it worked for some pages on the site, it didn't work for a lot of the other search result pages. Is there something we can do to these search result pages that will let Google understand that they are original pages?

There are a lot of factors that I can't change, and I'm kind of concerned that the three sites won't rank as well and will also drive traffic that won't convert on the site. I understand that Google won't penalize your sites for duplicate content unless it's spammy. So if I can't fix these errors -- since the company I work for conducts business in a way that means we won't ever run out of duplicate content -- is it worth moving on to other SEO priorities like keyword research and on/off-page optimization? Or should we really concentrate on fixing these technical issues before doing anything else? I'm curious to know what you think. Thanks!
Algorithm Updates | drewstorys
-
Duplicate Content: Almost the same site on different domains
Hi, I own a couple of casting websites, for which I'm currently launching "local" copies all over the world. When I launch my website in a new country, the content is basically always the same, except that the language sometimes changes from country to country. The domains will vary, so the site name would be site.es for Spain, site.sg for Singapore, site.dk for Denmark and so on. The websites will also feature different jobs (castings) and different profiles on the search pages, BUT the more static pages share the same content (About us, The concept, FAQ, Create user and so on). So my questions are: Is this something that is bad for Google SEO? The sites are at the moment NOT linking to each other with language flags or anything - should I do this? Basically to tell Google that the business behind all these sites is somewhat big. Is there a way to inform Google that these sites should NOT be treated as duplicate content? (A canonical tag won't do, since I want the "same" content to be listed on the local Google sites.) Hope there are some experts here who can help. /Kasper
Algorithm Updates | KasperGJ
-
New Website Old Domain - Still Poor Rankings after 1 Year - Tagging & Content the culprit?
I've run a live wedding band in Boston for almost 30 years that used to rank very well in organic search. I was hit by the Panda updates in August of 2014, and rankings literally vanished. I hired an SEO company to rectify the situation and create a new WordPress website, which launched January 15, 2015. I kept my old domain: www.shineband.com. Rankings remained pretty much non-existent. I was then told that 10% of my links were bad. After lots of grunt work, I sent in a disavow request in early June via Google Webmaster Tools. It's now mid-October, and rankings have remained pretty much non-existent.

Without much experience, I got Moz Pro to help take control of my own SEO and identify some problems (over 60 pages of medium-priority issues: title tag character length and meta description). Some helpful reports by www.siteliner.com and www.feinternational.com both mentioned a duplicate content issue. I had old blog posts from a different domain (now 301 redirecting to the main site) migrated to my new website's internal blog, http://www.shineband.com/best-boston-wedding-band-blog/, as suggested by the SEO company I hired. It appears that by doing that, the older blog posts show as pages in the back end of WordPress with the poor meta and title issues, AS WELL AS probably creating a primary reason for the duplicate content issues (with links back to the site). Could this likely be viewed as spamming or an (unofficial) SEO penalty?

As SEO companies far and wide daily try to persuade me to hire them to fix my rankings, I can't say I trust much. My plan: put most of the old blog posts into the Trash via WordPress, rather than try to optimize each page (over 60) by adjusting tagging, titles and duplicate content. Nobody really reads a quick post from 2009... I believe this could be beneficial and that those pages are more hurtful than helpful. Is that a bad idea, not knowing if those pages carry much juice? I realize my domain authority is not great.
No grand expectations, but is this a good move? What would be my next step afterwards, some kind of resubmitting of the site? This has been painful, business has fallen, and I can't throw more dough at this. THANK YOU!
Algorithm Updates | Shineband
-
Google indexing site content that I did not wish to be indexed
Hi, is it pretty standard for Google to index content that you have not specifically asked them to index, i.e. provided them notification of a page's existence? I have just been alerted by 'Mention' about some new content that they have discovered. The page is on our site, yes, and maybe I should have set it to NOINDEX, but the page only went up a couple of days ago and I was making it live so that someone could look at it and see how the page was going to look in its final iteration. Normally we go through the usual process of notifying Google via GWMT, adding it to our sitemap.xml file, publishing it via our G+ stream and so on. Reviewing our Analytics, it looks like there has been no traffic to this page yet, and I know for a fact there are no links to this page. I am surprised at the speed of the indexation - is it an example of brand mention, where an actual link is no longer required? Cheers, David
Algorithm Updates | David-E-Carey
-
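For readers unfamiliar with it, the NOINDEX the poster mentions is a robots meta tag placed in the page's head section - a generic sketch:

```html
<!-- In the page's <head>: tells compliant crawlers not to index this page -->
<meta name="robots" content="noindex" />
```

It has to be in place before the page is first crawled to reliably keep it out of the index.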
Am I doing enough to get rid of duplicate content?
I'm in the middle of a massive cleanup effort of old duplicate content on my site, but trying to make sure I'm doing enough. My main concern now is a large group of landing pages. For example: http://www.boxerproperty.com/lease-office-space/office-space/dallas http://www.boxerproperty.com/lease-office-space/executive-suites/dallas http://www.boxerproperty.com/lease-office-space/medical-space/dallas And these are just the tip of the iceberg. For now, I've put canonical tags on each sub-page to direct to the main market page (the second two both point to the first, http://www.boxerproperty.com/lease-office-space/office-space/dallas for example). However this situation is in many other cities as well, and each has a main page like the first one above. For instance: http://www.boxerproperty.com/lease-office-space/office-space/atlanta http://www.boxerproperty.com/lease-office-space/office-space/chicago http://www.boxerproperty.com/lease-office-space/office-space/houston Obviously the previous SEO was pretty heavy-handed with all of these, but my question for now is should I even bother with canonical tags for all of the sub-pages to the main pages (medical-space or executive-suites to office-space), or is the presence of all these pages problematic in itself? In other words, should http://www.boxerproperty.com/lease-office-space/office-space/chicago and http://www.boxerproperty.com/lease-office-space/office-space/houston and all the others have canonical tags pointing to just one page, or should a lot of these simply be deleted? I'm continually finding more and more sub-pages that have used the same template, so I'm just not sure the best way to handle all of them. Looking back historically in Analytics, it appears many of these did drive significant organic traffic in the past, so I'm going to have a tough time justifying deleting a lot of them. Any advice?
Algorithm Updates | BoxerPropertyHouston
-
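For reference, the sub-page-to-main-page canonical setup this poster describes would look something like this in the head of one of the sub-pages (URLs taken from the question):

```html
<!-- In the <head> of /lease-office-space/executive-suites/dallas,
     pointing search engines at the main Dallas market page -->
<link rel="canonical" href="http://www.boxerproperty.com/lease-office-space/office-space/dallas" />
```

The canonical tag consolidates signals onto the target page; it does not remove the sub-page from the site.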
Why is there no compiled list of the different types of search results on Google, and what the content qualifications are to generate those results?
Seems to me that this list should exist out there somewhere, but I can't seem to find it. Am I just not as good of a Googler as I thought I was?
Algorithm Updates | Draftfcb
-
Rel="alternate" hreflang="x" or Unique Content?
Hi All, I have 3 sites: brand.com, brand.co.uk and brand.ca. They all have the same content with very, very minor changes. What's best practice: to use rel="alternate" hreflang="x", or to have unique content written for each of them? Just wondering, after Panda, Penguin and the rest of the zoo, what is the best way to run multinational sites and achieve top positions for all of them in their individual countries. If you think it would be better to have unique content for each of them, please let us know your reasons. Thanks!
Algorithm Updates | Tug-Agency
-
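For reference, the rel="alternate" hreflang="x" markup asked about above would look something like this in the head of each version of a page; the hreflang values here are illustrative, and each version lists all alternates, including itself:

```html
<!-- Repeated on every version of the page, with the same set of alternates -->
<link rel="alternate" hreflang="en-us" href="https://brand.com/" />
<link rel="alternate" hreflang="en-gb" href="https://brand.co.uk/" />
<link rel="alternate" hreflang="en-ca" href="https://brand.ca/" />
<link rel="alternate" hreflang="x-default" href="https://brand.com/" />
```

The x-default entry names the version to show users who match none of the listed locales.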
Content below the fold and Panda Update
Hi, I was at the LinkLove conference and I heard some worrying stories about the way content is formatted on a page being a factor in how eHow has avoided being slapped. It was the first time I had heard the expression "below the fold..." I am producing some very sexy SERP results, and other sexier metrics are up too, but I am concerned that thefurnituremarket.co.uk has a ton of images on the home page and the nice content is below all of them. Firstly, is this content "below the fold"? Secondly, I know the site is old, but do you think that when this Panda update hits the UK we will be penalised for the look of the site? I know there was talk yesterday at the conference of coming up with a tool to check this out... My gut says that this will be a factor, sooner rather than later, hence I am looking at Magento and how we can skin it to look nice and present products better. I would be really interested to know what exactly is "below the fold" on thefurnituremarket.co.uk, and some thoughts on the whole eHow formatting issue.
Algorithm Updates | robertrRSwalters