Will a CSS Overflow Scroll for content affect SEO rankings?
-
If I use CSS overflow scroll for copy, will my SEO rankings be affected? Will Google still be able to index the copy accurately, and will keywords in the portion of the copy that sits below the scroll be recognized by Google?
-
"Hidden" means the text will never be shown to an end user under any circumstances. It could be in a div with "display:none" as its style, or white text on a white background.
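To illustrate the distinction, here is a minimal sketch (the markup and inline styles are illustrative examples, not taken from anyone's actual site):

```html
<!-- Truly hidden: a user can never see this text, so search engines
     may ignore or devalue it -->
<div style="display: none">This text is hidden from everyone.</div>
<p style="color: #fff; background: #fff">
  White text on a white background is also treated as hidden.
</p>

<!-- Scrollable overflow: all of the text is in the page source and a
     user can scroll to read every word, so it is not "hidden" in the
     sense described above -->
<div style="height: 150px; overflow-y: scroll">
  Long copy goes here. Every word is present in the HTML and reachable
  by the reader, even though only part of it shows at any one time.
</div>
```

The key point is that with overflow scroll the full text is still in the delivered HTML; only the viewport onto it is constrained.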
-
Thanks!
-
OK, so here is the deal: I have spoken with Google's John Mueller about this multiple times.
Your content will indeed be indexed. However, Googlebot is not as basic as many people think. Several things happen when the bot comes to look at your site, and one of them is related to the Hummingbird algorithm: the bot takes a snapshot of what is actually visible on the page and gives THAT content more preference in search. It also looks at what is above the fold for visitors, so it is advisable not to have a large logo at the top, and to keep some content visible without scrolling.
So just because some content is not immediately visible does not mean it will not be indexed, but it may be given far less weight than text that is visible right away.
I hope that answers your question!
-
Thanks for the response, Highland! Could you clarify "hidden"? Thanks again!
-
As long as the text isn't hidden, it will be indexed properly. It doesn't matter how you display it on the screen.