Will Google Penalize Content put in a Div with a Scrollbar?
-
I noticed Moosejaw was adding quite a bit of content to the bottom of category pages via a div with its own scrollbar. Could a site be penalized by Google for this technique?
Example: http://www.moosejaw.com/moosejaw/shop/search_Patagonia-Clothing____
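For reference, a minimal sketch of the pattern in question, with made-up class names and copy (not Moosejaw's actual markup):

    <!-- category copy confined to a short, scrollable box -->
    <div class="category-copy" style="height: 150px; overflow: auto;">
      <p>Several paragraphs of keyword-rich copy about the category...</p>
      <p>More copy that a visitor only sees by scrolling inside the box...</p>
    </div>

The text is still rendered in the page source and is reachable by the visitor; it just sits in a fixed-height box with its own scrollbar.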
-
I see this question was answered years ago, but how relevant is this issue today?
I just took on a client's website, and he wants to add SEO-optimized content in a scrollable section at the bottom of the page. I don't know whether that counts as spam or not. Can you please advise?
I'm eager to get a proper answer.
website is: www (dot) zdhsales (dot) com
-
I've actually wondered the same thing before. To the best of my knowledge, I've never heard anyone cite overflow: auto; as a negative signal, compared to the amount of press display: none; and text-indent: -9999px; get. It could very well be abused just as badly, though. The only abuse check I can think of would be to weigh the amount of text in the div against what a practical min-height for that div should be, but that seems a bit excessive.
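For comparison, a rough sketch of the distinction being drawn here, with hypothetical selectors:

    /* scrollable box: text is rendered and readable, just confined to a short area */
    .category-copy { height: 150px; overflow: auto; }

    /* classic hidden-text patterns that attract far more scrutiny */
    .seo-copy-none      { display: none; }          /* never rendered to the visitor */
    .seo-copy-offscreen { text-indent: -9999px; }   /* rendered, but pushed out of view */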
I agree with Steven. It's come to a point where these CSS techniques have very legitimate uses and probably shouldn't be penalized. Plus, there are plenty of other ways to accomplish the same thing, whether it's document tree manipulation or any other kind of rendering of a page after the crawlable URL has been loaded. So at what point is it worth fighting such a thing?
edit: on a side note, what's the deal with those crazy underscores at the end of the URL? yuck.
-
Does Google actually still penalise overflow: hidden and display: none, or just off-screen placement such as left: -9999px? If they do, it's something I'm sure will change, as it's commonly used for "div switching" through navigational menus and tabs (for display: none at least).
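As a rough illustration of the kind of legitimate "div switching" referred to above, here is a CSS-only tab pattern (the ids and classes are made up) where display: none simply hides the inactive panel until the visitor selects it:

    <style>
      .panel { display: none; }
      #tab-details:checked ~ .panel-details,
      #tab-reviews:checked ~ .panel-reviews { display: block; }
    </style>
    <input type="radio" name="tabs" id="tab-details" checked>
    <label for="tab-details">Details</label>
    <input type="radio" name="tabs" id="tab-reviews">
    <label for="tab-reviews">Reviews</label>
    <div class="panel panel-details">Product details visible by default...</div>
    <div class="panel panel-reviews">Reviews shown when the visitor switches tabs...</div>

All of the content is available to the visitor with a single click, which is why blanket-penalising display: none would be problematic.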
-
Thank you for the response, Ryan. Although the site is not outwardly "hiding" the copy, from a usability standpoint this method does not seem to carry much, if any, value for the person visiting the page. I figured Google would see this as a lame attempt at search engine bait and frown upon the practice.
-
To the best of my knowledge this has no impact on SEO. Googlebot doesn't like it when you hide content, but that only applies to overflow:hidden and display:none as far as I know.
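To make that distinction concrete, a small hypothetical example: overflow: hidden with a constrained height clips content so the visitor can never reach it, while overflow: auto, as on the page in question, just adds a scrollbar:

    /* clipped: the extra text exists in the source but is unreachable for the visitor */
    .clipped    { height: 0; overflow: hidden; }

    /* scrollable: all of the text is readable, the box just scrolls */
    .scrollable { height: 150px; overflow: auto; }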