Google's Stance on "Hidden" Content
-
Hi,
I'm aware Google doesn't mind helpful content that users can hide and reveal through interaction. I'm also aware that Google frowns upon hiding content from users for SEO purposes; we're not considering anything like that.
The issue is that we will be displaying only part of our content to the user at a time.
We'll load 3 results on each page initially. These first 3 results are static, meaning the same 3 results will display on every initial page load or refresh. However, we'll have a "Show Next 3" button which replaces the initial results with the next 3. All of this content will be preloaded in the page source, so Google will know about it.
I feel like Google shouldn't have an issue with this, since user action can cycle through all of the results. But I'm curious: is it a problem that the user action does NOT let them see all results on the page at once?
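For what it's worth, the mechanism described above can be sketched roughly like this (the function name and page size are assumptions for illustration, not the actual implementation; the key point is that the full result set lives in the page source and the button only changes which slice is shown):

```javascript
// All results are preloaded in the page source, so crawlers can see them.
// The "Show Next 3" button just changes which slice of 3 is visible.
const PAGE_SIZE = 3;

// Return the slice of results to display for a given click count,
// wrapping around so the button cycles through everything.
function visibleResults(results, page) {
  const pages = Math.ceil(results.length / PAGE_SIZE);
  const start = (page % pages) * PAGE_SIZE;
  return results.slice(start, start + PAGE_SIZE);
}
```

On each button click you'd increment `page` and re-render the visible slice; nothing is fetched and nothing new is added to the DOM that wasn't already in the source.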
I am leaning towards no, this doesn't matter, but would like some input if possible. Thanks a lot!
-
I don't think you're looking at a penalty situation, if that's what you're asking. It seems perfectly legitimate.
The more interesting question to me is how Google will "weigh" the hidden content in its algorithm. I suspect that anything hidden by JavaScript (or another method) will carry less weight than text in plain sight. You could try Google's new "Fetch and Render" tool in Webmaster Tools to see how Google views the page. Anything that doesn't display might not get as much consideration as plain text.
Of course, this is a lot of speculation. We don't really know for sure how Google treats text like this, but it's a pretty common situation.
-
Anyone else want to take a crack at this?
-
Hi Alrockn,
I'm not sure you understood the question. Thank you for reading.
-
Sounds similar to pagination issues, and the potential to create duplicate content in the eyes of Google, particularly if you're using a template. It's not a serious issue if this happens for one or two clicks, but if visitors are going to do this five or more times after the initial landing page, it might cause a problem with the meta tags.
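To illustrate the pagination point: if the slices did live at separate URLs (which isn't the case in the original question, so treat this as a hypothetical), the usual advice at the time of this thread was to give each page unique meta tags and mark the series with rel="next"/rel="prev" links. A rough sketch, with made-up URLs:

```html
<!-- Hypothetical head of /results?page=2 in a paginated series -->
<title>Results, page 2 of 5 | Example Site</title>
<meta name="description" content="Results 4-6 of 15 for this query.">
<link rel="prev" href="https://www.example.com/results?page=1">
<link rel="next" href="https://www.example.com/results?page=3">
```

Since the setup in the question keeps everything on one URL, the duplicate-content concern mostly arises only if the same template is reused across many near-identical pages.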