Hiding content or links in responsive design
-
Hi,
I've found a lot of information about responsive design and SEO, but it's mostly theory with no real experiments, and I'd like to get a clear answer from someone who has actually tested this.
Google says:
Sites that use responsive web design, i.e. sites that serve all devices on the same set of URLs, with each URL serving the same HTML to all devices and using just CSS to change how the page is rendered on the device
https://developers.google.com/webmasters/smartphone-sites/details
For usability reasons, you sometimes need to completely hide content or links (not accessible to the visitor at all) on your page at small resolutions (mobile) using CSS ("visibility:hidden" or "display:none").
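For example, something like this (just an illustration of the kind of rule I mean; the class name is made up):

    @media (max-width: 480px) {
      /* completely hide the sidebar promo block on small screens */
      .sidebar-promo {
        display: none;
      }
    }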
Is this counted as hidden content, and could it penalize your site?
What do you guys do when you create responsive design websites?
Thanks!
GaB
-
Hi,
Saijo and Bradley are right in saying that hiding elements on a smaller screen should not be an issue (as it's a correct implementation of responsive design). Bear in mind as well that there are separate Googlebot and Smartphone Googlebot crawlers, so as long as Googlebot sees what desktop users see and the Smartphone Googlebot (which uses an iPhone 5 user agent) sees what mobile users see, it shouldn't be a problem.
The only thing I would add:
If you are going to use display:none to prevent mobile users from seeing something, it's good to include a 'view full site' or 'view desktop site' option. In that case I would also question whether you actually need that content on the desktop site at all, because best practice is to provide the same content regardless of device.
If it's hidden but still accessible to the mobile user (in a collapsible div, for instance), there's no cloaking involved, so it shouldn't cause a problem.
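As a rough sketch of that pattern (the id and class names are just placeholders, not anything specific to your site), the content stays in the HTML and the visitor can still expand it:

    <button onclick="document.getElementById('more-info').classList.toggle('open')">
      Read more
    </button>
    <!-- the full text is still in the page source, just collapsed by default -->
    <div id="more-info" class="collapsible">
      Full description text goes here...
    </div>
    <style>
      .collapsible { display: none; }
      .collapsible.open { display: block; }
    </style>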
As a side note: the Vary HTTP header is really for a dynamically served website (that is, a single URL which checks user agent and then serves the desktop HTML to desktop devices and mobile HTML to mobile devices).
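If you ever do go the dynamic serving route, the header is normally set at the server level. On Apache, for instance, it would look something like this in .htaccess (assuming mod_headers is enabled; other servers have their own equivalents):

    <IfModule mod_headers.c>
      # tell caches and crawlers that the HTML returned for this URL varies by user agent
      Header append Vary User-Agent
    </IfModule>

For a purely responsive site like the one described here, you don't need it.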
Hope that helps!
-
The way I see it:
Google does not have a problem with the proper use of things like media queries. More info: https://developers.google.com/webmasters/smartphone-sites/details. They only have a problem with webmasters when hidden text is made available solely to search engines for SERP manipulation.
Read more into the "Vary HTTP header" section in the link above, and there's some info from Matt here: http://www.youtube.com/watch?v=va6qtaiZRHg&feature=player_detailpage#t=219
-
I understand what you are referring to about having to hide certain elements on smaller screens. Sometimes not everything fits or flows correctly.
When this happens, however, I try to hide design elements as opposed to text or links. I'm also OK with hiding images. If a block of text or a link seems out of place or doesn't flow properly, I will build a dropdown for it. I'm sure you've seen mobile sites with dropdown navigation menus.
I wouldn't leave it up to Google to interpret what you are doing. Don't hide any links.