Mobile Redirect - Cloaking/Sneaky?
-
A question, since Google is somewhat vague about what it considers "equivalent" mobile content. This is the hand we've been dealt due to budget: no m-dot site, and a responsive/dynamic-serving build is on the roadmap but still a couple of quarters away. For now, here's the situation.
We have two sets of content and experiences, one for desktop and one for mobile. The problem is that the desktop content does not equal the mobile content. The layout, user experience, images, and copy differ across the two versions; they're not dramatically different, but they're not identical either. In many cases, no mobile equivalent exists at all.
Dev wants to redirect visitors who land on the desktop version from mobile search to the equivalent mobile experience when it exists; when it doesn't, they want to redirect to the mobile homepage, which really isn't a homepage so much as an unfiltered view of the content. (Yes, we have pushState in place for the mobile version, etc.)
My concern is that Google will treat this as cloaking: maybe not in cases where there's a near-equivalent piece of content, but definitely when we're redirecting to the "homepage." Not to mention this isn't a great user experience and will hurt conversion/engagement metrics, which are likely factors Google's algorithm considers.
What does the Moz community say about this? Cloaking or not, and why?
Thanks!
-
Thomas
Great info above. Quick follow-up question for you: I have always wondered why Google uses "640px" in the rel="alternate" media attribute.
Many people have been asking lately whether 640px is an outdated example or a required size (phones are larger nowadays). The websites I manage are non-responsive and located in a /mobile/ folder, such as: http://www.example.com/mobile/page1
Our mobile size cutoff is actually 1023px. Should we be using 640px or 1023px in the rel="alternate" tag?
Thank you! -
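For reference, the annotation pair for a separate mobile URL looks like the sketch below. The URLs are placeholders matching the /mobile/ folder structure described above, and the 640px value is the media-query example commonly shown in Google's separate-URLs documentation rather than a hard requirement; the media attribute is an ordinary CSS media query, so a different breakpoint such as 1023px is valid if it matches when your site actually serves the mobile version:

```html
<!-- On the desktop page, e.g. http://www.example.com/page1 -->
<link rel="alternate"
      media="only screen and (max-width: 640px)"
      href="http://www.example.com/mobile/page1">

<!-- On the mobile page, pointing back to the desktop URL -->
<link rel="canonical" href="http://www.example.com/page1">
```

The key is that the two tags are bidirectional: the desktop page declares its mobile alternate, and the mobile page canonicalizes back to the desktop URL.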
Thanks Thomas. I've pushed back and said no; eliminating the blanket redirect was part of my original SEO requirements, but there's always pushback, and I wanted more ammo in my back pocket.
We'll definitely be implementing the Vary HTTP header, etc. when we do have a mobile version. I did not know about the apex record / CNAME / ANAME approach; I appreciate the tip.
-
"Dev wants to redirect visitors who find the desktop version in mobile search to the equivalent mobile experience, when it exists, when it doesn't they want to redirect to the mobile homepage - which really isn't a homepage it's an unfiltered view of the content. Yeah we have push state in place for the mobile version etc."
Tell your developer absolutely not. Create proper mobile versions of the pages first, then redirect to them correctly; if he does what's quoted above, your site will lose visitors and Google will be less than happy.
I strongly suggest that you tell him no. The only thing he has right is redirecting to the mobile version when it exists. When it doesn't exist, do not redirect to the homepage or to any other page unless it is the mobile version of that original page.
If visitors are finding the desktop page via mobile search, Google has already made its judgment on its mobile-friendliness. For any URLs that are up for debate, run them through this: https://varvy.com/mobile/ and you will have your answer, brother: mobile-friendly or not.
If there is no valid mobile version, you should not force the mobile version on visitors. It will not benefit you; in fact it will hurt you.
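The rule above can be sketched as a simple lookup: redirect a mobile visitor only when a true one-to-one mobile equivalent of the requested URL exists, and otherwise serve the desktop page unchanged. This is an illustrative sketch only; the mapping, paths, and function names are hypothetical, not from any real site.

```python
# Hypothetical mapping of desktop paths to their true mobile equivalents.
# Pages with no entry here have no mobile version.
MOBILE_EQUIVALENTS = {
    "/products/widget": "/mobile/products/widget",
    "/about": "/mobile/about",
}

def redirect_target(path, is_mobile_user_agent):
    """Return the mobile URL to 301/302 to, or None to serve the page as-is.

    Crucially, there is no fallback to the mobile homepage: if no
    equivalent exists, the visitor stays on the desktop page.
    """
    if not is_mobile_user_agent:
        return None
    return MOBILE_EQUIVALENTS.get(path)
```

For example, a request for a page with no mobile equivalent returns None, meaning "do not redirect," rather than sending the visitor to an unrelated listing page.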
-
Once you do have a 100% mobile-friendly version, you can use the tactics below.
Several DNS-level methods can do the trick as well: apex records, ANAME records, or CNAME flattening, whatever you want to call it. See below.
Cname flattening
https://support.cloudflare.com/hc/en-us/articles/200168336-About-CloudFlare-Mobile-Redirect
"All mobile traffic to example.com (the root/zone apex) and www.example.com is redirected to the mobile-optimized home page. Those records (root and www) must have CloudFlare's performance service enabled ('orange cloud' in the DNS Settings) for the redirect to be active."
Also add the Vary: User-Agent HTTP header whenever the same URL serves different desktop and mobile pages: https://varvy.com/mobile/vary-user-agent.html
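A minimal sketch of dynamic serving with the Vary: User-Agent header, so caches and crawlers know the response body differs by user agent. This is an assumed, framework-free illustration; the token list and function name are hypothetical, and a production setup would normally set this header in the web server or framework configuration.

```python
# Substrings (lowercased) that suggest a mobile user agent; illustrative only.
MOBILE_TOKENS = ("iphone", "android", "mobile")

def build_response(user_agent):
    """Return (body, headers) for a URL served dynamically by device type."""
    is_mobile = any(token in user_agent.lower() for token in MOBILE_TOKENS)
    body = "<mobile page>" if is_mobile else "<desktop page>"
    headers = [
        ("Content-Type", "text/html"),
        # Tell caches and crawlers that the response varies by user agent.
        ("Vary", "User-Agent"),
    ]
    return body, headers
```

Without the Vary header, an intermediate cache could serve the desktop body to mobile visitors (or vice versa), and crawlers may not discover both variants.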