Followup question to rand(om) question: Would two different versions (mobile/desktop) on the same URL work well from an SEO perspective and provide a better overall end-user experience?
-
We read today's rand(om) question on responsive design. This is a topic we have been thinking about, and we ultimately landed on a different solution. In our opinion, the best user experience is two versions (desktop and mobile) that live on one URL.
For example, a non-mobile visitor to http://www.tripadvisor.com/ will see the desktop (non-responsive) version. However, if a mobile visitor (e.g., on iOS) visits the same URL, they will see a mobile version of the site, but it is still on the same URL. There is no separate subdomain or URL - instead, the page dynamically changes based on the end user's user agent.
It looks like they are accomplishing this by using JavaScript to change the physical layout of the page to match the user's device. This is what we are considering doing for our site.
It seems this would simultaneously solve the problems mentioned in the rand(om) question and provide an even better user experience. Using this method, we can create a truly mobile version of the website that feels similar to an app. After all, mobile users and desktop users have very different expectations and behaviors when interacting with a webpage.
I'm interested to hear the negative side of developing two versions of the site and using javascript to serve the "right" version on the same URL. Thanks for your time!
-
Hey David,
TripAdvisor doesn't use JavaScript to decide whether you get the mobile version or not. The server detects your user agent and then sends you the proper version of the site (on the same URL, as you noted).
Remember, JavaScript executes on the client. So the JavaScript would have to be sent to your browser and executed before it could figure out what kind of device you were on and then render the rest of the page. That's basically how responsive design works, except that most commonly CSS @media queries are used to determine the width of your viewport, and then the page is optimized for that width.
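The breakpoint logic a media query expresses (e.g. `@media (max-width: 600px)`) is just a width check made on the client. A sketch of that check, with a purely illustrative breakpoint value:

```python
# Illustrative analogy of a CSS breakpoint such as @media (max-width: 600px):
# the client measures its own viewport and applies the matching rule set.
BREAKPOINT_PX = 600  # assumed breakpoint, purely illustrative

def stylesheet_for(viewport_width_px: int) -> str:
    """Return which rule set applies, mimicking a max-width media query."""
    if viewport_width_px <= BREAKPOINT_PX:
        return "narrow-layout"   # single column, larger tap targets
    return "wide-layout"         # multi-column desktop layout
```

The key point for the question above: this decision happens after the page (and its CSS) has already been downloaded, unlike server-side detection.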
What TripAdvisor does is what Google calls a Dynamic website (dynamic serving). Basically, the browser sends its user agent with every request; the server reads it and then sends different source code to the browser that is specific to that type of device/browser.
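A minimal sketch of that server-side detection in Python. The user-agent substrings and template names here are illustrative assumptions, not TripAdvisor's actual logic; note also that Google's dynamic serving guidance recommends sending a `Vary: User-Agent` header so caches and crawlers know the HTML differs by device:

```python
# Minimal sketch of "dynamic serving": the server inspects the User-Agent
# request header and returns different HTML for the same URL.
# The hint substrings below are illustrative, not an exhaustive detection list.
MOBILE_HINTS = ("iphone", "ipod", "android", "mobile", "blackberry")

def pick_template(user_agent: str) -> str:
    """Return which template to render for this request."""
    ua = user_agent.lower()
    if any(hint in ua for hint in MOBILE_HINTS):
        return "mobile.html"
    return "desktop.html"

def response_headers() -> dict:
    # Vary: User-Agent tells caches and crawlers that the response
    # body depends on the requesting device.
    return {"Vary": "User-Agent"}
```

In a real application you would plug `pick_template` into your framework's request handler and use a maintained device-detection library rather than a hand-rolled substring list.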
You can read about the Google definitions I'm referencing here: https://developers.google.com/webmasters/smartphone-sites/details
You can read a bit more about the SEO implications of the three approaches in this thread: http://www.seomoz.org/q/how-does-a-responsive-site-kill-seo
I prefer to use Dynamic websites when the user tasks are likely to be different on different devices (e.g., TripAdvisor has a "Near Me Now" feature on smartphones, but not on the desktop).
I prefer Responsive Design when my content and user tasks are going to be the same on all devices, and only the formatting/presentation is going to be different (such as reading a blog).
I prefer separate URLs when the Information Architecture is going to be dramatically different on different devices, and it's unlikely that a single user is going to share URLs across multiple devices. (Such as displaying a mobile boarding pass on a mobile phone, that I'd never offer on a desktop device, or scanning barcodes in a store).
In many cases, you can combine all three. E.g., detect different devices on the server to send different images and menus (Dynamic), use @media queries to optimize content for the exact width of the current viewport (Responsive), and have a separate m. URL for mobile-only pages, like that mobile boarding pass. The cool buzzword for combining responsive and dynamic is Responsive Design with Server Side Components, or RESS (I have no idea what happened to the W or C in that acronym).
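A rough sketch of the RESS idea, assuming a hypothetical server that swaps heavy components per detected device class (Dynamic) while the rendered template still relies on @media queries for fine-grained layout (Responsive). All names and assets here are illustrative:

```python
# Hypothetical RESS sketch: the server chooses device-specific components,
# and the template it renders still uses CSS @media queries client-side.
# Component names and asset filenames are illustrative assumptions.
def components_for(device_class: str) -> dict:
    """Pick server-side components for a detected device class."""
    if device_class == "smartphone":
        return {
            "hero_image": "hero-480w.jpg",  # smaller asset, faster mobile load
            "menu": "tap-friendly-menu",
            "extras": ["near-me-now"],      # mobile-only task, per the example above
        }
    return {
        "hero_image": "hero-1920w.jpg",
        "menu": "full-desktop-menu",
        "extras": [],
    }
```

The server-side choice handles what responsive CSS alone cannot (e.g., not shipping a 1920px hero image to a phone), while the stylesheet handles exact viewport widths.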
I hope that helps!
-Jason "Retailgeek" Goldberg.