Followup question to rand(om) question: Would two different versions (mobile/desktop) on the same URL work well from an SEO perspective and provide a better overall end-user experience?
-
We read today's rand(om) question on responsive design. This is a topic we have been thinking about, and we ultimately landed on a different solution. In our opinion, the best user experience is two versions (desktop and mobile) that live on one URL.
For example, a non-mobile visitor to http://www.tripadvisor.com/ will see the desktop (non-responsive) version. However, if a mobile visitor (e.g., on iOS) visits the same URL, they will see a mobile version of the site, still on that same URL. There is no separate subdomain or URL; instead, the page changes dynamically based on the end user's user agent.
It looks like they accomplish this by using JavaScript to change the physical layout of the page to match the user's device. This is what we are considering doing for our site.
It seems this would simultaneously solve the problems mentioned in the rand(om) question and provide an even better user experience. Using this method, we can create a truly mobile version of the website that feels like an app. After all, mobile users and desktop users have very different expectations and behaviors when interacting with a webpage.
I'm interested to hear the negative side of developing two versions of the site and using javascript to serve the "right" version on the same URL. Thanks for your time!
-
Hey David,
TripAdvisor doesn't use JavaScript to decide whether you get the mobile version or not. The server detects your user agent and then sends you the proper version of the site (on the same URL, as you noted).
Remember, JavaScript executes on the client. So the JavaScript would have to be sent to your browser and execute before it could figure out what kind of device you were on and then render the rest of the page. That's basically how responsive design works, except that most commonly CSS media queries are used to determine the width of your viewport, and the page is then optimized for that width.
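If it helps, here's the media-query idea boiled down to plain selection logic (sketched in Python rather than CSS, since it's just the concept; the breakpoint values are illustrative, not anything TripAdvisor actually uses):

```python
# Illustrative sketch: the decision a CSS media query like
# @media (max-width: 600px) { ... } makes declaratively,
# written as a plain function. Breakpoints are made-up examples.
def layout_for_viewport(width_px: int) -> str:
    """Pick a layout name based on viewport width, the way
    responsive CSS picks a stylesheet rule set."""
    if width_px <= 600:
        return "phone"
    elif width_px <= 1024:
        return "tablet"
    return "desktop"
```

The key point for the question above: this decision happens on the client, after the browser already has the page, which is why it's a different animal from what TripAdvisor does.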
What TripAdvisor does is what Google calls a dynamic website (also known as "dynamic serving"). Basically, the browser identifies itself in the request headers, the server reads the user agent before the page is sent, and then it sends different source code to the browser, specific to that type of device/browser.
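A rough sketch of what that server-side decision can look like (the regex and template names are my own illustration, not TripAdvisor's actual code). Note the `Vary: User-Agent` response header, which Google recommends sending when the same URL serves different HTML by device:

```python
import re

# Hypothetical dynamic-serving sketch: inspect the User-Agent
# request header and pick a template, all on the same URL.
MOBILE_UA = re.compile(r"iPhone|iPod|Android.*Mobile|Windows Phone", re.I)

def choose_template(user_agent: str) -> str:
    """Return the template to render for this request's user agent."""
    return "mobile.html" if MOBILE_UA.search(user_agent or "") else "desktop.html"

def build_response_headers() -> dict:
    # Vary: User-Agent tells caches and crawlers that this URL's
    # response differs by user agent, so they don't serve the
    # desktop HTML to a phone (or to Googlebot's mobile crawler).
    return {"Vary": "User-Agent", "Content-Type": "text/html"}
```

So an iPhone and a desktop browser hitting the same URL get different source code, and the decision is made before anything is sent, not by JavaScript after the fact.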
You can read about the Google definitions I'm referencing here: https://developers.google.com/webmasters/smartphone-sites/details
You can read a bit more about the SEO implications of the three approaches in this thread: http://www.seomoz.org/q/how-does-a-responsive-site-kill-seo
I prefer to use dynamic websites when the user tasks are likely to be different on different devices. (E.g., TripAdvisor has a "Near Me Now" feature on smartphones, but not on the desktop.)
I prefer responsive design when my content and user tasks are going to be the same on all devices, and only the formatting/presentation is going to differ (such as reading a blog).
I prefer separate URLs when the Information Architecture is going to be dramatically different on different devices, and it's unlikely that a single user is going to share URLs across multiple devices. (Such as displaying a mobile boarding pass on a mobile phone, that I'd never offer on a desktop device, or scanning barcodes in a store).
In many cases, you can combine all three. E.g., detect different devices on the server to send different images and menus (dynamic), use media queries to optimize the content for the exact width of the current viewport (responsive), and have a separate m.URL for mobile-only pages, like that mobile boarding pass. The cool buzzword for combining responsive and dynamic is Responsive Design with Server Side Components, or RESS (I have no idea what happened to the W or C in that acronym).
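Here's a hedged sketch of that RESS combination: the server swaps the heavy components (hero image, menu) per device class, while the HTML it returns still ships responsive CSS for fine-grained layout. All the component and file names here are made up for illustration:

```python
# Hypothetical RESS sketch: coarse server-side component choices,
# with client-side responsive CSS filling in the gaps between buckets.
def page_context(device_class: str) -> dict:
    """Build the template context for a detected device class
    ("mobile" or "desktop" in this simplified example)."""
    is_mobile = device_class == "mobile"
    return {
        # Server-side: don't send a 1600px hero image to a phone.
        "hero_image": "hero-480w.jpg" if is_mobile else "hero-1600w.jpg",
        "menu": "hamburger" if is_mobile else "full-nav",
        # Client-side: the same responsive stylesheet ships either way,
        # so viewport widths between the server's coarse buckets
        # (tablets, resized windows) are still handled by media queries.
        "stylesheet": "responsive.css",
    }
```

The design choice here is that the server only makes the cheap, coarse decisions (payload size, navigation pattern), and everything width-sensitive stays responsive on the client.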
I hope that helps!
-Jason "Retailgeek" Goldberg.