How useful is a mobile version of your site (for SEO's sake)?
-
We're investigating a mobile version of our e-commerce site. Is it worth the investment regarding search engine optimization, or is this something that wouldn't have a big effect?
-
I doubt that phone users are serious buyers. I would make sure you have a lot of other things in place before worrying too much about making money from phone users. If you have the 98% under control, then you can spend time and effort chasing the remaining 2%.
-
Hi. I posted an answer a few days back that might help:
Google serves up the same results to smartphones and desktop computers. What they recommend is using the same site and using the stylesheet to control the mobile display; in other words, not making a separate site for mobile. Here is a snippet from a Google Q&A.
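To illustrate the "same site, stylesheet controls the display" approach, here is a minimal sketch of one page adapting its layout with a CSS media query. The selector names and breakpoint widths are hypothetical examples, not taken from any particular site.

```html
<!-- One URL serves all devices; CSS decides how the layout renders.
     ".product-grid" and the 480px breakpoint are placeholder choices. -->
<head>
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <style>
    .product-grid { width: 960px; margin: 0 auto; }
    /* On narrow screens (typical smartphones), collapse to full width */
    @media screen and (max-width: 480px) {
      .product-grid { width: 100%; }
    }
  </style>
</head>
```

The design point is that there is nothing separate to canonicalize or redirect: desktop and mobile visitors get the same HTML at the same URL.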
John Mueller - @Paul If you have "smartphone" content (which we see as normal web-content, as it's generally a normal HTML page, just tweaked in layout for smaller displays) you can use the rel=canonical to point to your desktop version. This helps us to focus on the desktop version for web-search. When users visit that desktop version with a smartphone, you can redirect them to the mobile version. This works regardless of the URL structure, so you don't need to use subdomains / subdirectories for smartphone-mobile sites. Even better however is to use the same URLs and to show the appropriate version of the content without a redirect :). Here is the entire article where I found the snippet.
The other option would be to make the mobile pages and canonicalize them back to the corresponding main-site pages. That way you don't have duplicate content, and more SEO juice flows to the main site.
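If you do go the separate-mobile-URLs route, the canonical tag John Mueller describes might look something like this on the mobile page. The domain and path here are placeholders, not a recommendation for any specific URL structure.

```html
<!-- On the mobile version of a page, e.g. http://m.example.com/widgets,
     point the canonical at the matching desktop page so web search
     focuses on the desktop version. URLs are placeholder examples. -->
<link rel="canonical" href="http://www.example.com/widgets">
```

When a smartphone then requests the desktop URL, you would redirect that visitor to the mobile version, as described in the quoted answer.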
In my opinion, I wouldn't worry too much about "traditional" cell phones. I've found that since the beginning of the year, on STP, we've only had 1 or 2 sales via dumb phones, and only a fraction of the traffic we see from smartphones.
-
Mobile browsing and shopping are on the increase, so this will probably be worth doing at some point. It may be easier to judge whether now is the time if you go into Analytics, see how many of your visitors are already using mobile devices, and check whether you are seeing growth in this area. If you haven't already read it, this post is worth a look: http://www.seomoz.org/blog/seo-for-the-ipad