Google Mobile Algorithm Update
-
Hi there,
On April 21st, Google is apparently going to update their mobile algorithm. I have a few questions about this one.
- Our current mobile website is very mobile friendly.
- We block all mobile pages with a noindex tag, so the desktop pages are the ones indexed for mobile searches.
- We use a redirect from desktop page to mobile page when someone hits a result on a mobile device.
My gut tells me this is not April 21st-proof, so I'm thinking about making the whole setup adaptive. That way, our mobile pages will be indexed instead of the desktop pages. Two questions:
- Will Google treat the mobile page as a 100% different page than the desktop page, or will it match the two because everything tells Google they belong together? In other words: will the mobile page start with zero authority, and will pages lose good organic positions because of that?
- Which ranking factor will be stronger after April 21st for mobile pages: page authority or mobile-friendliness? In other words: is it worth ignoring the April 21st update because the authority of the desktop pages matters more than making every page super mobile-friendly?
Hope to get some good advice!
Marcel
-
Hi Dirk,
That sounds great. Thanks for your help - I'm going to act on this solution.
Marcel
-
Marcel
with this setup, Google considers your mobile site the mobile view of your desktop site, so it should not have an impact on authority. Most sites we have in this setup show the same rankings for mobile & desktop searches.
rgds
Dirk
-
Hi DC1611 & Matt-POP,
Thank you very much for your responses. This one is clear to me: I have exactly one month to fix this. Mobile traffic plays a big part in our daily revenue, so this is serious.
@Matt-POP: your advice is to go responsive, but I would rather go for a canonical/adaptive solution, because a responsive website is not really an option (it causes lots of other 'challenges'). Would you also advise going the canonical/adaptive way? The idea is to exchange links in the source between the mobile and desktop versions. It will look like this:
The bold parts will be filled in dynamically.
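For reference, the bidirectional annotations described in Google's separate-URLs documentation look something like this (the example hostnames and paths are placeholders, not your actual URLs):

```html
<!-- On the desktop page, e.g. http://www.example.com/page-1, in the <head>: -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://m.example.com/page-1">

<!-- On the corresponding mobile page, http://m.example.com/page-1, in the <head>: -->
<link rel="canonical" href="http://www.example.com/page-1">
```

With the rel="alternate" tag on the desktop page and the rel="canonical" tag on the mobile page pointing back at it, Google treats the two URLs as one entity, which is what should let the desktop page's authority carry over to mobile searches.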
So again (@DC1611): will the mobile page receive (some of) the page authority of the desktop version this way? Thanks again.
Marcel
-
Your gut is right. If you are blocking your "very mobile friendly" pages from Googlebot-mobile, you will likely end up with a big mess. If your desktop site is NOT responsive and is indexed for many terms on mobile, I would expect you to lose those terms.
If you ignore the April 21 update and keep only a desktop site indexed, you are definitely in for a traffic drop, especially from mobile.
We have clients with 1-3% mobile traffic (industrial services, crane hire, that stuff) and those with 50%+ mobile (a few beauty salons, ecommerce stores, etc.), and here's what I've been telling our clients: if you have 3-5% mobile traffic, it's probably not the end of the world if it takes you until June 1st to make the switch. If you have 20-30% mobile traffic, get it fixed asap. And if you're over 30% mobile traffic, you absolutely cannot afford not to have a responsive site up by April 21st. So it's a priority - but how much of one depends on your business.
-
Blocking your mobile pages from indexing is not the best strategy if you have a dedicated mobile site. Better to use canonicals pointing to the main domain - full explanation of how to do it here: https://developers.google.com/webmasters/mobile-sites/mobile-seo/configurations/separate-urls
With this setup, Google associates the mobile & desktop versions, so in terms of page authority they should be equal. I would not ignore the April 21st date - Google is trying to make a point here - so for mobile searches, mobile-friendliness will be a first priority.
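Since the current setup redirects desktop URLs to mobile URLs based on the device, the same Google guide also recommends sending a Vary HTTP header on those responses, so crawlers know the content differs by user agent. On Apache with mod_headers enabled (an assumption about your stack - adapt for your server), that can be as simple as:

```apacheconf
# Tell crawlers and caches that responses differ per user agent
Header append Vary User-Agent
```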
rgds,
Dirk