Google Webmaster Warning for Non-mobile Optimized Pages
-
I just received a warning in Webmaster Tools that my site's pages are not optimized for mobile devices and that those pages will be demoted in mobile search results. I just got it yesterday and see no drop yet, but has anyone else seen this?
The notice states that the site is not configured for the viewport, legible text sizes, or proper spacing between clickable elements. All of that is true, since we have not yet completed our responsive design conversion.
Any idea if Google will give us a little time to get this resolved, or do my rankings start dropping right away?
Just when you think you are moving forward with Google, they pull the rug out again...
-
I just hope they give us some more time to get things resolved and roll this change out slowly, so it doesn't hit like a major penalty.
-
Lots of great responses from the Moz community. Thank you. I got pretty concerned about this because a lot of our organic traffic comes from mobile. I'm taking this seriously and have already hired a programmer to make the site fully responsive. My deadline for completion is 10-14 days, so I hope Google gives us enough time to implement the changes.
At first I was a bit disheartened, but this is actually pushing me to get changes done that I have been procrastinating on, so maybe it is a good thing.
I just hope the criteria Google uses for mobile compatibility are transparent enough that it is obvious what needs to be fixed to be considered mobile friendly.
Best Regards,
-
We got the same warning here at Moz in WMT. While responsiveness isn't affecting rankings yet, it's coming, and Google is letting us know ahead of time. Additionally, if your competitors aren't mobile-optimized, this may be a great way to get a leg up on them.
-
I got the same warning for one of our brand sites, but I don't really monitor its rankings at the moment, as these are only holding sites.
-
The shift is happening. I just did a review in Google Analytics and noticed that in 2010 we had 0.5% of our traffic from mobile devices; today it is over 13%. This trend is not likely to reverse. Google is trying to be proactive here.
-
Hi Lawrence,
I agree with Ryan. As search shifts more and more toward mobile, Google is starting to let webmasters know whether their websites are mobile friendly. I have noticed more warnings on some of our old websites and a few on our new ones. We are just going back through each one and making sure they are mobile friendly. The good news is that there are plenty of tools and resources to get this done pretty fast.
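For reference, the three issues the notice names map to fairly small changes. A minimal sketch of the kind of markup involved (the specific pixel values here follow commonly cited recommendations, not any guaranteed Google threshold):

```html
<head>
  <!-- Tell mobile browsers to render at device width instead of a zoomed-out desktop layout -->
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <style>
    /* Keep body text legible on small screens */
    body { font-size: 16px; }
    /* Give tap targets enough room; ~48px is the size Google's guidance suggests */
    a, button { min-height: 48px; min-width: 48px; }
  </style>
</head>
```

A full responsive conversion obviously goes further than this, but these three items are what the warning is explicitly checking for.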
-
Hi Lawrence. You're actually ahead of the curve on this one. Think of all the sites that aren't even set up with Google Webmaster Tools accounts, so you'll have some time before seeing a harsh change in the rankings. That said, MANY people are seeing this warning, and Google has stated publicly that it wants site owners to use more mobile-friendly layouts. It sounds like you have the redesign in the works, so consider this a little extra motivation to get it done soon... Cheers!
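If you want a quick self-check while the redesign is in progress, here's a hypothetical sketch using only the Python standard library (the function and class names are my own, and this covers just one of the three criteria: the presence of a viewport meta tag):

```python
from html.parser import HTMLParser


class ViewportChecker(HTMLParser):
    """Scans an HTML document for a <meta name="viewport"> tag."""

    def __init__(self):
        super().__init__()
        self.has_viewport = False

    def handle_starttag(self, tag, attrs):
        # HTMLParser lowercases tag and attribute names for us
        if tag == "meta" and dict(attrs).get("name", "").lower() == "viewport":
            self.has_viewport = True


def page_has_viewport(html: str) -> bool:
    """Return True if the page declares a mobile viewport."""
    checker = ViewportChecker()
    checker.feed(html)
    return checker.has_viewport
```

Running this over your key landing pages (fetched however you like) would at least tell you which templates still need the viewport declaration before Google's change rolls out.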