Should I fetch in WMT with all 4 options?
-
When we ask Google to fetch a page, I usually just do the desktop one. However, should I be using the other three options as well: Mobile Smartphone, Mobile xHTML, and Mobile cHTML? Since they give you the options, I assume that doing only desktop means it won't go to mobile until a regular crawl, but I just want to make sure that is the case.
Thanks,
Ruben
-
Each fetch option in Google Search Console has a different purpose:
Desktop - This is the default; it fetches the page the way Google's regular desktop crawler would (webpages, images, videos, and so on).
Mobile smartphone - This option uses Google's smartphone crawler.
Mobile xHTML - This option does not support rendering and uses the SAMSUNG XHTML/WML crawler.
Mobile cHTML - This option is mostly for Japanese feature phones; it uses the DoCoMo Google mobile crawler and also does not support rendering.
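In practice, the four options amount to requesting the same URL with a different crawler User-Agent header. A minimal sketch of that idea is below; the User-Agent strings are illustrative approximations, not Google's official list.

```python
# Sketch: the fetch options differ mainly in which crawler User-Agent
# string requests the page. Strings below are approximations.
from urllib.request import Request, urlopen

FETCH_OPTIONS = {
    "desktop": "Mozilla/5.0 (compatible; Googlebot/2.1; "
               "+http://www.google.com/bot.html)",
    "mobile_smartphone": "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X) "
                         "AppleWebKit/537.36 (KHTML, like Gecko) Mobile "
                         "Safari/537.36 (compatible; Googlebot/2.1; "
                         "+http://www.google.com/bot.html)",
    "mobile_xhtml": "SAMSUNG-SGH-E250 (compatible; Googlebot-Mobile/2.1; "
                    "+http://www.google.com/bot.html)",
    "mobile_chtml": "DoCoMo/2.0 N905i (compatible; Googlebot-Mobile/2.1; "
                    "+http://www.google.com/bot.html)",
}

def headers_for(option):
    """Build the request headers the chosen fetch option would send."""
    return {"User-Agent": FETCH_OPTIONS[option]}

def fetch_as(url, option):
    """Fetch `url` the way the chosen crawler would (makes a network call)."""
    req = Request(url, headers=headers_for(option))
    with urlopen(req, timeout=10) as resp:
        return resp.read()
```

Comparing the responses returned for each option shows whether your server treats those crawlers differently.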
Source here
-
The main use case is when you have a website that serves different versions of the site based on user agent. In that situation you want to see exactly what Googlebot can retrieve from your site. It can also help you check mobile redirects that are likewise based on user agents.
The one case where it can't help is responsive design, because there the bots see only one version of the HTML.
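A minimal sketch of the kind of user-agent-based serving described above (the pattern and URLs are hypothetical, not any particular site's setup):

```python
import re

# Hypothetical example of choosing a site variant by User-Agent --
# the sort of setup "Fetch as Google" lets you verify from the bot's side.
MOBILE_PATTERN = re.compile(r"Mobile|Android|iPhone|DoCoMo", re.IGNORECASE)

def variant_for(user_agent):
    """Decide which site variant a given User-Agent header should receive."""
    if MOBILE_PATTERN.search(user_agent):
        return "https://m.example.com/"   # mobile variant (hypothetical URL)
    return "https://www.example.com/"     # desktop variant
```

Fetching with each option in Search Console confirms that each crawler lands on the variant you intended.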
Related Questions
-
Fetch as Google temporarily lifting a penalty?
Hi, I was wondering if anyone has seen this behaviour before? I haven't! We have around 20 sites, and each one has lost all of its rankings (not in the index at all) since the Medic update, apart from when specifying a location on the end of a keyword.

I set to work trying to identify a common issue on each site, and began by improving speed issues in Insights. On one site I realised that after I had improved the speed score and then clicked "Fetch as Google", the rankings for that site all returned within seconds. I did the same for a different site, with exactly the same result. Cue me jumping around the office in delight! The pressure is off, people's jobs are safe, have a cup of tea and relax. Unfortunately this relief only lasted between 6-12 hours, and then the rankings went again.

To me it seems like the sites are all suffering from some kind of on-page penalty which is lifted until the page can be assessed again, at which point the penalty is reapplied. Not one to give up, I set about methodically making changes until I found the issue. So far I have completely rewritten a site, reduced overuse of keywords, and added over 2,000 words to the homepage. I clicked "Fetch as Google" and the site came back - for 6 hours. So then I gave the site a completely fresh redesign and again clicked "Fetch as Google", with the same result. Since doing all that, I have swapped over to HTTPS, 301 redirected, etc., and now the site is completely gone and won't come back after fetching as Google. Uh!

So before I dig myself even deeper, has anyone any ideas? Thanks.
Technical SEO | semcheck11
-
Why are these URLs suddenly appearing in WMT?
One of our clients has experienced a sudden overnight increase in smartphone crawl errors for pages which no longer exist, and there are no links to these pages according to Google. There is no evidence as to why Google would suddenly start to crawl these pages, as they have not existed for over 5 years, but it does come after a new site design was put live. The pages do not appear to be in the index when a site: search is used. There was a similar increase in crawl errors on desktop initially after the new site went live, but these quickly returned to normal; the mobile crawl errors only became apparent after this. Some of the URLs show no linking page detected, so we don't know where these URLs are being found. WMT states "Googlebot couldn't crawl this URL because it points to a non-existent page". Those that do have a linking page show an internal page which also doesn't exist, so it can't possibly link to any page. Any insight is appreciated. Andy and Mark at Click Consult.
Technical SEO | ClickConsult0
-
WMT Change of address when moving to subdir
A client is moving a site that has had its own domain for years to a subdirectory of their main domain. So, product.com is going to brand.com/product, with all pages and paths staying the same and no content being removed. We have 301-redirected the old domain to brand.com/product; however, should we also submit a change of address in Google's WMT? The reason I ask is that the site won't be located at brand.com but at brand.com/product, and we do not want to confuse Google, as the 301 redirects will not match the change of address exactly.
Technical SEO | piperis0
-
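The domain-to-subdirectory move in the question above boils down to a path-preserving redirect map; a minimal sketch, using the question's own (hypothetical) domains:

```python
from urllib.parse import urlsplit

# Sketch of the path-preserving 301 mapping described above:
# every path on product.com keeps its path under brand.com/product.
def redirect_target(old_url):
    """Map a URL on the old domain to its new location under the subdirectory."""
    parts = urlsplit(old_url)
    return "https://brand.com/product" + parts.path

print(redirect_target("https://product.com/widgets/blue"))
# -> https://brand.com/product/widgets/blue
```

Because every old path maps one-to-one to a new path, the 301 rules stay simple, which is exactly why the change-of-address mismatch in the question is the only open concern.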
Flatlined traffic starting between April 29 and May 4 2013
One of our clients, an ecommerce shop, has seen a significant drop in their organic traffic and I'm trying to determine whether this was the result of a data refresh or algo update. The site has few inbound links and unfortunately still has a lot of duplicate content on it (manufacturer provided product descriptions). There are also some remaining issues of duplicate page titles that we've been working through. The client has also been writing blogs recently, however there are a number which are relatively short in length. Does anyone have a suggestion as to how I can start recovering from this?
Technical SEO | bobbygsy0
-
Google WMT continues reporting fixed 404s - why?
I work with a news site that had a heavy restructuring last spring. This involved removing many pages that were duplicates, tags, etc. Since then, we have taken very careful steps to remove all links coming into these deleted pages, but for some reason, WMT continues to report them. By last August, we had cleared over 10k 404s on our site, but this lasted only for about 2 months and then they started coming back. The "linked from" field gives no data, and other crawlers like SEOmoz aren't detecting any of these errors. The pages aren't in the sitemap, and I've confirmed that they're not really being linked to from anywhere. Why do these pages keep coming back? Should I even bother removing them over and over again? Thanks -Juanita
Technical SEO | VoxxiVoxxi0
-
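One way to sanity-check a list of reported 404s like the ones above is to re-request each URL and record its current status code. A minimal sketch (treating only 404 and 410 as genuinely gone is an assumption, not WMT's rule):

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError

def status_of(url):
    """Return the current HTTP status code for `url` (makes a network call)."""
    try:
        with urlopen(Request(url, method="HEAD"), timeout=10) as resp:
            return resp.status
    except HTTPError as err:
        return err.code

def is_gone(status):
    """Treat only 404 (Not Found) and 410 (Gone) as truly removed pages."""
    return status in (404, 410)
```

Running this over the WMT error list at least confirms the pages really do return 404 today, separating a reporting lag from a real regression.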
Unnatural Link Warning Removed - WMT's
Hi, just a quick one. We had an unnatural link warning for one of our test sites; the message appeared on the WMT dashboard. The message is no longer there. Has it simply expired, or could this mean that Google no longer sees an unnatural backlink profile? Hoping it's the latter, but doubtful, as we haven't tried to remove any links.. as I say, it's just a test site. Thanks in advance!
Technical SEO | Webpresence0
-
I am trying to correct an error report of duplicate page content, but in over 100 blogs I am unable to find the page which contains content similar to the page SEOmoz reported. Is my only option to just delete the blog page?
I am trying to correct duplicate content. However, SEOmoz only reports and shows the page of duplicate content. I have 5 years' worth of blogs and cannot find the duplicate page. Is my only option to just delete the page to improve my rankings? Brooke
Technical SEO | wianno1680
-
Adjust the priority field under the XML sitemap option
For those familiar with this in Drupal - is this worth doing? It seems to be a setting that affects the priority of a URL compared to others on the site. It's set to a default of 0.5 but you can increase up to 1.0 I think. Anyone know about this? thanks
Technical SEO | inhouseninja0
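For reference, the priority setting discussed in the question above ends up as a per-URL `<priority>` tag in the generated sitemap, a value between 0.0 and 1.0 that is only a hint relative to other URLs on the same site. A minimal sketch of what any generator emits (URLs are hypothetical):

```python
# Sketch of a minimal XML sitemap with per-URL priority values
# (0.0-1.0, default 0.5). Priority is a relative hint, not a command.
ENTRY = (
    "  <url>\n"
    "    <loc>{loc}</loc>\n"
    "    <priority>{priority:.1f}</priority>\n"
    "  </url>\n"
)

def sitemap(entries):
    """Render (url, priority) pairs as a minimal XML sitemap."""
    body = "".join(ENTRY.format(loc=loc, priority=p) for loc, p in entries)
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + body + "</urlset>\n"
    )
```

Since search engines treat the value only as a relative signal, raising everything to 1.0 is equivalent to leaving everything at 0.5; it matters only if some URLs are set higher than others.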