Removal tool - no option to choose mobile vs desktop. Why?
-
Google's removal tool doesn't give you the option to specify which index (mobile-friendly or desktop/laptop) a URL should be removed from. Why?
I may have a fundamental misunderstanding. I thought it worked like this: when you have a dynamically generated page based on the user agent (i.e., the SAME URL, but formatted differently for smartphones than for desktops/laptops), the Google mobile bot indexes the mobile-friendly version and the desktop bot indexes the desktop version, so Google ends up with two different indexed results for the same URL. That SEEMS to be supported by the 'mobile-friendly' label that appears next to some of my page descriptions when searching on mobile devices.
HOWEVER, if that's how it works, why wouldn't Google let you remove one of the URLs and keep the other? Is it because Google assumes a mobile version of a website must have all of the same pages as the desktop version? What if it doesn't? What if a website is designed so that some of the slower pages simply aren't given a mobile version? Is it possible that Google doesn't actually store separate results for a mobile-friendly page when a corresponding desktop page exists, but only checks that it renders OK? That is, it keeps only one indexed copy of each URL and basically assumes the mobile title and content are the same, with only the formatting being different? That assumption isn't always true (mobile devices lend themselves to different interactions with the user), but it certainly could save Google billions of dollars in storage.
Thoughts?
-
Thanks for your reply, but the link you pointed me to doesn't cover my situation. I'm not redirecting to separate URLs. Mine is this: https://developers.google.com/webmasters/mobile-sites/mobile-seo/configurations/dynamic-serving
I HAVE to figure out what Google is doing in my situation, because if I don't and I assume wrong, lots of my desktop or mobile-friendly pages won't be indexed.
Surely lots of website owners have had their developers create a minimal mobile-friendly site, with less content and fewer pages than desktop users get, and chosen the dynamic-serving approach, but I have yet to receive a reply from anyone who has faced that issue. It's a very serious issue for me: either I have to consider dumping dynamic serving in favor of separate mobile URLs (if that would even work), or I have to do a ton of programming to add content so that all the URLs have both mobile and desktop versions.
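For anyone unfamiliar with the setup being described, dynamic serving means one URL whose response body changes with the User-Agent, with a `Vary: User-Agent` header so crawlers and caches know the content differs by device. A minimal sketch (the keyword list and templates are illustrative assumptions, not a real detection scheme):

```python
# Sketch of dynamic serving: the SAME URL returns different HTML depending
# on the User-Agent, and the Vary header signals that the response varies.
# MOBILE_KEYWORDS is a simplistic stand-in for real device detection.

MOBILE_KEYWORDS = ("iphone", "android", "mobile", "blackberry")

def render_page(url_path: str, user_agent: str) -> dict:
    """Return status, headers, and body for one URL, varied by device."""
    is_mobile = any(k in user_agent.lower() for k in MOBILE_KEYWORDS)
    body = f"<html><body>{'mobile' if is_mobile else 'desktop'} view of {url_path}</body></html>"
    return {
        "status": 200,
        "headers": {
            "Content-Type": "text/html",
            # The key part of dynamic serving: tell crawlers the content
            # served at this URL depends on the requesting User-Agent.
            "Vary": "User-Agent",
        },
        "body": body,
    }
```

The point is that both "versions" live at one URL; there is no second URL to remove, which may be part of why the removal tool offers no mobile/desktop choice.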
-
**There's a lengthy discussion on it here:** https://developers.google.com/webmasters/mobile-sites/mobile-seo/configurations/separate-urls?hl=en
Notably this section:
> For Googlebot, we do not have any preference and recommend that webmasters consider their users when deciding on their redirection policy. The most important thing is to serve correct and consistent redirects, i.e. redirect to the equivalent content on the desktop or mobile site. If your configuration is wrong, some users may not be able to see your content at all.
In my opinion, trying to figure out what Google does on the backend is a losing proposition. Maybe they index both; maybe they index one. Heck, there's no way to know if they even index full text now for every site. There's certainly a lot of optimization going on in the back-end that is above and beyond our purview as SEO practitioners.
Google says they don't care what kind of redirect you use for mobile. Likely that means your mobile pages are being semantically linked to your desktop versions of the pages; they specifically recommend against redirecting multiple separate pages to the same mobile page. They also recommend that you add a link that lets mobile users click over to the desktop version for usability. That's good enough for me.
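The "correct and consistent redirects" guidance quoted above boils down to a one-to-one mapping: each desktop URL sends mobile visitors to its equivalent mobile page, and a page with no mobile equivalent is served as-is rather than dumped on a mobile homepage. A sketch under those assumptions (all paths hypothetical):

```python
# Sketch of consistent mobile redirects for a separate-URLs configuration.
# Each desktop path maps to its EQUIVALENT mobile path; there is no
# catch-all redirect to a generic mobile homepage.

DESKTOP_TO_MOBILE = {
    "/products/widget": "/m/products/widget",
    "/about": "/m/about",
}

def mobile_redirect(desktop_path: str) -> tuple:
    """Return (status, location) for a mobile visitor on a desktop URL."""
    mobile_equivalent = DESKTOP_TO_MOBILE.get(desktop_path)
    if mobile_equivalent:
        # Equivalent mobile content exists: redirect to it.
        return (302, mobile_equivalent)
    # No mobile equivalent: serve the desktop page in place rather than
    # redirecting to an unrelated mobile page.
    return (200, desktop_path)
```

This also illustrates the original poster's dilemma: a page without a mobile equivalent simply falls through to the desktop version instead of breaking.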
Related Questions
-
Content From API - Remove or Redirect?
Intermediate & Advanced SEO | PaddyM556
Hi Guys, I am working on a site at the moment. The previous developer used an API to pull in healthcare content (HSE). The API basically generates landing pages within the site, along with their content. To date it has generated over 2k pages; some rank organically and some don't.
New site being launched: the "health advice" section where this content used to live will not be included in the new site, so this content will have no place to be displayed.
My query: would you let the old content die off in the migration process and just become 404s, or would you 301 redirect all (or only the ranking) pages to the homepage? Other considerations: the site will be moved to https://, so it will be submitted to Search Console and re-indexed by Google. Would love to hear if anyone has had a similar situation or has suggestions.
Best regards,
Pat
-
Magento & Accelerated Mobile Pages
Intermediate & Advanced SEO | Patrick_556
Hi Folks, with Google rolling out changes to AMP and webmasters being encouraged to implement it, has anyone had any experience implementing AMP for Magento ecommerce? I understand that AMP is primarily for articles and blog posts, but assuming AMP could be implemented on product pages, they would load faster and offer a better user experience; a step in the right direction. What do you guys think? Many thanks,
Patrick
-
Mobile Meta Descriptions?
Intermediate & Advanced SEO | jayoliverwright
Hi Guys, we have two different versions of each page for desktop and mobile, for example: http://tinyurl.com/zkxlxax and http://tinyurl.com/zelqcbv. We want to create meta descriptions for both versions, but the CMS (http://www.mantistech.com.au/ecommerce_website_package.aspx) only allows one meta description (example: http://s21.postimg.org/ar8bzrh3r/screenshot_1804.jpg). Does anyone know any way around this, i.e. to add two different meta descriptions and tell Google which one to use based on device type? Thanks.
-
Question about robots file on mobile devices
Intermediate & Advanced SEO | Andy-Halliday
Hi, we have a robots.txt file, but do I need to create a separate file for the m.site, or can I just add the line into my normal robots file? I've just read the Google guidelines (what a great read it was) and couldn't find my answer. Thanks in advance, Andy
-
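For what it's worth, robots.txt is scoped per host: a subdomain such as m.example.com is crawled against the file at its own root, and directives in the main site's robots.txt do not apply to it. A minimal sketch (hostnames and paths hypothetical):

```
# Served at http://m.example.com/robots.txt - the m. subdomain's own file.
# Rules in http://www.example.com/robots.txt do NOT cover this host;
# each hostname needs its own robots.txt at its own root.
User-agent: *
Disallow: /private/
```

So adding m.site lines to the main site's robots file would not work; the mobile host needs its own file.
-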
Should I remove the sitemap from the main site at a webshop (footer link) and only submit the .XML in Webmaster Tools?
Intermediate & Advanced SEO | Mickelp
Case: webshop with over 2,000 products. I want to make a logical sitemap for Google to follow. What is best practice in this area? Should I remove the on-page HTML sitemap (shown as a footer link called "sitemap") and only have domain.com/sitemap.xml? Links to great articles about making sitemaps are appreciated too. The system is Magento, if that changes anything.
-
Remove Landing Pages?
Intermediate & Advanced SEO | ScottBaxterWW
Howdy guys, I've just been listening to the latest edition of Whiteboard Friday regarding the over-optimization penalty. I'm just wondering if we should remove a lot of make-specific landing pages. For instance, we have landing pages for our top 20 cars, such as "bmw keyword" or "audi keyword". What do you guys think: remove them and 301 the pages to the homepage? Thanks, Scott
-
What to do with non-existing products (removed products)?
Intermediate & Advanced SEO | BeytzNet
Hello, I'm selling unique products, only one of a kind of each. This means that whenever a product is sold, it is removed from display. In order not to upset Google by continually removing indexed pages, I created a "sold items" page that links to all of the removed products. The problem is (or maybe it's not a problem) that I've reached the point where I have more sold items than existing items, and the list keeps growing. What should I do with the non-existing items? Was I correct?
Added info: The site is built with main category pages, each showing a large number of products. Most of these products got indexed by Google, and each product has its own unique URL (products do not return). Once a product is sold it no longer appears in the product categories; I only have a general "sold items" page in the footer that shows all of them (with a lot of pagination). Since the products change rapidly, I thought it would upset Google to have a hundred 301 redirects every week or two. Since the products are very similar to one another (only different measurements, colors, etc.), I thought of linking from a sold item to a similar available item, so if Google directs someone it will probably be to the available product. The problem is that the sold items now outnumber the available items four to one, and I don't think a store should display 2008's t-shirts in 2012. Another problem with so many products is that I'm afraid the one product type that sells much more often will eventually dominate the entire site: I will end up with 8,000 sold items of that product, 1,000 sold items of other products, and 1,000 available miscellaneous products. This might also start causing duplication problems, as the products are quite similar. Should I stop with the "sold" products and use 301s? Thanks
-
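The "link a sold item to a similar available item" idea in the question above can be sketched as a 301-target picker. The similarity rule (same category, first in-stock match, category page as fallback) is an assumption for illustration, not the asker's actual logic:

```python
# Sketch: choose a 301 redirect target for a sold-out product.
# Prefer a similar in-stock product; fall back to the category page
# rather than the homepage, so the redirect stays relevant.

def redirect_target(sold_item: dict, available: list) -> str:
    """Return the URL a sold product's 301 should point at."""
    same_category = [p for p in available if p["category"] == sold_item["category"]]
    if same_category:
        # A similar in-stock product exists: send visitors there.
        return same_category[0]["url"]
    # Nothing similar in stock: the category page is the next-best match.
    return f"/category/{sold_item['category']}"
```

This keeps each removed URL pointing at equivalent live content instead of accumulating thousands of stale "sold" pages.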
Robots.txt: Link Juice vs. Crawl Budget vs. Content 'Depth'
Intermediate & Advanced SEO | kurus
I run a quality vertical search engine. About six months ago we had a problem with our sitemaps, which resulted in most of our pages getting tossed out of Google's index. As part of the response, we put a bunch of robots.txt restrictions in place on our search results to prevent Google from crawling through pagination links and other parameter-based variants of our results (sort order, etc.). The idea was to 'preserve crawl budget' in order to speed the rate at which Google could get our millions of pages back in the index by focusing attention and resources on the right pages. The pages are back in the index now (and have been for a while), and the restrictions have stayed in place since that time. But, in doing a little SEOmoz reading this morning, I came to wonder whether that approach may now be harming us:
http://www.seomoz.org/blog/restricting-robot-access-for-improved-seo
http://www.seomoz.org/blog/serious-robotstxt-misuse-high-impact-solutions
Specifically, I'm concerned that a) we're blocking the flow of link juice, and b) by preventing Google from crawling the full depth of our search results (i.e. pages > 1), we may be making our site wrongfully look 'thin'. With respect to b), we've been hit by Panda and have been implementing plenty of changes to improve engagement, eliminate inadvertently low-quality pages, etc., but we have yet to find 'the fix'. Thoughts? Kurus