Is there an issue if we show our old mobile site to Google & new site to users
-
Hi,
We have our existing mobile site, which contains interlinking in the footer and content, and a new mobile site that does not have that interlinking. We plan to show the existing mobile site to the Google crawler and the new mobile site to users. Will this be treated as black hat by Google?
The mobile site and desktop site will have the same URL across devices and browsers.
Regards
-
If you serve different content to the crawler, that's cloaking, and you will face the wrath of Google (de-indexing).
If you serve different content based on user-agent, screen width, etc., it should not get you de-indexed for cloaking.
Google has very precise instructions on what a site owner should do in your case, and serving different HTML and CSS based on user-agent is perfectly fine, as long as you follow their instructions on how to supply the Google crawler additional information through HTTP headers.
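To make the distinction concrete, here is a minimal Python sketch of that approach (often called "dynamic serving"): one URL returns different HTML depending on the requesting device, and a `Vary: User-Agent` response header tells Google and caches that the response legitimately differs by user-agent. The function name and token list below are illustrative, not a real framework API.

```python
# Minimal sketch of "dynamic serving": same URL, different HTML chosen by
# device class, declared openly via the Vary: User-Agent header.
MOBILE_TOKENS = ("mobile", "android", "iphone")

def render_page(user_agent):
    """Return (html, headers) for a single URL, varying only by device class."""
    is_mobile = any(token in user_agent.lower() for token in MOBILE_TOKENS)
    if is_mobile:
        html = "<html><body>mobile layout</body></html>"
    else:
        html = "<html><body>desktop layout</body></html>"
    # The crucial part: declare the variation openly rather than special-casing
    # Googlebot's user-agent string, which is what turns this into cloaking.
    headers = {"Content-Type": "text/html", "Vary": "User-Agent"}
    return html, headers
```

The point is that the branch is on *device*, not on *whether the visitor is Googlebot*; Googlebot's smartphone crawler sees exactly what a phone user sees.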
-
Hi Guys,
Thanks for your valuable input. However, our mobile sites (both the old and new mobile site) and desktop site will have the same URL across devices and browsers. The only difference is the content and design of the old versus the new mobile site. Will this still be considered black hat?
Regards
-
Based on personal experience I can confirm what Laura said: the Google crawler is extremely skilled at recognizing cloaking, and it's very aggressive at penalizing websites that cloak content.
-
If you have a new mobile site, best practice is to 301 redirect the old site to the new site. If you do that, the value of your links will pass to the new mobile site which will help you maintain your rankings. Laura is correct, what you are proposing is black hat and very risky.
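A 301 of that kind is usually done at the web server level. A minimal sketch, assuming Apache with mod_rewrite and placeholder domain names (adapt the host and target to your own setup):

```apache
# Permanently redirect every URL on the old mobile host to the same path
# on the new site, so link equity follows the 301s.
# "m.old-example.com" and "www.example.com" are placeholders.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^m\.old-example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

The path-preserving rule (`$1`) matters: redirecting every old URL to the new homepage instead would waste most of the link value.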
-
This is definitely a violation of Google Webmaster Guidelines, and I would certainly consider it a black hat tactic. You risk doing more harm than good.
Here is what Google has to say about it:
It's a violation of Google Webmaster Guidelines to redirect a user to a different page with the intent to display content other than what was made available to the search engine crawler.
Related Questions
-
Core Web Vitals hit Mobile Rankings
Hey all,

Ever since Google announced "Core Web Vitals", our mobile rankings have nose-dived. At first, I thought it was the optimisation changes to the page titles we had made, which might still be part of the issue. However, desktop rankings actually increased for the same pages where mobile decreased.

There is a plan to introduce a new ranking signal into the Google algorithm called "Core Web Vitals", and this was discussed around late May. Even though it's supposed to be fully rolled into a ranking signal later this year or early next, I think Google continuously tests and releases these items before any official release. If you weren't aware, there is a section in Google Webmaster Tools related to Core Web Vitals, which looks at:

1. Loading
2. Interactivity
3. Visual Stability

This overlays some of the other basic requirements of a good website and mobile experience. Taking a look at our Google Search Console, it appears to be the following:

1. Mobile: 1,006 poor URLs, 100 URLs need improvement, and 475 good URLs.
2. Desktop: 0 poor URLs, 379 need improvement, and 1,200 good URLs.

SOURCE: https://search.google.com/search-console/core-web-vitals?resource_id=https%3A%2F%2Fwww.griffith.ie%2F

In the report, we can see two distinct issues with the mobile pages:

- CLS issue: more than 0.25 (mobile) - 1,006 cases
- LCP issue: longer than 4 secs (mobile) - 348 cases

CLS (Cumulative Layout Shift): This is a developer issue and needs fixing. It's basically when the mobile screen jumps for the user. It is explained in this article: https://web.dev/cls/ It seems to be an issue on all pages.

LCP (Largest Contentful Paint): Again, another developer fix that needs to be implemented. It's connected to page speed, and can be viewed here: https://web.dev/lcp/ Looking at GSC, it looks like the blog content is mostly to blame.

It's worth fixing these issues and again looking at the other items on page speed score tests:

1. Leverage browser caching: https://gtmetrix.com/reports/griffith.ie/rBtvUC0F
2. https://developers.google.com/speed/pagespeed/insights/?url=griffith.ie - the mobile score for the home page is 16/100, and https://www.griffith.ie/people/thamil-venthan-ananthavinayagan is 15/100.

I think this is the biggest indicator of the issue at hand. Has anybody else noticed their mobile rankings go down while desktop stays the same or increases?

Kind regards,
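As an aside on the CLS cases described above, the most common fix is reserving space for images and embeds so the page doesn't jump as assets load. A minimal illustration (file path, alt text, and dimensions are placeholders):

```html
<!-- Explicit width/height lets the browser allocate space before the image
     downloads, preventing layout shift (CLS); the CSS keeps it responsive. -->
<img src="/images/hero.jpg" alt="Campus view" width="800" height="450"
     style="max-width: 100%; height: auto;" loading="lazy">
```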
Web Design | | robhough909
Rob
-
Old Speed Reports
Hey all,
We just moved our website to WordPress and are trying to compare the new page speed/user experience reports we are receiving to the old ones. Looking at dates before the relaunch, it looks as if the old site speed reports are unavailable. Does anyone know of a way to recover older reports? We want to compare the responsive site/new desktop version to the old one (end users had to "pinch" the screen to look at content on the old site). Thanks in advance if you have any info!
Web Design | | SEOSponge
-
Site Doing Horribly After Redesign
Hello Fellow Forum Members: Thank you all for taking the time to read this. This is in follow-up to one of my previous questions, but I now have more information. I will try to be as concise as possible and want to sincerely thank anybody who invests time in answering this.

Around February 9, 2013, we launched our new site on the Bigcommerce platform. We moved from Volusion after 6 years. We had paid the Bigcommerce partner for an upgraded 301 redirect package, as I was thoroughly concerned about losing rankings. By the end of February our rankings were diminishing. We expected a slight dip due to the new site. As of May, our organic traffic had dropped by 82%. Google WMT is showing 1500+ 404 errors. Many have to do with review-page-type URLs, and some were apparently just plain never redirected.

In May, we hired a wonderful SEO company that is a heavy contributor to the Moz community. They have been generous and wonderful to work with. By the end of this last week it was determined that most of the coding suggestions our SEO was making could NOT be implemented in Bigcommerce, because Bigcommerce will not allow our developer access to the PHP files, thus hindering the execution of these suggestions. Some of these were: move the blog to the root, use canonical on the home page, use canonical for pagination, stop the indexing of https URLs, and a few more.

Today, June 25, we are at a complete loss and trying to just keep our business alive. The opinion of both the SEO and the developer is that my choice of Bigcommerce as a platform was not the best. So my main question is: what are the odds our rankings have decreased due to the lack of 301 redirects during our migration to Bigcommerce, versus the rankings decreasing due to Bigcommerce being a bad choice as a platform? We are being advised to redevelop our entire site on an open source platform such as WordPress or Magento, but if that's not needed I certainly don't want to have to do that.
I hope I have provided a decent amount of history and information. Thank you for any help/advice you are willing to offer.
Web Design | | josh3300 -
Image sliders & site speed
We are having a new website designed using WordPress and the Genesis framework. We wanted to include header image sliders on a number of internal site pages, but our designer says that sliders on more than just the home page will slow down the site significantly. How much could they slow down the site, and what can be done to minimize their effect on site speed?
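One common mitigation, assuming the sliders are image-based, is to load only the first slide eagerly and defer the rest with native lazy-loading, so extra sliders add little to the initial page weight. The markup below is a sketch with placeholder paths and dimensions, not your designer's actual implementation:

```html
<!-- Only the first slide loads up front; the rest wait until needed. -->
<div class="slider">
  <img src="/slides/one.jpg" alt="Slide 1" width="1200" height="400">
  <img src="/slides/two.jpg" alt="Slide 2" width="1200" height="400" loading="lazy">
  <img src="/slides/three.jpg" alt="Slide 3" width="1200" height="400" loading="lazy">
</div>
```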
Web Design | | GordyH0 -
Managing international sites
Hi all, I am trying to figure out the best way to manage our international sites. We have two locations, one in the UK and one in the USA. I currently use GEOIP to identify the location of the browser and redirect them using a cookie to index.php?country=uk or index.php?country=usa. Once the cookie is set I use a 301 redirect to send them to index.php, so that Google doesn't see each URL as duplicate content, which Webmaster Tools was complaining about. This has been working wonderfully for about a year. It means I have a single PHP language include file, and depending on the browser location I will display $ or £ and change the odd "ise" to "ize", etc.

The problem I am starting to notice is that we are ranking better and better in the USA search results. I am guessing this is because the crawlers must be based out of the USA. This is great, but my concern is that I am losing rank in the UK, which is currently where most of our business is done.

So I have done my research, and because I have a .net I will go for a /uk/ or /us/ subfolder and create two separate Webmaster Tools sites, set up to target each geographic location. Is this okay? http://support.google.com/webmasters/bin/answer.py?hl=en&answer=182192#2

HERE IS THE PROBLEM: I don't want to have to run two separate websites with two separate sets of copy. Also, I don't want to lose all the rank data on URLs like http://www.mysite.net/great-rank-result.html, which now becomes http://www.mysite.net/uk/great-rank-result.html. On top of this I will have two pages, the one just mentioned and now http://www.mysite.net/us/great-rank-result.html, which I presume would be seen as duplicate copy? (Y/N) Can I use rel canonical to overcome this? How can I do this without actually running the two pages?

Could you actually have one site in the root folder and just use the same GEOIP technology to do a smart mod_rewrite, adding either UK or US to the URL, and thereby be able to create two Webmaster accounts targeting each geographic location? Any advice is most welcome.
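As a side note on the duplicate-copy worry: hreflang annotations are the usual way to tell Google that /uk/ and /us/ pages are regional variants rather than duplicates (rel=canonical across the variants would instead hide one version from one market). A sketch using the URLs from the question, placed in the `<head>` of both variants:

```html
<!-- Each regional page lists itself, its alternate, and an x-default
     fallback for visitors outside both regions. -->
<link rel="alternate" hreflang="en-GB" href="http://www.mysite.net/uk/great-rank-result.html">
<link rel="alternate" hreflang="en-US" href="http://www.mysite.net/us/great-rank-result.html">
<link rel="alternate" hreflang="x-default" href="http://www.mysite.net/great-rank-result.html">
```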
Web Design | | Mediatomcat0 -
Old school HTML and rankings
How does really old school HTML (with inline CSS and a boat load of markup errors) affect modern SEO? I'm talking purely rankings, not conversions or bounce rate etc.
Web Design | | DavidWilsonSEO0 -
Development site accidentally crawled - Will this cause problems?
We are currently developing a new version of our website, and to make it easy to access for all team members, we set it up on a server accessible via a publicly accessible domain name (i.e. devsite.com). There has been no SEO and no links created to this site, or so I thought. Recently, I found out that Google somehow found its way to this development site and has been indexing the pages! I was a little alarmed, as there are no links to the domain and we'll soon be transitioning all the content over to our primary production domain. I immediately created a robots.txt file to disallow access to the entire development domain.

My fear is that there may be some duplicate content penalty if Google sees that the content on our new site (once it goes live and is pushed to our REAL domain name) was previously indexed on our test domain. We're slated to launch in 2-3 weeks. Is there anything else I should do? Should I even be worried? I'm probably a bit paranoid, but given the amount of time and effort that has gone into this new site, I'd love any advice or thoughts. Thank You!
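For reference, a robots.txt that blocks all crawling of the dev host looks like the sketch below. One caveat worth knowing: a robots.txt block stops further crawling but does not remove pages already in the index, and it also prevents Googlebot from re-crawling the pages to see any noindex tag on them. HTTP authentication on the dev host is generally the more reliable long-term fix.

```
# robots.txt at the root of the development domain only -
# blocks all further crawling of every path.
User-agent: *
Disallow: /
```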
Web Design | | AndrewY0 -
Old SEO keyword "articles", are they hurting rankings?
Hello,

About two years ago, the company I work for hired an SEO firm to improve organic rankings on our site. The SEO company's primary method for doing this was producing "articles" that are not really articles but keyword-stuffed pages with lots of hidden internal links to other legitimate pages on our site. Examples:

http://www.creamright.com/Isi-Chargers-articles.html
http://www.creamright.com/How-To-Make-Whipped-Cream-article.html
http://www.creamright.com/Cream-Whipper-articles.html

Obviously, this strategy wasn't greatly successful and we cancelled our work with the firm. However, we still have all of the "articles" on the site (about 50-60 pages total) and each page is navigable from the HTML and XML sitemaps. Additionally, the SEO firm we used built a lot of useless links to these pages from BS directory sites, which are all still active.

The question I have is whether we should remove these "article" pages or leave them alone? Although I'm sure they aren't helping any of our SEO efforts, could deleting the pages after two years negatively impact our search rankings? Thanks in advance for any help on this,

Doug M.
Web Design | | Loganshark1