Will putting a one page site up for all other countries stop Googlebot from crawling my UK website?
-
I have a client who only wants UK users to be able to purchase from their UK site.
Currently, customers from the US and other countries are purchasing from the UK site.
They want a single webpage to be displayed to users outside the UK who try to access the UK site.
This is fine, but what impact would it have on Googlebot trying to crawl the UK website?
I have scoured the web for an answer but can't find one. Any help would be greatly appreciated. Thanks
-
Thank you kindly
-
If you're an SEO / marketing person, the likelihood is that you'll need a developer's help. They should know what you mean when you ask them to exempt a certain user agent (Googlebot) from the redirects.
-
Is there some guidance on how to do this? I have looked on the web but am struggling to find anything. Thanks
-
No impact, as long as you exempt Google's user agent ('Googlebot') from the redirects. That way Google can crawl properly without it turning into a fiasco, which it will if you don't exempt them: Google sometimes crawls from different data centres (though only one at a time), and each data centre has a fixed location.
If you fail to exempt Googlebot, it will only see the page rendered for the location of the data centre it happens to be crawling from, so Google will think your site keeps flipping between one page and many pages, which will cause a huge mess.
Remember: exempt Googlebot from the redirects, or things will get messy.
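For the developer implementing this, the exemption can be sketched in a few lines. This is a minimal illustration, not production code: the country code is assumed to come from a GeoIP lookup you supply, and a user-agent string alone can be spoofed, so a production version should also verify the crawler's IP via reverse DNS (resolving to googlebot.com or google.com).

```python
def is_googlebot(user_agent: str) -> bool:
    """Naive user-agent check for Googlebot.

    User agents can be spoofed; production code should also verify the
    client IP with a reverse-DNS lookup before trusting this.
    """
    return "googlebot" in (user_agent or "").lower()


def should_geo_redirect(user_agent: str, country_code: str,
                        home_country: str = "GB") -> bool:
    """Decide whether a visitor should be sent to the single
    international page instead of the full UK site."""
    if is_googlebot(user_agent):
        # Never redirect the crawler, so the full site stays crawlable
        # regardless of which data centre it crawls from.
        return False
    return country_code != home_country
```

The same check can often be done at the web-server level instead (e.g. matching the User-Agent header in the server config before the geo-redirect rule fires), which keeps the exemption out of application code.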
Related Questions
-
Page speed or site speed: which one does Google consider a ranking signal?
I've read many threads online claiming that website speed is a ranking factor. A friend's website scores 44 (slow) on Google PageSpeed Insights, yet despite his website being slow, he outranks me in Google search results. It confuses me that I optimized my website for speed, but my competitor's slow site outperforms mine. On Six9ja.com, I did the work to hit my target score of 100 (fast) on Google PageSpeed Insights. But in Google Search Console, some of my pages show average scores and some show slow scores; Search Console says none of my pages are fast. So where did the fast metrics go? Could it be because I added three AdSense JavaScript snippets to all my blog posts? If so, that would mean the AdSense code is slowing page performance despite having an async tag. I tested my blog post speed and found my page score was reduced by 48 due to the three AdSense scripts: I got 62 (average). So now my site speed is 100, but my page speed is 62. Does this mean Google considers page speed rather than site speed as a ranking factor? Screenshots: https://imgur.com/a/YSxSwOG Regarding: https://etcnaija.com
Technical SEO | etcna
One page of the site has disappeared from the SERPs for a month now
I'm working on a client's site and have been promoting a specific page for a keyword. It started to move up the rankings. Exactly a month ago, on 19/5 (the same day as the last update), I updated the main page with new content and published some other new pages on related subjects, all linking to the main page (without the same anchor text in the links). On the same day, I found out that, because of a technical error, the new content had been published on 5 other pages of the site, obviously creating a duplicate content issue, and I removed all the duplicates that same day. I assume Google caught this and punished the site for the duplicate content, but when I search for the page directly with site: I can find it. It's been a month since I fixed every issue I thought could impact the page: no duplicate content on the site, no keyword stuffing, no spammy links to the page. Everything seems fine now. My questions: Why is my page not showing? How long should I wait before giving up and creating a new page? How come my site has not lost any organic traffic (apart from that specific page)? Is it possible to penalize only one page? Can I recover from this at all? Thanks
Technical SEO | nira
Best practice for blocking a site from one country's search engines
A client cannot appear in any search engines in one given country, but they are fine in the rest of the world. Has anybody had any experience blocking a site from appearing in just google.de, bing.de and yahoo.de, for example?
Technical SEO | Salience_Search_Marketing
Why is an error page showing when searching our website using Google's "site:" search function?
When I search our company website using the Google site search function "site:jwsuretybonds.com", a 400 Bad Request page is at the top of the listed pages. I had someone else at our company do the same site search and the 400 Bad Request did not appear. Is there a reason this is happening, and are there any ramifications?
Technical SEO | TheDude
Are there any tools that will detect when new pages are added?
I need help keeping track of when new pages pop up that might compete internally with already-optimized pages.
Technical SEO | SEOmoxy
When was the last time Google crawled my site?
How do I tell the last time Google crawled my site? I found out it is not the "Cache" date, which I had thought it was.
Technical SEO | digitalops
How do you diagnose if your site is only 50% crawled?
Good morning from a 7°C Wetherby, UK; goodbye, arctic conditions. If a site had 100 pages, for example, and that site was plugged into Webmaster Tools, how could you diagnose whether all the pages had been crawled? The thing is, I want to learn how to diagnose crawl issues with sites. Is there a known methodology for this? Thanks in advance, David
Technical SEO | Nightwing
How far into a page will a spider crawl to look for text?
How far into a page will a spider crawl to look for text? I've heard a spider will only crawl the first 3 KB, but I can't find an authoritative source for that information.
Technical SEO | crvw