Accidentally blocked our site for an evening?
-
Yesterday at about 5pm I switched our site to a new server and accidentally blocked it from Google for the evening. Our domain is posnation.com, and we rank in the top 3 for almost all POS-related keywords. When I got in this morning I realized the mistake, went to Google Webmaster Tools, and saw that the site was blocked, so I used Fetch as Googlebot and corrected it. Now the message says:
Check to see that your robots.txt is working as expected. (Any changes you make to the robots.txt content below will not be saved.)
robots.txt file | Downloaded | Status
http://www.posnation.com/robots.txt | 1 hour ago | 200 (Success)
When you go to Google and type "pos systems" we are still #2, so I assume all is still OK. My question is: will this potentially hurt our rankings, should I be worried, and is there anything else I can do?
-
If you have any sort of caching installed, you could try refreshing it and resubmitting the sitemap.
I checked your robots.txt file at http://tool.motoricerca.info/robots-checker.phtml and it flagged the "Allow" line. I don't think that would cause a problem, but you could try removing the "Allow: /" line and see if that helps.
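For reference, a minimal sketch of a robots.txt that allows everything (assuming that is the intent here): the empty `Disallow:` is the form defined by the original robots exclusion standard, while `Allow: /` is a nonstandard extension that Googlebot and most major crawlers nonetheless support.

```
User-agent: *
Disallow:
Sitemap: http://posnation.com/sitemap.xml
```

Since validators sometimes flag the nonstandard `Allow` directive, switching to the empty `Disallow:` form expresses the same thing without the warning.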
-
Hey Nick, thanks for your response. I did the first part, but when I resubmit the sitemap (sitemap.xml) it won't go through due to this error:
URL restricted by robots.txt
But my sitemap file is here: http://posnation.com/sitemap.xml, and my robots.txt is not blocking it. Any ideas on what to do next?
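One way to sanity-check whether the sitemap URL is actually blocked is to run the robots.txt rules through Python's standard-library parser offline. This is a sketch; the file contents below are an assumption about what the live robots.txt looks like (the "URL restricted" error in Webmaster Tools can also come from a stale cached copy of the file, which parsing the current rules would rule out).

```python
from urllib import robotparser

# Assumed robots.txt contents, matching the permissive setup discussed above.
ROBOTS_TXT = """\
User-agent: *
Allow: /
Sitemap: http://posnation.com/sitemap.xml
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# If these rules are what Google last fetched, neither the sitemap
# nor the homepage should be restricted for Googlebot.
print(rp.can_fetch("Googlebot", "http://posnation.com/sitemap.xml"))  # True
print(rp.can_fetch("Googlebot", "http://posnation.com/"))             # True
```

If the parser says the URL is allowed but Webmaster Tools still reports it restricted, Google is most likely working from a cached robots.txt fetched during the outage, and the error should clear once it re-crawls the file.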
-
No, you're OK. It used to be that if your site went down for even a few hours and the spiders came around, you could get deindexed. Now I guess they understand that stuff happens, and thankfully you have a pretty long grace period before you get deindexed.
Good suggestions by Nick. You can also increase the Googlebot crawl rate on your site in GWMT to get Google to come around again more quickly.
-
If it was just blocked overnight you should be OK. Sites do go down for extended periods of time occasionally, and I would assume Google won't de-index based on a relatively short outage.
To be safe, or at least to make yourself feel like you have done what you can: resubmit your XML sitemap in Webmaster Tools. Also go to the "Fetch as Googlebot" section and fetch your home page. Once it is fetched, click the submit link and tell it to submit the page and all linked pages. You are probably OK without doing that, but it couldn't hurt to resubmit.