Disappeared from Google within 2 hours of a Webmaster Tools error
-
Hey guys,
I'm trying not to panic, but... we had a problem with Google indexing some of our secure pages; visitors hitting those pages were getting browser security warnings, so I asked our web dev to have a look at it.
He made the changes below, and within 2 hours the site has dropped off the face of Google:
“In Webmaster Tools I asked it to remove any https://freestylextreme.com URLs” “I cancelled that before it was processed”
“I then set up the robots.txt to respond with a disallow-all if the request was for an HTTPS URL” “I've now removed robots.txt completely” “and resubmitted the main site from Webmaster Tools”
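For context, the protocol-specific robots.txt the dev describes is commonly done with an Apache rewrite like the one below. This is a hypothetical sketch — the actual rule used on the server wasn't shared — that serves a separate, disallow-all robots file only when the request comes in over HTTPS:

```apache
# Sketch: serve a different robots file for HTTPS requests only
RewriteEngine On
RewriteCond %{HTTPS} on
RewriteRule ^robots\.txt$ /robots_https.txt [L]
```

where a hypothetical `robots_https.txt` would contain:

```
User-agent: *
Disallow: /
```

The danger is that if the rewrite condition misfires (or the HTTP and HTTPS sites share one document root), the disallow-all file can end up served for every request, blocking the whole site from crawling.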
I've read a couple of blog posts, and they all say to remain calm, test Fetch as Googlebot in Webmaster Tools (which looks fine for us), and just wait for Google to reindex.
Do you guys have any further advice?
Ben
-
Thanks for the responses, guys. The site was picked back up in around 4 hours and thankfully lost no rank. Orders crashed but are back to normal now! I'm going to investigate the two versions of the site — that is a bit strange.
Again, thanks for your help.
-
Hi Ben,
It's now 2 days after your original question, and it looks like you're back in the SERPs, at least from what I can tell. Hopefully you've made a full recovery.
It's difficult to know exactly what damage was done by your dev in Webmaster Tools, but it's reasonable to assume that whatever it was caused the drop.
One thing I did notice is that both the HTTPS and HTTP versions of your site resolve:
http://www.freestylextreme.com/ and
https://www.freestylextreme.com/
Ideally, one would redirect to the other, or at a minimum you'd have a rel canonical tag in place so that only one version is crawled.
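To consolidate on a single version, a typical Apache setup 301-redirects HTTP to HTTPS (or the other way around). This is a hypothetical sketch — I don't know how your server is actually configured:

```apache
# Sketch: 301 redirect every HTTP request to its HTTPS equivalent
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://www.freestylextreme.com/$1 [R=301,L]
```

Alternatively (or additionally), each page can declare its preferred URL in the `<head>`:

```html
<link rel="canonical" href="https://www.freestylextreme.com/" />
```

A 301 is the stronger signal of the two, since it also consolidates link equity onto one version.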
I'd go ahead and put your robots.txt file back in place, then check the site yourself in Google Webmaster Tools to make sure everything is okay.
Best of luck!
-
Hi Ben
I understand the panic in such a situation. I truly do. I checked your website and noticed that you do not currently have a robots.txt file. You also do not have any kind of noindex directive in your code, and you look like a strong, established website.
Do you have an XML sitemap in your Webmaster Tools console? I would suggest you build one and submit it if you don't already have one.
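If you need to create one, a minimal XML sitemap follows the sitemaps.org protocol and looks like this (the URL and date here are just placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.freestylextreme.com/</loc>
    <lastmod>2013-01-01</lastmod>
  </url>
  <!-- one <url> entry per page you want crawled -->
</urlset>
```

Submitting it in Webmaster Tools also gives you the submitted-vs-indexed counts, which are useful for spotting recovery after an incident like this.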
Other than that, I would also suggest having a robots.txt file even if it's blank, rather than a 404 redirecting to your homepage. Give it 24-48 hours and, in my opinion, you should be back.
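To illustrate why the robots.txt response code matters, here is a simplified sketch of how major crawlers tend to interpret the HTTP status returned for /robots.txt. This is a rough approximation for illustration, not Google's exact documented behavior:

```python
def robots_crawl_policy(status_code: int) -> str:
    """Simplified sketch of how a crawler might treat the HTTP status
    returned for /robots.txt (approximate, not Google's exact rules)."""
    if 200 <= status_code < 300:
        return "parse rules"       # file exists: read and apply its directives
    if 400 <= status_code < 500:
        return "allow all"         # missing file: treated as no restrictions
    return "assume disallow"       # server error: crawling is deferred
```

The point being: a clean 404 for a missing robots.txt is harmless, but a redirect to the homepage returns 200 with HTML in place of directives, which is ambiguous, and a 5xx error can pause crawling entirely.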
I hope this helps.