Best blocking solution for Google
-
Posting this for Dave Sottimano! Here's the scenario: you've got a set of URLs indexed by Google, and you want them out quickly. Once you've managed to remove them, you want to block Googlebot from crawling them again, for whatever reason. Below is a sample of the URLs you want blocked, but you only want to block /beerbottles/ and anything past it:

www.example.com/beers/brandofbeer/beerbottles/1
www.example.com/beers/brandofbeer/beerbottles/2
www.example.com/beers/brandofbeer/beerbottles/3
etc.

To remove the pages from the index, should you:

1. Add a meta robots noindex,follow tag to each URL you want de-indexed
2. Use GWT to help remove the pages
3. Wait for Google to crawl again

If that's successful, to block Googlebot from crawling again, should you add this line to robots.txt:

Disallow: */beerbottles/

Or this line:

Disallow: /beerbottles/

"To add the * or not to add the *, that is the question." Thanks! Dave
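One way to sanity-check the two candidate rules before touching the live robots.txt: the sketch below is a minimal matcher that approximates Googlebot's pattern handling (rules are anchored at the start of the URL path, `*` matches any run of characters, and a trailing `$` anchors the end). This is an illustrative approximation, not Google's actual implementation; the paths are from the example URLs above.

```python
import re

def googlebot_match(rule: str, path: str) -> bool:
    """Approximate Googlebot's Disallow matching: the rule is
    anchored at the start of the path, '*' matches any characters,
    and a trailing '$' anchors the end of the path."""
    pattern = re.escape(rule).replace(r"\*", ".*")
    if pattern.endswith(r"\$"):
        pattern = pattern[:-2] + "$"
    return re.match(pattern, path) is not None

path = "/beers/brandofbeer/beerbottles/1"

# Without the wildcard, the rule only matches paths that *start*
# with /beerbottles/, so this nested URL is not blocked:
print(googlebot_match("/beerbottles/", path))   # False

# With the leading wildcard, the rule matches /beerbottles/
# anywhere in the path:
print(googlebot_match("*/beerbottles/", path))  # True
```

Under these assumptions, the nested URLs in the example only match the wildcard form of the rule.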
-
Following up here -- did this answer Dave's question?
-
I would put noindex,follow on those pages and wait a little until they disappear from Google's index. Of course, if you only have a few pages, I would do it manually in GWT. If you have a fairly big site with a good crawl rate, this should be done in a few days.
When you don't see them anymore, you could use Disallow: */beerbottles/, but this could be annoying later. I would recommend using the meta robots tag instead, as it gives you more control. It will also allow PageRank to keep flowing into the beerbottles pages!
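For reference, the meta robots tag described above would sit in the `<head>` of each /beerbottles/ page, something like:

```html
<!-- Keeps the page out of the index while still letting
     crawlers follow (and pass equity through) its links -->
<meta name="robots" content="noindex,follow">
```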
-
I believe you can also confirm the block via Webmaster Tools.
-
Hi Goodnewscowboy,
To block the whole folder you don't need to use the wildcard (*),
and I'd advise you to also do these steps:
- Verify your ownership of the site in Webmaster Tools.
- On the Webmaster Tools home page, click the site you want.
- On the Dashboard, click Site configuration in the left-hand navigation.
- Click Crawler access, and then click Remove URL.
- Click New removal request.
- Type the URL of the page you want removed, and then click Continue. Note that the URL is case-sensitive—you will need to submit the URL using exactly the same characters and the same capitalization that the site uses.
- Select Remove page from cache only.
- Select the checkbox to confirm that you have completed the requirements listed in this article, and then click Submit Request.
Cheers