Pages blocked by robots.txt in Google Webmaster Tools
-
- A mistake was made in the software. How can I solve the problem quickly? Please help.
-
OK,
Updated.
I need to wait 1 week then.
Thank you, Gaston
-
Place that Sitemap line at the end of the file, like this:

User-agent: *
Disallow:

Sitemap: https://site.com/sitemap.xml

Best of luck!
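If you want to double-check the file after editing, Python's standard-library robots.txt parser can confirm the Sitemap line is being picked up. A minimal sketch, assuming the placeholder domain and sitemap URL from the example above:

```python
from urllib.robotparser import RobotFileParser

# robots.txt laid out as suggested above: the Sitemap line goes at the end
robots_txt = """User-agent: *
Disallow:

Sitemap: https://site.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# site_maps() (Python 3.8+) returns the Sitemap URLs the parser found
print(parser.site_maps())  # ['https://site.com/sitemap.xml']
```

If `site_maps()` comes back empty or None, the Sitemap line is malformed and crawlers may ignore it.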
-
-
It's better to set it as I've described before:

User-agent: *
Disallow:

It would take over a week, depending on the scale of the website.
-
-
Hello there!
Have you identified all the pages that you want to be blocked?
If you do not want any page to be blocked, just edit your robots.txt as:

User-agent: *
Disallow:

For more information, check these resources:
How to block URLs with robots.txt - Google Webmasters
About /robots.txt - robotstxt.org

Best of luck!
GR
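To verify that the allow-all robots.txt above really leaves every URL crawlable, you can test it with Python's built-in parser. A quick sketch; the URL and user agent are just example values:

```python
from urllib.robotparser import RobotFileParser

# The allow-all robots.txt from the answer above: an empty Disallow blocks nothing
parser = RobotFileParser()
parser.parse(["User-agent: *", "Disallow:"])

# Any path should be fetchable by any crawler
print(parser.can_fetch("Googlebot", "https://site.com/any/page.html"))  # True
```

Running the same check against your live file (via `set_url()` and `read()`) is a fast way to confirm a fix before waiting on Google to re-crawl.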
Related Questions
-
Would You Redirect a Page if the Parent Page was Redirected?
Hi everyone! Let's use this as an example URL: https://www.example.com/marvel/avengers/hulk/ We have done a 301 redirect for the "Avengers" page to another page on the site. Sibling pages of the "Hulk" page live off "marvel" now (ex: /marvel/thor/ and /marvel/iron-man/). Is there any benefit in doing a 301 for the "Hulk" page to live at /marvel/hulk/ like its sibling pages? Is there any harm long-term in leaving the "Hulk" page under a permanently redirected page? Thank you! Matt
One page ranking for all key words, when other targeted pages not ranking
Hi everyone, I am fairly new to SEO but have a basic understanding. I have a page that has a lot of content on it (including brand names and product types and relevant info) ranking for quite a few key words. This is cool, except that I have pages dedicated to each specific key word that are not ranking. The more specific page still has a lot of relevant text on it too. eg. TYRES page - Ranks first for "Tyres". Ranks okay for many tyre key words, including "truck tyres"
TRUCK TYRES page - not ranking for "truck tyres" Further on, I then have pages not ranking all that well for more specific key words when they should. eg HONDA TRUCK TYRES - Then has a page full of product listings - no actual text. Not Ranking for "honda truck tyres". ABC HONDA TRUCK TYRE - not ranking for "abc honda truck tyre" key word
These pages don't have a lot of content on them, as essentially every single tyre is the same except for the name. But they do have text. So sometimes, these terms don't rank at all. And sometimes, the first TYRES page ranks for it. I have done the basic on-page SEO for all these pages (hopefully properly) including meta desc, meta titles, H1, H2, using key words in text, alt texting images where possible etc. According to MOZ they are optimised in the 90%. Link building is difficult as they are product listings, so other sites don't really link to these pages. Has anyone got ideas on why the top TYRES page might be so successful and outranking more specific pages? Any ideas on how I can get the other pages ranking higher as they are more relevant to the search term? We are looking into a website redesign/overhaul so any advice on how I can prevent this from happening on essentially a new site would be great too. Thanks!
Google Webmaster Remove URL Tool
Hi All, To keep this example simple.
You have a home page. The home page links to 4 pages (P1, P2, P3, P4).

Home page
P1 P2 P3 P4

You now use the Google Webmaster removal tool to remove the P4 webpage and its cached instance. 24 hours later you check and see P4 has completely disappeared. You now remove the link from the home page pointing to P4.

My Question
Does Google now see only pages P1, P2 & P3 and therefore allocate link juice at a rate of 33.33% each?

Regards, Mark
Wordpress - Dynamic pages vs static pages
Hi, Our site has over 48,000 indexed links, with a good mix of pages, posts and dynamic pages. For the purposes of SEO and the recent talk of "fresh content" - would it be better to keep dynamic pages as they are or manually create static pages/subpages? The one noticeable downside with dynamic pages is that they aren't picked up by any sitemap plugins; you need to manually create a separate sitemap just for these dynamic links. Any thoughts?
Why Would This Old Page Be Penalized?
Here's an old page on a trustworthy domain with no apparent negative SEO activity according to OSE and ahrefs: http://www.gptours.com/Monaco-Grand-Prix They went from page 1 to page 13 for "monaco grand prix" within about 4 weeks. Week 2 we pulled out all the duplicate content in the history section. When rank slipped further, we put it back. Yet it's still moving down, while other pages on the website are holding strong. Next steps will be to add some schema.org/Event microformats, but beyond that, do you have any ideas?
Site less than 20 pages shows 1,400+ pages when crawled
Hello! I’m new to SEO, and have been soaking up as much as I can. I really love it, and feel like it could be a great fit for me – I love the challenge of figuring out the SEO puzzle, plus I have a copywriting/PR background, so I feel like that would be perfect for helping businesses get a great jump on their online competition. In fact, I was so excited about my newfound love of SEO that I offered to help a friend who owns a small business on his site. Once I started, though, I found myself hopelessly confused. The problem comes when I crawl the site. It was designed in Wordpress, and is really not very big (part of my goal in working with him was to help him get some great content added!) Even though there are only 11 pages – and 6 posts – for the entire site, when I use Screaming Frog to crawl it, it sees HUNDREDS of pages. It stops at 500, because that is the limit for their free version. In the campaign I started here at SEOmoz, it says over 1,400 pages have been crawled…with something like 900 errors. Not good, right? So I've been trying to figure out the problem...when I look closer in Screaming Frog, I can see that some things are being repeated over and over. If I sort by the Title, the URLs look like they’re stuck in a loop somehow - one line will have /blog/category/postname…the next line will have /blog/category/category/postname…and the next line will have /blog/category/category/category/postname…and so on, with another /category/ added each time. So, with that, I have a few questions: Does anyone know what the problem is, and how to fix it? Do professional SEO people troubleshoot this kind of stuff all of the time? Is this the best place to get answers to questions like that? And if not, where is? Thanks so much in advance for your help! I’ve enjoyed reading all of the posts that are available here so far, it seems like a really excellent and helpful community...I'm looking forward to the day when I can actually answer the questions!! 🙂
How long for Google Webmaster tools to update/reflect link changes
Hi all, Does anyone know or have experience of how long GWMT takes to update its data? We did some work on our link profile back in October/November but are still seeing old links (removed) showing in GWMT. Thanks in advance,