Does SEOmoz recognize duplicate URLs blocked by robots.txt?
-
Hi there:
Just a newbie question...
I found some duplicate URLs in the SEOmoz Crawl Diagnostics reports that should not be there; they are intended to be blocked by the site's robots.txt file.
Here is an example URL (Joomla + VirtueMart structure):
http://www.domain.com/component/users/?view=registration
and here is the blocking rule in the robots.txt file:
User-agent: *
Disallow: /components/
My questions are:
Will this kind of duplicate-URL error be removed from the error list automatically in the future?
Do I have to keep track of which errors should not really be in the error list?
What is the best way to handle these errors?
Thanks and best regards
Franky
-
Hello Franky,
Yes, our crawler obeys robots.txt files. If you recently made that change to your robots.txt, it should be reflected in your next crawl. If the error doesn't go away, feel free to let us know at help@seomoz.org. Thanks for letting us know!
-Abe
-
Don't be too worried about SEOmoz's errors; just be aware of them. If you have set up your robots.txt file correctly for search-engine robots, they should take notice and there shouldn't be any issues. Always be sure to check Google Webmaster Tools (GWT) for errors; those are the ones you should fix ASAP.
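For what it's worth, you can sanity-check a robots.txt rule locally instead of waiting for the next crawl. Here is a minimal sketch using Python's standard `urllib.robotparser`, with the URL and rule copied from the question above. Note that robots.txt matching is a literal prefix test, so `/component/` and `/components/` are different paths:

```python
from urllib import robotparser

# Rules copied from the question: the robots.txt blocks /components/.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /components/",
])

# The reported URL starts with /component/ (no trailing "s"), so the
# Disallow prefix does not match and an obedient crawler may fetch it.
print(rp.can_fetch("*", "http://www.domain.com/component/users/?view=registration"))  # True

# A URL that actually lives under /components/ would be blocked as expected.
print(rp.can_fetch("*", "http://www.domain.com/components/com_virtuemart/"))  # False
```

If `can_fetch` returns True for a URL you expected to be blocked, the crawler is behaving correctly and the Disallow rule itself needs adjusting.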
Related Questions
-
Duplicate Site Content found in Moz; Have a URL Parameter set in Google Webmaster Tools
Hey, so on our site we have a Buyer's Guide that we made. Essentially it is a pop-up with a series of questions that then recommends a product. The parameter ?openguide=true can be used on any URL on our site to pull this Buyer's Guide up. Somehow the Moz Site Crawl reported each one of our pages as duplicate content, since it added this string (?openguide=true) to each page. We already have a URL parameter set in Google Webmaster Tools as openguide; however, I am now worried that Google might be seeing this duplicate content as well. I have checked all of the pages with duplicate title tags in Webmaster Tools to see whether it is detecting duplicate content, and I did not find any duplicate-title pages caused by the openguide parameter. I am just wondering if anyone knows:
1. a way to check if Google is seeing it as duplicate content,
2. how to make sure that the parameter is set correctly in Webmaster Tools,
3. or a better way to prevent the crawler from thinking this is duplicate content. Any help is appreciated! Thanks, Mitchell Chapman
www.kontrolfreek.com
Moz Pro | MitchellChapman
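Beyond the Webmaster Tools parameter setting, a common belt-and-braces fix for parameter duplicates like this is a rel="canonical" tag on every variant pointing at the parameter-free URL. Below is a minimal sketch in Python of deriving that canonical URL; the `openguide` parameter name comes from the question, while the `canonical_url` helper and the `/page` path are hypothetical:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

def canonical_url(url, drop_params=("openguide",)):
    """Strip the given query parameters so every variant maps to one canonical URL."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in drop_params]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(canonical_url("http://www.kontrolfreek.com/page?openguide=true"))
# -> http://www.kontrolfreek.com/page
```

The resulting URL is what you would emit in a `<link rel="canonical" href="...">` tag on each variant, so crawlers consolidate the duplicates onto one page.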
How to fix overly dynamic URLs for Volusion site?
We're currently getting over 5439 pages with an 'overly dynamic URL' warning in our Moz scan. The site runs on Volusion. Is there a way to fix this seemingly Volusion-specific issue?
Moz Pro | Brandon_Clay
Duplicate page title
Hello, my page has this, yet the SEOmoz crawl says that these pages have duplicate titles. If my blog has 25 pages, I have, according to SEOmoz, 25 duplicate titles. Can someone tell me if this is correct, or whether the SEOmoz crawl cannot recognize rel="next", or if there is another, better way to tell Google that pages generated from the blog have the same title? Should I ignore these SEOmoz errors? Thank you,
Moz Pro | maestrosonrisas
How to remove URLs blocked by robots.txt from Crawl Diagnostics
I suddenly have a huge jump in the number of errors in Crawl Diagnostics, and it all seems to be down to a load of URLs that should be blocked by robots.txt. These have never appeared before; how do I remove them or stop them from appearing again?
Moz Pro | SimonBond
SEOmoz Keyword Difficulty API
Is there a way to get these numbers from an API, or a better method than pasting 5 keywords at a time into the Keyword Difficulty tool?
Moz Pro | insitegoogle
SEOmoz crawling filtered pages
Hi, I just checked an SEO campaign we started last week, so I opened SEOmoz to see the crawl diagnostics. Lots of duplicate content and duplicate titles are showing up, but that's because Rogerbot is crawling all of the filtered pages as well. How do I exclude these pages from being crawled?
/product/brand-x/3969?order=brand&sortorder=ASC
/product/brand-x/3969?order=popular&sortorder=ASC
/product/brand-x/3969?order=popular&sortorder=DESC&page=10
/product/brand-x/3969?order=popular&sortorder=DESC&page=11
Moz Pro | nvs.nim
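All of the filtered URLs share the ?order= query parameter, so one option, assuming the crawler honors Googlebot-style wildcard patterns in robots.txt (support varies by crawler, so verify this for Rogerbot before relying on it), is a sketch like:

```
User-agent: *
Disallow: /*?order=
```

An alternative that keeps the pages crawlable is a rel="canonical" tag on each filtered variant pointing at the unfiltered /product/brand-x/3969 page.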
Recommend SEOmoz PRO Tools on LinkedIn
Hey Moz Community, Because honest user reviews are the best way to inform people about SEOmoz PRO tools and benefits, we'd love for those of you who are on LinkedIn to leave a recommendation for our product: http://mz.cm/urHa1e If you do choose to leave a review, please be honest in what you say. Even if it's not 100% hearts & flowers, we'd rather you keep it real. Thanks!
Moz Pro | EricaMcGillivray
Too many pages indexed in SEOmoz
I am running a campaign for a client that has 86 pages via Google, and SEOmoz is up to almost 10K pages. I am really confused. Any ideas?
Moz Pro | LaurieK13