Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Correct robots.txt for WordPress
-
Hi. I recently launched a website on WordPress (1 main page and 5 internal pages). The main page got indexed right off the bat, while the other pages seem to be blocked by robots.txt. Would you please look at my robots.txt file and tell me what's wrong?
I wanted to block the contact page, plugin elements, users' comments (I have a discussion section on every page of my website), and the site search section (to prevent duplicate pages from appearing in Google search results). It looks like one of the lines is blocking every page after "/" from being indexed, even though everything seems right.
Thank you so much.
-
Me too. Can you upload or screenshot the actual file that you are using?
-
I have edited it down to
User-Agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/
Disallow: /wp-admin/
Disallow: /contact/
Disallow: /refer/
It didn't help. I still get a "Blocked by robots.txt" message after submitting the URL for indexing in Google Webmaster Tools. I'm really puzzled.
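One way to sanity-check a robots.txt file like this before guessing at the problem is to parse it locally and ask which URLs it actually blocks. Here's a minimal sketch using Python's standard-library `urllib.robotparser` (the domain and page paths are placeholders, not the poster's real site):

```python
from urllib import robotparser

# The robots.txt rules the poster says they are using, verbatim.
rules = """\
User-Agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/
Disallow: /wp-admin/
Disallow: /contact/
Disallow: /refer/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# An ordinary internal page is allowed, and /contact/ is blocked,
# as intended -- so the rules themselves look fine.
print(rp.can_fetch("*", "https://example.com/about/"))    # expect: True
print(rp.can_fetch("*", "https://example.com/contact/"))  # expect: False
```

If the rules pass a check like this but Search Console still reports "Blocked by robots.txt", the file Google is actually fetching at `/robots.txt` may differ from the one you edited (a plugin or cached copy can override it), so comparing the live file against your edit is the next step.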
-
Hi, in addition to the answer that effectdigital gave, here's another option, optimised for WordPress:
User-Agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/
Disallow: /wp-admin/
Disallow: /readme.html
Disallow: /refer/
Sitemap: http://www.example.com/post-sitemap.xml
Sitemap: http://www.example.com/page-sitemap.xml
-
That just seems overly complex, with way more in there than there needs to be.
I'd go with something that 'just' does what you have stated you want to achieve, and nothing else:
User-Agent: *
Disallow: /wp-content/plugins/
Disallow: /comments
Disallow: /*?s=
Disallow: /*&s=
Disallow: /search
See if that helps.
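A caveat worth knowing when testing rules like `Disallow: /*?s=`: Python's `urllib.robotparser` treats Disallow values as plain path prefixes and does not implement the Google-style `*` and `$` wildcards, so you need a small matcher of your own to preview how Googlebot would read them. This is a rough sketch of that wildcard behaviour only (it checks Disallow rules in isolation and ignores Google's real longest-match precedence between Allow and Disallow):

```python
import re

def google_rule_to_regex(rule: str) -> "re.Pattern[str]":
    # In Google's robots.txt handling, '*' matches any run of
    # characters and a trailing '$' anchors the end of the URL path.
    pattern = re.escape(rule).replace(r"\*", ".*")
    if pattern.endswith(r"\$"):
        pattern = pattern[:-2] + "$"
    return re.compile("^" + pattern)

# The minimal rule set suggested above.
disallows = ["/wp-content/plugins/", "/comments", "/*?s=", "/*&s=", "/search"]
patterns = [google_rule_to_regex(r) for r in disallows]

def is_blocked(path: str) -> bool:
    return any(p.match(path) for p in patterns)

print(is_blocked("/?s=widgets"))   # search-results URL: blocked
print(is_blocked("/about/"))       # ordinary page: allowed
```

Run against a sample of your own URLs, this makes it easy to confirm the search and comments URLs are caught while normal pages stay crawlable.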