Help needed with robots.txt regarding WordPress!
-
Here is my robots.txt from Google Webmaster Tools. These are the pages that are being blocked, and I am not sure which rules to remove in order to unblock the blog posts so they can be searched.
http://ensoplastics.com/theblog/?cat=743
http://ensoplastics.com/theblog/?p=240
These category pages and blog posts are blocked, so do I delete the Disallow: /? rule? I am new to SEO and web development, so I am not sure why the developer of this robots.txt file would block pages and posts in WordPress. It seems to me that the whole point of having a blog is so it can be searched and bring more exposure for SEO purposes.
Is there a reason I should block any pages contained in WordPress?
Sitemap: http://www.ensobottles.com/blog/sitemap.xml
User-agent: Googlebot
Disallow: /*/trackback
Disallow: /*/feed
Disallow: /*/comments
Disallow: /?
Disallow: /*?
Disallow: /page/
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /wp-content/themes/
Disallow: /trackback
Disallow: /comments
Disallow: /feed
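The rules above use Googlebot's wildcard syntax, where * matches any run of characters and a trailing $ anchors the end of the path. A quick way to see why a URL like /theblog/?p=240 gets blocked is to mimic that matching (a minimal sketch; rule_matches is a made-up helper, and note that Python's built-in urllib.robotparser does not handle these wildcards):

```python
import re

def rule_matches(pattern: str, path: str) -> bool:
    """Check whether a robots.txt Disallow pattern matches a URL path.

    Mimics Googlebot's matching: patterns are anchored at the start of
    the path, '*' matches any run of characters, and a trailing '$'
    anchors the end of the path.
    """
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    body = "".join(".*" if ch == "*" else re.escape(ch) for ch in pattern)
    return re.match(body + ("$" if anchored else ""), path) is not None

# The wildcard rule is what blocks the query-string URLs:
print(rule_matches("/*?", "/theblog/?p=240"))    # True  (blocked)
print(rule_matches("/?", "/theblog/?p=240"))     # False (only matches queries on the site root)
print(rule_matches("/*?", "/theblog/my-post/"))  # False (pretty permalinks pass)
```

So it is the Disallow: /*? line, not just Disallow: /?, that catches every ?p= and ?cat= URL on the blog.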
-
I've just looked at the home pages of the two sites, and they are pretty much the same apart from substituting "plastics" with "bottles". I'm not an expert, but I would have thought Google might treat this as duplicate content.
In my opinion I would concentrate on one of the sites, say plastics, and have the bottle-specific stuff as a subsection. I'm not sure how the sites rank, etc., so that may be easier said than done.
As for the sitemap/robots question, if you continue with two sites then I would recommend generating a new sitemap for the copied site.
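Any sitemap generator or WordPress sitemap plugin will do the job; under the hood the file is just an XML list of URLs. A minimal sketch of what gets produced (build_sitemap is a made-up helper, and the URL is just an example):

```python
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    """Build a minimal sitemap.xml document from a list of page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for page_url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = page_url
    return ET.tostring(urlset, encoding="unicode")

print(build_sitemap(["http://ensoplastics.com/theblog/"]))
```

Once generated, upload the file, point the Sitemap: line of the correct site's robots.txt at it, and submit it in Webmaster Tools.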
-
So basically this site was duplicated, and apparently the robots.txt file was duplicated along with it. There is no sitemap for the blog on the Enso Plastics site, so I am not sure how to proceed at this point. Should I just create a new robots.txt file for ensoplastics and replace this one? Or do I edit this one and go create a sitemap for my blog?
-
Well, that is a problem, isn't it? Like I said, I am new to a lot of this and I didn't develop either site; this robots.txt file is pointing to the wrong sitemap, so I am going to change that.
However, I am guessing I may also need to change some of the rules so that it stops blocking WordPress content.
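For what it's worth, a fairly common WordPress robots.txt keeps crawlers out of the admin and plugin directories without touching posts or category pages. Something along these lines (a sketch, not a drop-in file; the Sitemap path is an assumption and should point at whatever sitemap you actually generate for this blog):

```text
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /trackback
Disallow: /feed

Sitemap: http://ensoplastics.com/theblog/sitemap.xml
```

The key difference from the current file is dropping the Disallow: /? and Disallow: /*? lines, which are what block the ?p= and ?cat= URLs.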
-
I'm a bit confused. You reference ensoplastics.com at the top and then show the robots.txt from ensobottles.com.
Are they using the same robots.txt content? The sites use different URL naming: ensobottles uses rewritten permalinks, whereas the other site uses ?p= query strings.