Unable to submit sitemap in Google Webmaster Tools: error
-
I recently published a new EMD (exact-match domain) and installed WordPress on it. Then I installed the Yoast SEO plugin and an XML sitemap plugin. Now whenever I try to submit the sitemap, it shows "URL restricted by robots.txt", but my robots.txt contains only the lines below:
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
It still shows the same error. I deactivated the plugin and resubmitted the sitemap, but still no luck. Please help.
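For what it's worth, you can sanity-check those rules locally before blaming robots.txt. A minimal sketch using Python's standard-library parser, assuming the sitemap lives at /sitemap_index.xml (the rules are copied from the robots.txt above):

```python
# Check locally whether a robots.txt actually blocks a given path.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# With these rules, Googlebot is allowed to fetch the sitemap.
print(parser.can_fetch("Googlebot", "/sitemap_index.xml"))     # True
print(parser.can_fetch("Googlebot", "/wp-admin/options.php"))  # False
```

If this prints True for the sitemap URL, the "URL restricted by robots.txt" message is coming from something other than these rules (a cached robots.txt on Google's side, or a different response being served to bots).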
-
Now Google has accepted the sitemaps for tags and images, but it is showing another error in the post_sitemap.xml file. Check this screenshot: http://i47.tinypic.com/5lxlwj.jpg
-
I don't see a problem there. It looks like the sitemap is now being indexed, though (according to my spider). Try resubmitting.
-
Here is the .htaccess file:
# BEGIN W3TC Browser Cache
BrowserMatch ^Mozilla/4 gzip-only-text/html
BrowserMatch ^Mozilla/4.0[678] no-gzip
BrowserMatch \bMSIE !no-gzip !gzip-only-text/html
BrowserMatch \bMSI[E] !no-gzip !gzip-only-text/html
Header append Vary User-Agent env=!dont-vary
AddOutputFilterByType DEFLATE text/css application/x-javascript text/x-component text/html text/richtext image/svg+xml text/plain text/xsd text/xsl text/xml image/x-icon
<FilesMatch "\.(css|js|htc)$">
    FileETag None
    Header set X-Powered-By "W3 Total Cache/0.9.2.4"
</FilesMatch>
<FilesMatch "\.(html|htm|rtf|rtx|svg|svgz|txt|xsd|xsl|xml)$">
    FileETag None
    Header set X-Powered-By "W3 Total Cache/0.9.2.4"
</FilesMatch>
<FilesMatch "\.(asf|asx|wax|wmv|wmx|avi|bmp|class|divx|doc|docx|eot|exe|gif|gz|gzip|ico|jpg|jpeg|jpe|mdb|mid|midi|mov|qt|mp3|m4a|mp4|m4v|mpeg|mpg|mpe|mpp|otf|odb|odc|odf|odg|odp|ods|odt|ogg|pdf|png|pot|pps|ppt|pptx|ra|ram|svg|svgz|swf|tar|tif|tiff|ttf|ttc|wav|wma|wri|xla|xls|xlsx|xlt|xlw|zip)$">
    FileETag None
    Header set X-Powered-By "W3 Total Cache/0.9.2.4"
</FilesMatch>
# END W3TC Browser Cache
# BEGIN W3TC Page Cache core
RewriteEngine On
RewriteBase /
RewriteRule ^(.*/)?w3tc_rewrite_test$ $1?w3tc_rewrite_test=1 [L]
RewriteCond %{HTTPS} =on
RewriteRule .* - [E=W3TC_SSL:_ssl]
RewriteCond %{SERVER_PORT} =443
RewriteRule .* - [E=W3TC_SSL:_ssl]
RewriteCond %{HTTP:Accept-Encoding} gzip
RewriteRule .* - [E=W3TC_ENC:gzip]
RewriteCond %{REQUEST_METHOD} !=POST
RewriteCond %{QUERY_STRING} =""
RewriteCond %{HTTP_HOST} =www.freepapertextures.com
RewriteCond %{REQUEST_URI} /$ [OR]
RewriteCond %{REQUEST_URI} (sitemap(index)?.xml(.gz)?|[a-z0-9-]+-sitemap([0-9]+)?.xml(.gz)?) [NC]
RewriteCond %{REQUEST_URI} !(/wp-admin/|/xmlrpc.php|/wp-(app|cron|login|register|mail).php|/feed/|wp-.*.php|index.php) [NC,OR]
RewriteCond %{REQUEST_URI} (wp-comments-popup.php|wp-links-opml.php|wp-locations.php) [NC]
RewriteCond %{HTTP_COOKIE} !(comment_author|wp-postpass|wordpress[a-f0-9]+|wordpress_logged_in) [NC]
RewriteCond %{HTTP_USER_AGENT} !(W3\ Total\ Cache/0.9.2.4) [NC]
RewriteCond "%{DOCUMENT_ROOT}/wp-content/w3tc/pgcache/%{REQUEST_URI}/_index%{ENV:W3TC_UA}%{ENV:W3TC_REF}%{ENV:W3TC_SSL}.html%{ENV:W3TC_ENC}" -f
RewriteRule .* "/wp-content/w3tc/pgcache/%{REQUEST_URI}/_index%{ENV:W3TC_UA}%{ENV:W3TC_REF}%{ENV:W3TC_SSL}.html%{ENV:W3TC_ENC}" [L]
# END W3TC Page Cache core
# BEGIN WordPress
RewriteEngine On
RewriteBase /
RewriteRule ^index.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
# END WordPress
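One way to see whether sitemap requests are being routed into the W3TC page cache is to test paths against the sitemap pattern in the RewriteCond above. A rough sketch; the regex is transcribed from those rules, with the underscore in `(_index)?` restored from W3TC's standard rule, so treat that as an assumption:

```python
# Test which request paths match the W3TC sitemap rewrite pattern.
import re

sitemap_pattern = re.compile(
    r"(sitemap(_index)?\.xml(\.gz)?|[a-z0-9_-]+-sitemap([0-9]+)?\.xml(\.gz)?)",
    re.IGNORECASE,
)

for path in ["/sitemap_index.xml", "/post-sitemap.xml", "/robots.txt"]:
    print(path, bool(sitemap_pattern.search(path)))
# /sitemap_index.xml True
# /post-sitemap.xml True
# /robots.txt False
```

Paths that match can be served a stale cached copy of the sitemap, which is why clearing the W3 Total Cache matters here.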
-
Hmm... maybe you are blocking robots through .htaccess?
-
Still the same problem. I cleared all the caches through W3 Total Cache, but Google is still not detecting the sitemap. Grrr. Google has also indexed pages that clearly show the robots.txt problem: site:http://www.freepapertextures.com
PLEASE HELP
-
Clear your W3 Total Cache. Your robots.txt looks fine but my spider is picking the sitemap up as "noindex, follow" as well.
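A "noindex, follow" verdict like that can come either from an X-Robots-Tag HTTP header or from a meta robots tag in the page. A small illustrative sketch of how a crawler decides; the helper function and header dict are stand-ins, not part of any real library:

```python
# Detect a "noindex" directive from headers or a meta robots tag.
def is_noindexed(headers, html=""):
    """Return True if the response or page carries a noindex directive."""
    tag = headers.get("X-Robots-Tag", "").lower()
    if "noindex" in tag:
        return True
    # Very loose meta-tag check; a real crawler parses the HTML properly.
    return 'name="robots"' in html.lower() and "noindex" in html.lower()

headers = {"X-Robots-Tag": "noindex, follow"}  # what the spider reported
print(is_noindexed(headers))  # True
print(is_noindexed({}, '<meta name="robots" content="index, follow">'))  # False
```

If the sitemap URL itself is served with an X-Robots-Tag noindex header (some caching and SEO plugins add one), Google will report exactly this kind of rejection.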
-
I uninstalled the XML sitemap plugin. In Yoast that feature was already enabled, but I unticked and re-ticked it and clicked save, so it no longer shows a 404 error. However, when I submit the sitemap_index.xml link, Google is still unable to fetch it and shows the same problem. You can see the screenshot here: http://prntscr.com/g57h7
-
First I used the Yoast sitemap feature and this problem happened; then I installed the XML sitemap plugin.
Uninstall the XML sitemap plugin. The Yoast plugin has a built-in XML sitemap, which is even better.
Yoast sitemap looks like this: http://www.backlinkbuild.com/blog/sitemap_index.xml
Yours returns a 404: http://www.freepapertextures.com/sitemap_index.xml
Make sure you adjust the WP settings: SEO > XML Sitemaps > Check this box to enable XML sitemap functionality.
and submit this url: http://www.freepapertextures.com/sitemap_index.xml
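Once the index file stops returning a 404, you can confirm it is well-formed and points at the expected child sitemaps. A minimal sketch that parses a Yoast-style sitemap index; the XML below is a made-up example of the format, not the live file:

```python
# Parse a sitemap index and list the child sitemaps it references.
import xml.etree.ElementTree as ET

xml = """<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>http://www.freepapertextures.com/post-sitemap.xml</loc></sitemap>
  <sitemap><loc>http://www.freepapertextures.com/page-sitemap.xml</loc></sitemap>
</sitemapindex>"""

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(xml)
for loc in root.findall("sm:sitemap/sm:loc", ns):
    print(loc.text)
```

If the file parses and every listed child sitemap returns a 200, the remaining blocker is on Google's side (stale robots.txt cache) rather than in the file itself.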
-
Yes. First I used the Yoast sitemap feature and this problem happened; then I installed the XML sitemap plugin and got the same problem again. The settings are correct. You can check my domain's robots.txt here: http://freepapertextures.com/robots.txt
-
Are those two separate plugins? I know Yoast has a sitemap feature built in, so you don't need another sitemap plugin.
I've never had any issues with it. Make sure your blog is set to 'Allow search engines to index this site' (Settings > Privacy). Reset your cache (if you have one).
Related Questions
-
Sitemap Question (aspx, XML, HTML)
Hey everyone! My company uses a tool called SEOQuake. We are trying to hit all of their "checkmarks" when we run a diagnosis for them. One of the only things we can not figure out how to pass is their section for Site Compliance ---> XML Sitemaps. Our client's websites that we have built are all using .aspx URL structures, and when I view them, it clearly states that it is an XML file. It has this text written at the top of the .aspx page: "This XML file does not appear to have any style information associated with it. The document tree is shown below." Does anyone know what is happening here?
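For context, that browser message simply means the XML document carries no xml-stylesheet processing instruction; it is still a perfectly valid sitemap, and crawlers ignore styling entirely. A sketch showing that the document parses identically with or without the instruction (the sitemap.xsl filename is hypothetical):

```python
# An XML sitemap with an optional stylesheet processing instruction.
import xml.etree.ElementTree as ET

doc = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<?xml-stylesheet type="text/xsl" href="sitemap.xsl"?>\n'  # hypothetical XSL
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
    '<url><loc>http://example.com/</loc></url></urlset>'
)

# Parsers skip the stylesheet PI; the sitemap is valid either way.
root = ET.fromstring(doc)
print(root.tag)  # {http://www.sitemaps.org/schemas/sitemap/0.9}urlset
```

So an .aspx endpoint that emits this XML with a Content-Type of text/xml or application/xml should satisfy a sitemap checker; the "no style information" note is cosmetic.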
Web Design | | TaylorRHawkins
Thank you!
-
Sitemap Wordpress
My sitemap in WordPress shows up like this: http://arowautorental.com/sitemap/ but I have parent pages on the website and I don't see them in the sitemap. How can I fix this?
Web Design | | dawgroup
-
404 error on phone numbers
Hi, I'm receiving a 404 error on my callto: phone number links and wondered if there's a way to fix the problem. We've not experienced it before, so I'm not sure if it's something to do with the crawl? Any help massively appreciated! Thanks, Anne
Web Design | | SeeGreen
-
4XX (Client Error) on Wordpress Wesbite
I've just taken over the management of a website and am getting four 4XX (client error) issues. Example: http://inter-italia.com/en/wp-login.php?action=lostpassword Can anyone give any guidance on how to fix this in WordPress? I also see a lot of 'temporary redirects' due to a multilingual plugin; is there anything I can do to fix this?
Web Design | | skehoe
-
How to correct error in customized posttype WP site
Hi folks. Can anybody help me? I foolishly, doggedly followed a Lynda.com tutorial for developing an 'online portfolio in WP'. Little did I know that my initial assumption, to use the 'Twenty Twelve' rather than the 'Twenty Eleven' theme, would land me in such deep water. I was attempting to learn PHP on my own. All went well until the index page for the customized post type. Now I have two beautiful custom post types, 'companies' and 'coverage', and no idea how to create an index page for either. I can't do the next step! I have tried every permutation: changing the permalink settings, changing them back, desperately searching for any handle to the nebulous links within the menu section. The only thing I can do (and have done for now) is to link the menu item 'company' and the menu item 'coverage' to a single post, so the poor visitor has to scroll through the posts individually. I tried contacting the tutor and Lynda.com, to no avail! I have searched forums and found this is a common problem, but because I am so confused and new to PHP, they might as well be speaking Chinese. To compound my problems, looking through 'WordPress SEO' by Yoast, I am painfully aware I can't take the first basic step and set the permalinks to 'Post name', as that just makes my flaky menu collapse like a pack of cards. Help!
Web Design | | catherine-279388
-
Redirects (301/302) versus errors (404)
I am not able to decide convincingly between using redirects and using 404 errors. People are giving varied opinions. Here are my cases:
1. Coding errors: we put out a bad link.
a. Some people say redirect to the home page; the user at least has something to do and, more importantly, it does not hurt your SEO ranking.
b. Counter: the page isn't there, so return a 404.
2. Product removed: link1 to product1 was out there. We removed product1, so link1 is also gone. It is either sitting in people's bookmarks, or, because of coding errors, still hanging around in some places on our site.
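The trade-off in case 2 can be written down as a simple rule: 301 only when a genuinely equivalent replacement exists, otherwise return 404 or 410 rather than dumping everything on the home page. A sketch with a made-up URL mapping:

```python
# Decide a response for a requested path: 200, 301, or 410.
REPLACEMENTS = {
    "/product1": "/products/product1-v2",  # hypothetical direct successor
}

def status_for(path, live_paths):
    """Return (status_code, target) for a requested path."""
    if path in live_paths:
        return 200, path
    if path in REPLACEMENTS:
        return 301, REPLACEMENTS[path]  # real equivalent exists: redirect
    return 410, None  # gone for good; 404 also works if history is unknown

print(status_for("/product1", set()))   # (301, '/products/product1-v2')
print(status_for("/bad-link", set()))   # (410, None)
```

Blanket redirects of dead URLs to the home page are widely treated as soft 404s, so the honest status code is usually the safer choice.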
Web Design | | proptiger
-
Why is there no sitemap.xml for SEOmoz?
I noticed that SEOmoz does not have a "root" sitemap called sitemap.xml. On the other hand, there do appear to be sitemaps for various sections of the site, such as http://www.seomoz.org/blog-sitemap.xml I was planning on having a root-level sitemap that referenced different sections of my site (blog, support, etc.), but I'm a little concerned that this site itself doesn't seem to be following that practice. Presumably this website is submitting the individual section maps to Google directly, since they aren't linkable through sitemap.xml?
Web Design | | schof
-
Getting tons of duplicate content and title errors on my asp.net shopping cart, is there way to resolve this?
The problem I am having is that web crawlers see all my category pages as the same page, creating duplicate content and duplicate title errors. At this time I have 270 of these critical errors to deal with. Here is an example: http://www.baysidejewelry.com/category/1-necklaces.aspx http://www.baysidejewelry.com/category/1-necklaces.aspx?pageindex=1 http://www.baysidejewelry.com/category/1-necklaces.aspx?pageindex=2 http://www.baysidejewelry.com/category/1-necklaces.aspx?pageindex=3 http://www.baysidejewelry.com/category/1-necklaces.aspx?pageindex=4 All of these pages are seen as the exact same page by the crawlers. Because these pages are generated from a SQL database, I don't know of a way to fix it.
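One common mitigation is a rel="canonical" link pointing the ?pageindex variants at the base category URL (whether canonicalizing paginated pages is right for a given site is debatable; some prefer self-referencing canonicals plus unique per-page titles). A sketch of the URL normalization only, not a definitive fix:

```python
# Normalize paginated category URLs to a single canonical URL.
from urllib.parse import urlsplit, parse_qs

def canonical_for(url):
    parts = urlsplit(url)
    qs = parse_qs(parts.query)
    # Treat ?pageindex=N as a variant of the base category page.
    if "pageindex" in qs:
        return f"{parts.scheme}://{parts.netloc}{parts.path}"
    return url

u = "http://www.baysidejewelry.com/category/1-necklaces.aspx?pageindex=2"
print(canonical_for(u))
# http://www.baysidejewelry.com/category/1-necklaces.aspx
```

In ASP.NET the equivalent would be emitting a `<link rel="canonical" href="...">` tag in the category page template, computed from the request URL with the pageindex parameter stripped.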
Web Design | | bsj2002