Unable to submit sitemap in Google Webmaster Tools: error
-
I recently published a new EMD (exact-match domain) and installed WordPress on it. I then installed the Yoast SEO plugin and an XML Sitemap plugin. Whenever I try to submit the sitemap it shows "URL restricted by robots.txt", but my robots.txt only contains the lines below:
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/

It still shows the same error. I deactivated the plugin and resubmitted the sitemap, but still no luck. Please help.
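As a sanity check, a robots.txt like the one above can be tested programmatically to confirm it does not actually block the sitemap URL. This is a minimal sketch using Python's standard urllib.robotparser; the domain and the sitemap path (sitemap_index.xml) are placeholders, not necessarily the poster's exact setup:

```python
import urllib.robotparser

# The robots.txt content quoted above.
ROBOTS_TXT = """\
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Googlebot falls under "User-agent: *" here, so only the two
# disallowed directories should be blocked.
print(rp.can_fetch("Googlebot", "http://example.com/sitemap_index.xml"))  # True
print(rp.can_fetch("Googlebot", "http://example.com/wp-admin/"))          # False
```

If this prints True for the sitemap URL, the "URL restricted by robots.txt" message is coming from something other than robots.txt itself (for example, a cached copy Google fetched earlier).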
-
Now Google has accepted the sitemaps for tags and images, but it is showing another error in the post_sitemap.xml file. Check this screenshot: http://i47.tinypic.com/5lxlwj.jpg
-
I don't see a problem there. It looks like the sitemap is now being indexed, though (according to my spider). Try resubmitting.
-
Here is the .htaccess file:
# BEGIN W3TC Browser Cache
BrowserMatch ^Mozilla/4 gzip-only-text/html
BrowserMatch ^Mozilla/4.0[678] no-gzip
BrowserMatch \bMSIE !no-gzip !gzip-only-text/html
BrowserMatch \bMSI[E] !no-gzip !gzip-only-text/html
Header append Vary User-Agent env=!dont-vary
AddOutputFilterByType DEFLATE text/css application/x-javascript text/x-component text/html text/richtext image/svg+xml text/plain text/xsd text/xsl text/xml image/x-icon
<FilesMatch "\.(css|js|htc)$">
    FileETag None
    Header set X-Powered-By "W3 Total Cache/0.9.2.4"
</FilesMatch>
<FilesMatch "\.(html|htm|rtf|rtx|svg|svgz|txt|xsd|xsl|xml)$">
    FileETag None
    Header set X-Powered-By "W3 Total Cache/0.9.2.4"
</FilesMatch>
<FilesMatch "\.(asf|asx|wax|wmv|wmx|avi|bmp|class|divx|doc|docx|eot|exe|gif|gz|gzip|ico|jpg|jpeg|jpe|mdb|mid|midi|mov|qt|mp3|m4a|mp4|m4v|mpeg|mpg|mpe|mpp|otf|odb|odc|odf|odg|odp|ods|odt|ogg|pdf|png|pot|pps|ppt|pptx|ra|ram|svg|svgz|swf|tar|tif|tiff|ttf|ttc|wav|wma|wri|xla|xls|xlsx|xlt|xlw|zip)$">
    FileETag None
    Header set X-Powered-By "W3 Total Cache/0.9.2.4"
</FilesMatch>
# END W3TC Browser Cache

# BEGIN W3TC Page Cache core
RewriteEngine On
RewriteBase /
RewriteRule ^(.*/)?w3tc_rewrite_test$ $1?w3tc_rewrite_test=1 [L]
RewriteCond %{HTTPS} =on
RewriteRule .* - [E=W3TC_SSL:_ssl]
RewriteCond %{SERVER_PORT} =443
RewriteRule .* - [E=W3TC_SSL:_ssl]
RewriteCond %{HTTP:Accept-Encoding} gzip
RewriteRule .* - [E=W3TC_ENC:gzip]
RewriteCond %{REQUEST_METHOD} !=POST
RewriteCond %{QUERY_STRING} =""
RewriteCond %{HTTP_HOST} =www.freepapertextures.com
RewriteCond %{REQUEST_URI} /$ [OR]
RewriteCond %{REQUEST_URI} (sitemap(index)?.xml(.gz)?|[a-z0-9-]+-sitemap([0-9]+)?.xml(.gz)?) [NC]
RewriteCond %{REQUEST_URI} !(/wp-admin/|/xmlrpc.php|/wp-(app|cron|login|register|mail).php|/feed/|wp-.*.php|index.php) [NC,OR]
RewriteCond %{REQUEST_URI} (wp-comments-popup.php|wp-links-opml.php|wp-locations.php) [NC]
RewriteCond %{HTTP_COOKIE} !(comment_author|wp-postpass|wordpress[a-f0-9]+|wordpress_logged_in) [NC]
RewriteCond %{HTTP_USER_AGENT} !(W3\ Total\ Cache/0.9.2.4) [NC]
RewriteCond "%{DOCUMENT_ROOT}/wp-content/w3tc/pgcache/%{REQUEST_URI}/_index%{ENV:W3TC_UA}%{ENV:W3TC_REF}%{ENV:W3TC_SSL}.html%{ENV:W3TC_ENC}" -f
RewriteRule .* "/wp-content/w3tc/pgcache/%{REQUEST_URI}/_index%{ENV:W3TC_UA}%{ENV:W3TC_REF}%{ENV:W3TC_SSL}.html%{ENV:W3TC_ENC}" [L]
# END W3TC Page Cache core

# BEGIN WordPress
RewriteEngine On
RewriteBase /
RewriteRule ^index.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
# END WordPress
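Note that the page-cache rules above explicitly include sitemap URLs, so Google may be fetching a stale cached copy of the sitemap. The chain of RewriteCond lines can be paraphrased roughly in Python to see when a cached copy would be served. This is a loose sketch, not W3TC's actual logic; in particular the sitemap pattern below assumes an underscore (as in Yoast's sitemap_index.xml), which the original regex may or may not use:

```python
import re

def serves_cached_copy(uri, method="GET", query="", cookies=""):
    """Rough paraphrase of the W3TC page-cache RewriteCond chain:
    a request gets a cached copy only if all of these conditions hold."""
    # POST requests and query strings always bypass the cache.
    if method == "POST" or query != "":
        return False
    # Only trailing-slash pages OR sitemap files are cached.
    is_page = uri.endswith("/")
    is_sitemap = re.search(r"sitemap(_index)?\.xml(\.gz)?$", uri) is not None
    if not (is_page or is_sitemap):
        return False
    # Logged-in users, commenters, etc. bypass the cache.
    if re.search(r"(comment_author|wp-postpass|wordpress_logged_in)", cookies):
        return False
    return True

print(serves_cached_copy("/sitemap_index.xml"))   # True: the sitemap itself is cached
print(serves_cached_copy("/page.html"))           # False: not a page or sitemap
```

The takeaway: if the sitemap was generated while robots.txt was misconfigured, the cache can keep serving that old version until it is purged.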
-
Hmm... maybe you are blocking robots through .htaccess?
-
Still the same problem. I cleared the entire cache through W3 Total Cache, but Google is still not detecting the sitemap. Google has also indexed pages that clearly show the robots problem; see site:http://www.freepapertextures.com
PLEASE HELP
-
Clear your W3 Total Cache. Your robots.txt looks fine, but my spider is picking up the sitemap as "noindex, follow" as well.
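A "noindex" flag on a sitemap usually comes from an X-Robots-Tag HTTP header (or a stale cached response) rather than from robots.txt. As a rough illustration of what a spider checks, here is a hypothetical helper (not part of any plugin) that inspects response headers for that directive:

```python
def has_noindex(headers):
    """Return True if an X-Robots-Tag header contains a 'noindex' directive."""
    # HTTP header names are case-insensitive; directives are comma-separated.
    for name, value in headers.items():
        if name.lower() == "x-robots-tag":
            directives = [d.strip().lower() for d in value.split(",")]
            if "noindex" in directives:
                return True
    return False

# Example headers: a cached response tagged noindex vs. a clean one.
cached = {"Content-Type": "text/xml", "X-Robots-Tag": "noindex, follow"}
fresh = {"Content-Type": "text/xml"}

print(has_noindex(cached))  # True
print(has_noindex(fresh))   # False
```

You can see the real headers with your browser's developer tools or a HEAD request; if X-Robots-Tag carries "noindex", purging the cache and regenerating the sitemap is the place to start.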
-
I uninstalled the XML Sitemap plugin. In Yoast that feature was already enabled, but I unticked and re-ticked it and clicked Save, so it no longer shows a 404 error. However, when I submit the sitemap_index.xml link, Google is still unable to detect it and shows the same problem. You can see the screenshot here: http://prntscr.com/g57h7
-
First I used the Yoast sitemap feature, and this problem happened; then I installed the XML sitemap plugin.
Uninstall the XML Sitemap plugin; the Yoast plugin has a built-in XML sitemap, which is even better.
The Yoast sitemap looks like this: http://www.backlinkbuild.com/blog/sitemap_index.xml
Yours returns a 404: http://www.freepapertextures.com/sitemap_index.xml
Make sure you adjust the WP settings: SEO > XML Sitemaps > Check this box to enable XML sitemap functionality.
Then submit this URL: http://www.freepapertextures.com/sitemap_index.xml
-
Yes. First I used the Yoast sitemap feature and this problem happened; then I installed the XML sitemap plugin and got the same problem again. The setting is correct. You can check my domain's robots.txt here: http://freepapertextures.com/robots.txt
-
Are those two separate plugins? Yoast has a sitemap feature built in, so you don't need another sitemap plugin.
I've never had any issues with it. Make sure your blog is set to 'Allow search engines to index this site' (Settings > Privacy). Reset your cache (if you have one).