Unable to submit sitemap in Google Webmaster Tools: "URL restricted by robots.txt"
-
I recently published a new EMD and installed WordPress on it. I then installed the Yoast SEO plugin and an XML Sitemap plugin. Whenever I try to submit my sitemap, it shows "URL restricted by robots.txt", but my robots.txt contains only the lines below:
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/

Still it shows the same error. I deactivated the plugin and resubmitted the sitemap, but no luck. Please help.
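Those rules can be sanity-checked with Python's standard-library robots parser. A quick sketch, using the robots.txt lines quoted above and this thread's example domain:

```python
# Parse the robots.txt quoted above and check whether Googlebot may fetch
# the sitemap URL. With only /cgi-bin/ and /wp-admin/ disallowed, the
# sitemap is not restricted.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# No more specific group exists, so the * rules apply to Googlebot too.
print(parser.can_fetch("Googlebot", "http://www.freepapertextures.com/sitemap_index.xml"))  # True
print(parser.can_fetch("Googlebot", "http://www.freepapertextures.com/wp-admin/"))          # False
```

If this prints True for the sitemap URL, the "restricted by robots.txt" message is coming from somewhere other than the live robots.txt, e.g. a cached copy.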
-
Google has now accepted the sitemaps for tags and images, but it is showing another error in the post_sitemap.xml file. Check this screenshot: http://i47.tinypic.com/5lxlwj.jpg
-
I don't see a problem there. It looks like the sitemap is now being indexed (according to my spider). Try resubmitting.
-
Here is the .htaccess file:
# BEGIN W3TC Browser Cache
BrowserMatch ^Mozilla/4 gzip-only-text/html
BrowserMatch ^Mozilla/4\.0[678] no-gzip
BrowserMatch \bMSIE !no-gzip !gzip-only-text/html
BrowserMatch \bMSI[E] !no-gzip !gzip-only-text/html
Header append Vary User-Agent env=!dont-vary
AddOutputFilterByType DEFLATE text/css application/x-javascript text/x-component text/html text/richtext image/svg+xml text/plain text/xsd text/xsl text/xml image/x-icon
<FilesMatch "\.(css|js|htc)$">
FileETag None
Header set X-Powered-By "W3 Total Cache/0.9.2.4"
</FilesMatch>
<FilesMatch "\.(html|htm|rtf|rtx|svg|svgz|txt|xsd|xsl|xml)$">
FileETag None
Header set X-Powered-By "W3 Total Cache/0.9.2.4"
</FilesMatch>
<FilesMatch "\.(asf|asx|wax|wmv|wmx|avi|bmp|class|divx|doc|docx|eot|exe|gif|gz|gzip|ico|jpg|jpeg|jpe|mdb|mid|midi|mov|qt|mp3|m4a|mp4|m4v|mpeg|mpg|mpe|mpp|otf|odb|odc|odf|odg|odp|ods|odt|ogg|pdf|png|pot|pps|ppt|pptx|ra|ram|svg|svgz|swf|tar|tif|tiff|ttf|ttc|wav|wma|wri|xla|xls|xlsx|xlt|xlw|zip)$">
FileETag None
Header set X-Powered-By "W3 Total Cache/0.9.2.4"
</FilesMatch>
# END W3TC Browser Cache

# BEGIN W3TC Page Cache core
RewriteEngine On
RewriteBase /
RewriteRule ^(.*/)?w3tc_rewrite_test$ $1?w3tc_rewrite_test=1 [L]
RewriteCond %{HTTPS} =on
RewriteRule .* - [E=W3TC_SSL:_ssl]
RewriteCond %{SERVER_PORT} =443
RewriteRule .* - [E=W3TC_SSL:_ssl]
RewriteCond %{HTTP:Accept-Encoding} gzip
RewriteRule .* - [E=W3TC_ENC:_gzip]
RewriteCond %{REQUEST_METHOD} !=POST
RewriteCond %{QUERY_STRING} =""
RewriteCond %{HTTP_HOST} =www.freepapertextures.com
RewriteCond %{REQUEST_URI} \/$ [OR]
RewriteCond %{REQUEST_URI} (sitemap(_index)?\.xml(\.gz)?|[a-z0-9_-]+-sitemap([0-9]+)?\.xml(\.gz)?) [NC]
RewriteCond %{REQUEST_URI} !(\/wp-admin\/|\/xmlrpc\.php|\/wp-(app|cron|login|register|mail)\.php|\/feed\/|wp-.*\.php|index\.php) [NC,OR]
RewriteCond %{REQUEST_URI} (wp-comments-popup\.php|wp-links-opml\.php|wp-locations\.php) [NC]
RewriteCond %{HTTP_COOKIE} !(comment_author|wp-postpass|wordpress_[a-f0-9]+|wordpress_logged_in) [NC]
RewriteCond %{HTTP_USER_AGENT} !(W3\ Total\ Cache/0.9.2.4) [NC]
RewriteCond "%{DOCUMENT_ROOT}/wp-content/w3tc/pgcache/%{REQUEST_URI}/_index%{ENV:W3TC_UA}%{ENV:W3TC_REF}%{ENV:W3TC_SSL}.html%{ENV:W3TC_ENC}" -f
RewriteRule .* "/wp-content/w3tc/pgcache/%{REQUEST_URI}/_index%{ENV:W3TC_UA}%{ENV:W3TC_REF}%{ENV:W3TC_SSL}.html%{ENV:W3TC_ENC}" [L]
# END W3TC Page Cache core

# BEGIN WordPress
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
# END WordPress
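For what it's worth, the W3TC page-cache rules above include a RewriteCond that matches sitemap URLs, so the cache can serve them too. The pattern below mirrors that condition with Python's re module (a sketch, assuming the standard Yoast/W3TC sitemap pattern, since the posted file is garbled):

```python
# Mirror of the sitemap RewriteCond from the .htaccess above, for checking
# which request URIs the page-cache rule treats as sitemaps.
import re

SITEMAP_PATTERN = re.compile(
    r"(sitemap(_index)?\.xml(\.gz)?|[a-z0-9_-]+-sitemap([0-9]+)?\.xml(\.gz)?)",
    re.IGNORECASE,  # the rule carries the [NC] (no-case) flag
)

for uri in ("/sitemap_index.xml", "/post-sitemap.xml", "/about/"):
    print(uri, bool(SITEMAP_PATTERN.search(uri)))
```

Both /sitemap_index.xml and the per-type sitemaps like /post-sitemap.xml match, so the cache rules themselves are not excluding the sitemap; a stale cached copy is the more likely culprit.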
-
Hmm... maybe you are blocking robots through .htaccess?
-
Still the same problem. I cleared all caches through W3 Total Cache, but Google is still not detecting the sitemap. Grrrr... Google has also indexed pages from the site, which clearly shows the robots problem: site:http://www.freepapertextures.com
PLEASE HELP
-
Clear your W3 Total Cache. Your robots.txt looks fine, but my spider is picking up the sitemap as "noindex, follow" as well.
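A "noindex, follow" result like that can come from a robots meta tag or from an X-Robots-Tag HTTP header on the cached copy. A simplified sketch of checking a header value for it (the helper is hypothetical, and it ignores the optional user-agent prefix real headers can carry):

```python
# Detect a noindex directive in an X-Robots-Tag header value.
# Per Google's robots documentation, "none" is shorthand for
# "noindex, nofollow", so it is treated as noindex here too.
from typing import Optional

def has_noindex(x_robots_tag: Optional[str]) -> bool:
    if not x_robots_tag:
        return False
    directives = {d.strip().lower() for d in x_robots_tag.split(",")}
    return "noindex" in directives or "none" in directives

print(has_noindex("noindex, follow"))  # True
print(has_noindex("index, follow"))    # False
```

Running this against the header returned for sitemap_index.xml (e.g. via curl -I) would show whether the cached response is what's telling spiders not to index it.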
-
I uninstalled the XML sitemap plugin. In Yoast that feature was already enabled, but I unticked and re-ticked it and clicked Save, so it no longer shows a 404 error. Second, when I submit the sitemap_index.xml link, Google is still unable to detect it and shows the same problem. You can see the screenshot here: http://prntscr.com/g57h7
-
Uninstall the XML sitemap plugin; the Yoast plugin has a built-in XML sitemap, which is even better.
Yoast sitemap looks like this: http://www.backlinkbuild.com/blog/sitemap_index.xml
Yours returns a 404: http://www.freepapertextures.com/sitemap_index.xml
Make sure you adjust the WP settings: SEO > XML Sitemaps > Check this box to enable XML sitemap functionality.
and submit this url: http://www.freepapertextures.com/sitemap_index.xml
-
Yes. First I used the Yoast sitemap feature and this problem happened; then I installed the XML sitemap plugin and got the same problem again. The settings are correct. You can check my domain's robots.txt here: http://freepapertextures.com/robots.txt
-
Are those two separate plugins? I know Yoast has a sitemap feature built in, so you don't need another sitemap plugin.
I've never had any issues with it. Make sure your blog is set to 'Allow search engines to index this site' (Settings > Privacy). Reset your cache (if you have one).
Related Questions
-
Is the User Sitemap dead?
There's a discussion going on in our office about sitemaps. I thought it'd be a good idea to get the Moz community's thoughts on it, too. What are your thoughts? Is the User Sitemap still an effective tool to utilize?
Web Design | TaylorRHawkins1
-
Responsive Site has "Not Found" Errors for mobile/ and m/ in Google Search Console
We have recently launched a new responsive website for a client and have noticed two "Not Found" errors within Google Search Console, for /mobile and /m. Neither of these URLs is linked from anywhere within the site, yet Google is reporting them as being linked from the homepage. This is not the first site on which we have seen this error; however, the other site was not mobile-friendly. My thought is to 301 them back to the homepage. Does anybody else have any thoughts on this, or have you recently received the same errors?
Web Design | JustinTaylor881
-
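A minimal sketch of the 301 fix proposed in the question above, assuming an Apache setup with mod_rewrite like the .htaccess earlier in this thread (the rules and paths are illustrative, not from the poster's actual config):

```apache
# Hypothetical .htaccess rules: permanently redirect the phantom
# /mobile and /m URLs to the homepage so Search Console stops
# reporting them as Not Found.
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteRule ^(mobile|m)/?$ / [R=301,L]
</IfModule>
```

A 301 passes any equity those phantom URLs have accumulated back to the homepage, which is why it is usually preferred over leaving them as 404s.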
Site structure and Visual Sitemaps
Aside from mind-mapping software, are there any recommended tools to build a visual sitemap of the internal linking structure of a URL? I've been trying to 'show' clients the structure of a website as it pertains to internal and external links. Here is one I've tried that's close: http://site-visualizer.com/. I've been using the Excel export function, importing into MindMeister, and building it; it's a teeny bit time-consuming for large websites. Site structure, I feel, is a valuable portion of SEO, and a down-and-dirty visual explanation would be great. Don't get me wrong, it offers other benefits as well; it's just that I'd like to free up the time it takes. Thank you in advance. Screenshots are available on the website of the organization.
Web Design | TammyWood0
-
Sitemap Update Frequency?
Hello, my question today is regarding sitemaps. I'm often confused by this, and because I am a bit obsessive I believe I may be giving myself more work than needed. Basically my question is: do I need to update and/or regenerate my sitemap every time I make a change to the site? I must have to if I add a page, correct? And in Google's Webmaster Tools, do I just delete the current sitemap and re-upload a new one for Google to crawl? Is it possible to overdo this? Any sitemap suggestions would be fantastic. I feel like there have been a few weeks where I've updated the sitemap daily and re-submitted it, and I worry that might be hurting my site. Thanks!
Web Design | jesse-landry0
-
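On the mechanics behind the question above: sitemap plugins such as Yoast regenerate their XML on the fly, so no manual re-upload is needed, and a sitemap already submitted in Webmaster Tools is re-crawled periodically without being deleted and re-added. For a hand-rolled site, regeneration is a small script; a minimal sketch with placeholder URLs and dates:

```python
# Rebuild a sitemap.xml after content changes. Re-running this (e.g. from
# a deploy hook) is enough; the submitted sitemap URL in Webmaster Tools
# stays the same and gets re-crawled.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """urls: iterable of (loc, lastmod) pairs -> sitemap XML string."""
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc, lastmod in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode", xml_declaration=True)

print(build_sitemap([("http://www.example.com/", "2012-07-01")]))
```

Writing the returned string to sitemap.xml on each publish keeps the file current without touching the Webmaster Tools submission.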
Unable to set preferred domain, can I verify a site that's already redirected?
I'm in the process of trying to set a preferred domain in Webmaster Tools, to set our www version as preferred vs. the non-www version. IT is already redirecting non-www to www, but I get this message when trying to change the setting: "Part of the process of setting a preferred domain is to verify that you own http://mnn.com/. Please verify http://mnn.com/." While we own the domain, I am not sure how we can have Google access a file at http://mnn.com/some_file when we are forwarding all requests for the non-www domain to our www site.
Note: The Apache rewrite predates me and I'm not sure how/why we have two domains set up, but I'm trying to fix the preferred domain now. Am I able to verify the non-www version once the redirect is in place? Any ideas? Help? Thanks! Lisa
Web Design | Aggie
-
Which Content Causes Duplicate Content Errors
My Duplicate Content list starts off with this URL: http://www.nebraskamed.com/about-us/branding/bellevue-medical-center-logo Then it lists the five below as duplicate content: http://www.nebraskamed.com/about-us/branding/fonts http://www.nebraskamed.com/about-us/branding/clear-zone http://www.nebraskamed.com/about-us/social-media http://www.nebraskamed.com/about-us/branding/order-stationery http://www.nebraskamed.com/about-us/branding/logo I do notice that most of these pages have images and/or little or no content outside of our site's template. Is this causing SEOmoz to see them as duplicates? Should I use noindex, follow to fix this? This error is happening with branding pages, so noindex is an option. What should I do if that's not an option? Should I change our mega menus to be Ajax-driven so the links aren't showing up in the code of every page?
Web Design | Patrick_at_Nebraska_Medicine0
-
Is it common to have some errors/warnings (currency duplicates, redirects, etc.) on websites that rank well?
Hi, could anybody give me some ideas on on-page optimisation? Currently in my campaign I have around 3,000+ errors, 14,000+ warnings, and 7,000+ notices for the following reasons:
Overly-Dynamic URL
Temporary Redirect
Title Element Too Long (> 70 Characters)
Duplicate Page Title
etc. First of all, I know these have a negative effect on SEO. Fixing those issues involves a lot of work and time. At the same time, most of our important keyword/URL rankings have not changed over the last 12 months. Does that mean the above has only a limited negative effect? I just want to know whether it is worth investing the man-hours/money to clean up those issues, as it involves decent development time. Is it common to have some errors/warnings on websites that rank well? (E.g., I've seen many big websites with duplicate titles/meta descriptions on their currency-variant pages.)
Web Design | LauraHT
Are HTML sitemaps still in use today?
I'm trying to help a client understand the importance of having a well-organized HTML sitemap as a method of helping usability. As part of this process, I spent some time searching for good examples of well-organized HTML sitemaps, and found that many sites don't offer one (including SEOmoz). I'm wondering if webmasters and/or SEOs think they aren't valuable any longer?
Web Design | EricVallee340