Unable to submit sitemap in Google Webmaster Tools: error
-
I recently published a new EMD (exact-match domain) and installed WP on it. I then installed the Yoast SEO plugin and an XML sitemap plugin. Now whenever I try to submit the sitemap, it shows "URL restricted by robots.txt", but you can see my robots.txt below:
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
But it still shows the same error. I deactivated the plugin and resubmitted the sitemap, but still no luck. Please help.
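For what it's worth, you can sanity-check those rules offline with Python's standard-library robot parser; assuming the three lines above really are the whole file, nothing in them restricts a sitemap at the site root:

```python
from urllib import robotparser

# The robots.txt rules quoted above (assumed to be the complete file).
rules = """\
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# A root-level sitemap is not covered by either Disallow line.
print(rp.can_fetch("*", "http://www.freepapertextures.com/sitemap_index.xml"))  # True
print(rp.can_fetch("*", "http://www.freepapertextures.com/wp-admin/"))          # False
```

If this prints True but Google still reports "URL restricted by robots.txt", then the file Google is fetching is probably not the file shown above (a cached or plugin-generated virtual robots.txt, for example).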
-
Google has now accepted the sitemaps for tags and images, but it is showing another error in the post_sitemap.xml file. Check this screenshot: http://i47.tinypic.com/5lxlwj.jpg
-
I don't see a problem there. It looks like the sitemap is now being indexed, though (according to my spider). Try resubmitting.
-
Here is the .htaccess file
# BEGIN W3TC Browser Cache
BrowserMatch ^Mozilla/4 gzip-only-text/html
BrowserMatch ^Mozilla/4.0[678] no-gzip
BrowserMatch \bMSIE !no-gzip !gzip-only-text/html
BrowserMatch \bMSI[E] !no-gzip !gzip-only-text/html
Header append Vary User-Agent env=!dont-vary
AddOutputFilterByType DEFLATE text/css application/x-javascript text/x-component text/html text/richtext image/svg+xml text/plain text/xsd text/xsl text/xml image/x-icon
<FilesMatch "\.(css|js|htc)$">
    FileETag None
    Header set X-Powered-By "W3 Total Cache/0.9.2.4"
</FilesMatch>
<FilesMatch "\.(html|htm|rtf|rtx|svg|svgz|txt|xsd|xsl|xml)$">
    FileETag None
    Header set X-Powered-By "W3 Total Cache/0.9.2.4"
</FilesMatch>
<FilesMatch "\.(asf|asx|wax|wmv|wmx|avi|bmp|class|divx|doc|docx|eot|exe|gif|gz|gzip|ico|jpg|jpeg|jpe|mdb|mid|midi|mov|qt|mp3|m4a|mp4|m4v|mpeg|mpg|mpe|mpp|otf|odb|odc|odf|odg|odp|ods|odt|ogg|pdf|png|pot|pps|ppt|pptx|ra|ram|svg|svgz|swf|tar|tif|tiff|ttf|ttc|wav|wma|wri|xla|xls|xlsx|xlt|xlw|zip)$">
    FileETag None
    Header set X-Powered-By "W3 Total Cache/0.9.2.4"
</FilesMatch>
# END W3TC Browser Cache

# BEGIN W3TC Page Cache core
RewriteEngine On
RewriteBase /
RewriteRule ^(.*/)?w3tc_rewrite_test$ $1?w3tc_rewrite_test=1 [L]
RewriteCond %{HTTPS} =on
RewriteRule .* - [E=W3TC_SSL:_ssl]
RewriteCond %{SERVER_PORT} =443
RewriteRule .* - [E=W3TC_SSL:_ssl]
RewriteCond %{HTTP:Accept-Encoding} gzip
RewriteRule .* - [E=W3TC_ENC:gzip]
RewriteCond %{REQUEST_METHOD} !=POST
RewriteCond %{QUERY_STRING} =""
RewriteCond %{HTTP_HOST} =www.freepapertextures.com
RewriteCond %{REQUEST_URI} /$ [OR]
RewriteCond %{REQUEST_URI} (sitemap(index)?.xml(.gz)?|[a-z0-9-]+-sitemap([0-9]+)?.xml(.gz)?) [NC]
RewriteCond %{REQUEST_URI} !(/wp-admin/|/xmlrpc.php|/wp-(app|cron|login|register|mail).php|/feed/|wp-.*.php|index.php) [NC,OR]
RewriteCond %{REQUEST_URI} (wp-comments-popup.php|wp-links-opml.php|wp-locations.php) [NC]
RewriteCond %{HTTP_COOKIE} !(comment_author|wp-postpass|wordpress[a-f0-9]+|wordpress_logged_in) [NC]
RewriteCond %{HTTP_USER_AGENT} !(W3\ Total\ Cache/0.9.2.4) [NC]
RewriteCond "%{DOCUMENT_ROOT}/wp-content/w3tc/pgcache/%{REQUEST_URI}/_index%{ENV:W3TC_UA}%{ENV:W3TC_REF}%{ENV:W3TC_SSL}.html%{ENV:W3TC_ENC}" -f
RewriteRule .* "/wp-content/w3tc/pgcache/%{REQUEST_URI}/_index%{ENV:W3TC_UA}%{ENV:W3TC_REF}%{ENV:W3TC_SSL}.html%{ENV:W3TC_ENC}" [L]
# END W3TC Page Cache core

# BEGIN WordPress
RewriteEngine On
RewriteBase /
RewriteRule ^index.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
# END WordPress
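One detail worth noticing in those rules: the sitemap pattern in the page-cache conditions. A quick offline sketch with Python's re module (the pattern is copied verbatim from the RewriteCond; this only tests the regex, not Apache's end-to-end behaviour) shows it matches the per-type sitemap names but not sitemap_index.xml, because the underscore is not in the character class:

```python
import re

# Sitemap pattern copied verbatim from the W3TC RewriteCond above.
pattern = r"(sitemap(index)?.xml(.gz)?|[a-z0-9-]+-sitemap([0-9]+)?.xml(.gz)?)"

print(bool(re.search(pattern, "/post-sitemap.xml")))   # True
print(bool(re.search(pattern, "/sitemap.xml")))        # True
print(bool(re.search(pattern, "/sitemap_index.xml")))  # False
```

So the cache rules treat Yoast's sitemap_index.xml differently from the per-type sitemaps, which would fit a symptom of some sitemaps going through while the index does not.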
-
Hmm... maybe you are blocking robots through your .htaccess?
-
Still the same problem. I cleared all the caches through W3 Total Cache, but Google is still not detecting the sitemap. Google has now also indexed pages that clearly show the robots problem: site:http://www.freepapertextures.com
Please help!
-
Clear your W3 Total Cache. Your robots.txt looks fine but my spider is picking the sitemap up as "noindex, follow" as well.
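When a spider reports a sitemap as "noindex, follow", the usual mechanism is an X-Robots-Tag HTTP header on the response; a cached copy can carry it even when robots.txt is clean. A small illustrative sketch of the check (the header values here are made up, not captured from the site):

```python
def sitemap_noindexed(headers):
    """Return True if an X-Robots-Tag header would keep the URL out of the index."""
    # HTTP header names are case-insensitive, so normalise the keys first.
    tags = {name.lower(): value for name, value in headers.items()}
    return "noindex" in tags.get("x-robots-tag", "").lower()

# Illustrative responses (not actual captures from freepapertextures.com):
print(sitemap_noindexed({"X-Robots-Tag": "noindex, follow"}))  # True
print(sitemap_noindexed({"Content-Type": "text/xml"}))         # False
```

In practice you would fetch the sitemap's headers with something like `curl -I`; if noindex shows up only intermittently, that points at the cache serving stale copies.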
-
I uninstalled the XML sitemap plugin. In Yoast that feature was already enabled, but I unticked and re-ticked it and clicked Save, so it is no longer showing a 404 error. However, when I submit the sitemap_index.xml link, Google is still unable to fetch it and shows the same problem. You can see the screenshot here: http://prntscr.com/g57h7
-
"First I used the Yoast sitemap feature and this problem happened; then I installed the XML sitemap plugin."
Uninstall the XML sitemap plugin. The Yoast plugin has a built-in XML sitemap, which is even better.
A Yoast sitemap looks like this: http://www.backlinkbuild.com/blog/sitemap_index.xml
Yours returns a 404: http://www.freepapertextures.com/sitemap_index.xml
Make sure you adjust the WP settings: SEO > XML Sitemaps > "Check this box to enable XML sitemap functionality."
Then submit this URL: http://www.freepapertextures.com/sitemap_index.xml
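Once the 404 is gone, it is also worth confirming the sitemap index parses as well-formed XML before resubmitting. A minimal standard-library sketch; the document below is an assumed example of Yoast-style output, not fetched from the live site:

```python
import xml.etree.ElementTree as ET

# Assumed example of a Yoast-style sitemap index (not fetched from the site).
doc = """<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>http://www.freepapertextures.com/post-sitemap.xml</loc></sitemap>
  <sitemap><loc>http://www.freepapertextures.com/page-sitemap.xml</loc></sitemap>
</sitemapindex>"""

root = ET.fromstring(doc)  # raises ParseError if the XML is broken
ns = "{http://www.sitemaps.org/schemas/sitemap/0.9}"
child_sitemaps = [loc.text for loc in root.iter(ns + "loc")]
print(len(child_sitemaps))  # 2
```

If fromstring raises, Webmaster Tools will reject the file too; if it parses cleanly, the problem is more likely access (robots rules, headers, caching) than the XML itself.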
-
Yes. First I used the Yoast sitemap feature and this problem happened; then I installed the XML sitemap plugin and got the same problem again. The setting is correct. You can check my domain's robots.txt here: http://freepapertextures.com/robots.txt
-
Are those two separate plugins? I know Yoast has a sitemap feature built in, so you don't need another sitemap plugin.
I've never had any issues with it. Make sure your blog is set to 'Allow search engines to index this site' (Settings > Privacy). Reset your cache (if you have one).
-
Related Questions
-
Sitemap Question (aspx, XML, HTML)
Hey everyone! My company uses a tool called SEOquake. We are trying to hit all of its "checkmarks" when we run a diagnosis. One of the only things we cannot figure out how to pass is the Site Compliance ---> XML Sitemaps section. Our clients' websites are all built with .aspx URL structures, and when I view them, each clearly states that it is an XML file; this text appears at the top of the .aspx page: "This XML file does not appear to have any style information associated with it. The document tree is shown below." Does anyone know what is happening here? Thank you!
Web Design | TaylorRHawkins
-
Canonical and Sitemap issue
Hi all, I was told that I could change my homepage canonical tag to match my XML sitemap. The sitemap is generated automatically and shows the homepage as e.g. https://www.mysite.com/index.html, yet my canonical tag is set to https://www.mysite.com. Google currently shows https://www.mysite.com/ as indexed, but https://www.mysite.com/index.html is not displayed in search results. Can someone please tell me whether I should change the canonical to the index.html version, do nothing, or remove the canonical tag altogether? Thank you for looking.
Web Design | scarebearz
-
Responsive Site has "Not Found" Errors for mobile/ and m/ in Google Search Console
We have recently launched a new responsive website for a client and have noticed two "Not Found" errors within Google Search Console, for /mobile and /m. Neither URL is linked from anywhere within the site, yet Google reports them as being linked from the homepage. This is not the first site on which we have seen Google report this error; however, the other site was not mobile-friendly. My thought is to 301 them back to the homepage. Does anybody else have any thoughts on this, or have you recently received the same errors?
Web Design | JustinTaylor88
-
Nav / Sitemap Question. Using a "services" page vs just linking directly to individual service page?
Okay, so our company offers video production, web design, and web marketing services. While we do offer these services individually, our goal is to get our clients to integrate them. Our nav is currently like so: Home - About - Video - Web Design - Web Marketing - Blog - Contact. Now, I've seen businesses and agencies use a nav with a "Services" button instead of listing out their service offerings (if they have more than one, like us). The Services button usually links to a category page or has a drop-down with links to the company's individual services. I'm wondering if there is any benefit to having a main services page like this and linking to the individual pages off of it (video, web design, marketing, etc.), or if we should keep it the way we have it now (since we've already got some page authority on the individual service pages). I know this may not be the most important aspect of our site and we may be over-thinking it, but any thoughts/ideas would be greatly appreciated. Thanks!
Web Design | RenderPerfect
-
Redirects (301/302) versus errors (404)
I am not able to decide convincingly between using redirects and using 404 errors; people give varied opinions. Here are my cases:
1. Coding errors (we put out a bad link): (a) some people say redirect to the home page, since the user at least has something to do and, more importantly, it does not hurt your SEO ranking; (b) the counter-argument: the page isn't there, so return a 404.
2. Product removed: link1 to product1 was out there. We removed product1, so link1 is also gone. It is either sitting in people's bookmarks or, because of coding errors, we left it hanging in some places on our site.
Web Design | proptiger
-
'Increase in soft 404 errors' Webmasters notification. What to do?
I've received a Webmasters notification about an 'increase in soft 404 errors'. When the new site launched three months ago, we did away with some old pages, which we either 301-redirect to new equivalents or answer with an 'Oops, that page seems to be missing' 404 page that links to important parts of the site that might be of use to the visitor. Any ideas why Webmasters is issuing the warning? Any suggestions as to what to do? Thanks
Web Design | Martin_S
-
WordPress SEO Errors / Questions
Hi, my name is Tina. I am new here; I hope you guys can help me out. I thought building my new site with WordPress was going to simplify things, but I have a ton of errors, and I am not sure what they are or how to fix them. I am hoping someone can share a solution.
I have 28 rel=canonical errors. I am not sure what this means; I understand it to mean my pages are similar, and the tag sets a hierarchy between them. Please correct me if I am wrong. If I am correct, would this be necessary to add if my main keyword was "widgets", my home page was optimized for "widgets", my next page was "blue widgets", and so on? While my pages are similar, they are all optimized for different versions of my main keyword, some using long-tail keywords. Do you know of a plugin that can help solve this problem?
Also, does anyone have a plugin they recommend for G+? My G+ authorship verification is causing an error as well. I am using HeadSpace2; I have used this SEO plugin numerous times with great success, and it has been my favorite. However, we have a portfolio that shows our clients' websites, and on those pages HeadSpace will not let me enter a description tag. What plugin do you recommend with more control over each page?
Another interesting issue: one of our pages was optimized for our Canadian clients, and now every page is listed in Google.ca for keywords that should rank in Google.com. We are listed on Google Maps, verified in Google Places, and our address is on the site, so they know we're from the USA; however, the majority of our keywords are only listed in Google.ca. We're on page one for all of them, in the top three for most, so that's not bad, but we want to be listed in Google.com as well. Any suggestions?
Web Design | TinaGammon
-
URL parameters causing duplicate content errors
My ISP implemented product reviews. In doing so, each page has a possible parameter string of ?wr=1. I am now receiving duplicate page content and duplicate page title errors for all my product URLs: the report shows the base URL and the base URL?wr=1. My ISP says the search engines won't have a problem with the parameters, and a check of Google Webmaster Tools for my site says I don't have any errors and recommends against configuring URL parameters. How can I get SEOmoz to stop reporting these errors?
Web Design | NiftySon
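For the ?wr=1 case above, the duplication exists because the parameter does not change the page content. A hedged sketch of normalising such URLs for comparison (the wr name comes from the question; the helper itself is illustrative, not how SEOmoz's crawler actually works):

```python
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

def strip_params(url, ignored=("wr",)):
    """Drop known no-op query parameters before comparing two URLs."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in ignored]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), parts.fragment))

print(strip_params("http://example.com/product?wr=1"))         # http://example.com/product
print(strip_params("http://example.com/product?wr=1&page=2"))  # http://example.com/product?page=2
```

On the site side, the standard fix is a rel=canonical on the parameterised version pointing at the clean URL (or marking wr as a no-op in Webmaster Tools' URL parameter settings), which crawlers generally respect when de-duplicating.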