403 error - how to fix it?
-
I got 36 "403 errors".
I Googled around and also looked into SEOmoz, but found nothing relevant.
The site runs the latest version of WordPress and the host is GoDaddy.
I recently added these URLs to the robots.txt file and they were removed, but because of an issue with the robots.txt file I had to revert it and leave it blank.
Kindly guide me to a permanent remedy for this.
-
.ifieds/ads/new-nike-nfl-jerseys-2012-299434-6il/
forum/viewtopic.php?f=11&%3C/td%3E%20%20%3C/tr%3E%20%20%3Ctr%3E%3Ctd%20align=
The above two are example URLs; there are 36 in total.
Adam, what if I apply the delete option and throw this stuff out of the index?
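Before deciding between robots.txt and the delete option, it may help to confirm what each of the 36 URLs actually returns now. Below is a minimal diagnostic sketch, assuming Node 18+ (for the built-in fetch) and that the URLs from the crawl report are pasted into the array; the single entry shown is a made-up placeholder, not one of the real URLs.

```js
// check-status.mjs - run with: node check-status.mjs
// Prints the HTTP status each flagged URL returns today.
const urls = [
  "http://www.example.com/forum/viewtopic.php?f=11", // placeholder; paste the 36 URLs from the report here
];

for (const url of urls) {
  try {
    // GET rather than HEAD, since some hosts reject HEAD requests outright.
    const res = await fetch(url, { redirect: "follow" });
    console.log(res.status, url);
  } catch (err) {
    console.log("FETCH ERROR", url, err.message);
  }
}
```

If the spammy URLs end up returning 404 or 410 rather than 403, Google will generally drop them on its own, and the URL removal tool in Webmaster Tools can speed that up; a robots.txt block only stops crawling and does not by itself remove a URL from the index.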
-
What are the URLs that are triggering a 403 error?
Related Questions
-
I get a 'Temporarily unreachable' error message when I 'Fetch as Google'. Any ideas, please?
I wanted to fetch this page and got this error from Google: 'Temporarily unreachable'. I've never had this issue before. I checked another page and it came back as 'Complete', so no problems there. Any ideas? Thank you in advance.
Reporting & Analytics | MissThumann
-
Fixing Bounce Rate between Domain and Subdomain
Currently, the way our site is set up, our clients generally visit our homepage and then log in through a separate page that is a subdomain, or they can read our blog/support articles that are also on separate subdomains. From my understanding, this can be counted as a bounce, and I know this sort of site structure isn't ideal, but with our current dev resources and dependencies, fixing it isn't going to happen overnight. Regardless, what would be the easiest way to implement this fix within the Google Analytics code? For example: if someone visits our site at X.com and then wants to log in at portal.X.com, I don't want to count that as a bounce. Any insight is appreciated! Thanks
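Not an official recommendation, but a hedged sketch of one common approach with Universal Analytics (analytics.js), assuming the same property is loaded on both X.com and portal.X.com; 'UA-XXXXX-Y' is a placeholder for the real property ID.

```js
// Same snippet on x.com and portal.x.com. The 'auto' cookieDomain setting turns on
// automatic cookie domain configuration, so the _ga cookie is written against the
// root domain and shared by its subdomains. A visit that starts on x.com and
// continues on portal.x.com then stays in one session instead of counting as a bounce.
ga('create', 'UA-XXXXX-Y', 'auto'); // placeholder property ID
ga('send', 'pageview');
```

It may also be worth checking the property's referral exclusion list so that moves between the subdomains are not recorded as self-referrals that start new sessions.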
Reporting & Analytics | KathleenDC
-
How do I fix 608s, please?
Hi, I'm on the free trial and finding it very useful. I've fixed all my 301s, but now I have a load of 608s. I don't know what this is! I feel like I've cured herpes only to get gonorrhea! Can anyone help? I have 41 608s, which is more than the 301s I had - I hope they are non-related! I won't bore you with the whole list, but some of the URLs are:
Error Code 608: Page not Decodable as Specified Content Encoding http://sussexchef.com/catering-at-mr-mrs-currys-50th-wedding-anniversary/guestsarrive----608
Error Code 608: Page not Decodable as Specified Content Encoding http://sussexchef.com/funeral-catering/picture4-2----608
Error Code 608: Page not Decodable as Specified Content Encoding http://sussexchef.com/wedding-venues
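For context, 608 is a Moz crawler code rather than an HTTP status: "Page not Decodable as Specified Content Encoding" generally means the response body could not be decoded with the Content-Encoding the server declared (for example, a gzip header on a truncated or uncompressed body). Below is a rough spot-check sketch, assuming Node 18+; it uses one of the URLs from the report above as a sample.

```js
// check-encoding.mjs - run with: node check-encoding.mjs
// fetch() negotiates and decompresses gzip automatically, so a body that does not
// match its declared Content-Encoding surfaces here as a fetch/decode error.
const urls = [
  "http://sussexchef.com/wedding-venues", // sample URL from the crawl report above
];

for (const url of urls) {
  try {
    const res = await fetch(url);
    const body = await res.text();
    console.log(res.status, res.headers.get("content-encoding"), `${body.length} chars`, url);
  } catch (err) {
    console.log("DECODE/FETCH ERROR", url, err.message);
  }
}
```

If the same URLs load fine in a browser, the problem may only appear for certain user agents or compression settings, which is something the host or a caching/compression plugin would need to look at.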
Reporting & Analytics | SussexChef831
-
Google Webmaster Tools indicates a robots.txt access error
It seems that Google has not been crawling due to an access issue with our robots.txt. Late 2013 we migrated to a new host, WP Engine, so things might have changed; however, this issue appears to be recent. A quick test shows I can access the file. This is the Google Webmaster Tools message:
"http://www.growth trac dot com/: Googlebot can't access your site. January 17, 2014. Over the last 24 hours, Googlebot encountered 62 errors while attempting to access your robots.txt. To ensure that we didn't crawl any pages listed in that file, we postponed our crawl. Your site's overall robots.txt error rate is 8.8%."
Note the above message says 'over the last 24 hours', but the date is Jan 17. This is the response from our host:
"Thanks for contacting WP Engine support! I looked into the suggestions listed below and it doesn't appear that these scenarios are the cause of the errors. I looked into the server logs and I was only able to find 200 server responses on the /robots.txt. Secondly, I made sure that the server wasn't overloaded. The last suggestion doesn't apply to your setup on WP Engine. We do not have any leads as to why the errors occurred. If you have any other questions or concerns, please feel free to reach out to us."
Google is crawling the site -- should I be concerned? If so, is there a way to remedy this? By the way, our robots.txt file is very lean, only a few lines, not a big deal. Thanks!
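Since a single manual test looks fine while Googlebot reports intermittent failures, one low-effort check is to poll robots.txt for a while and log anything that is not a clean 200. A rough sketch, assuming Node 18+; www.example.com is a placeholder for the site's own hostname.

```js
// poll-robots.mjs - run with: node poll-robots.mjs (stop with Ctrl+C)
// Requests robots.txt once a minute and logs any non-200 response or connection error.
const url = "http://www.example.com/robots.txt"; // placeholder hostname

for (;;) {
  try {
    const res = await fetch(url);
    if (res.status !== 200) {
      console.log(new Date().toISOString(), res.status, res.statusText);
    }
  } catch (err) {
    console.log(new Date().toISOString(), "FETCH ERROR", err.message);
  }
  await new Promise((resolve) => setTimeout(resolve, 60_000));
}
```

If nothing is ever logged, the errors may only occur under Googlebot's crawl volume or for Googlebot's user agent (for example a firewall or rate-limiting rule), which a single clean check in the host's logs would not rule out.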
Reporting & Analytics | jmueller0823
-
Webmaster Tools Error: Unreachable page
Hi all, when I try to use the "Fetch as Google" feature on Webmaster Tools, I get the error "Unreachable page". I checked the Google Analytics code and everything seems to be OK. What should I do?
Reporting & Analytics | fisniks
-
WordPress site with increased number of crawl (400 response code) errors in the Others section of GWT
I have a WordPress site, http://muslim-academy.com/. I checked Google Webmaster Tools today and I see an increased number of errors in the Others area. The error code is 400. One example link for this error:
http://muslim-academy.com/%D8%B3%D9%8A%D8%B1%D8%A9-%D8%AA%D8%A7%D8%B1%D9%8A%D8%AE%D9%8A%D8%A9-%D9%84%D9%84%D8%B1%D8%A6%D9%8A%D8%B3-%D8%AC%D9%85%D8%A7%D9%84-%D8%B9%D8%A8%D8%AF-%D8%A7%D9%84%D9%86%D8%A7%D8%B5%D8%B1-2/%D8%B3%D9%....%3Cbr%20/%3E________________%3Cbr%20/%3E___________%3Ca%20href=?lang=zh
Can you guide me on why the number of errors is increasing and how to fix the existing errors?
Reporting & Analytics | csfarnsworth
-
Solving link and duplicate content errors created by a WordPress blog and tags?
SEOmoz tells me my site's blog (a WordPress site) has two big problems: a few pages with too many links, and duplicate content. The problem is that these pages seem legit the way they are, but obviously I need to fix the problem, so...
Duplicate content error: this is a result of being able to search the blog by tags. Each blog post has multiple tags, so the url.com/blog/tag pages occasionally show the same articles. Anyone know of a way to not get penalized for this? Should I exclude these pages from being crawled/sitemapped?
Too many links error: SEOmoz tells me my main blog page has too many links (both url.com/blog/ and url.com/blog-2/) - these pages have excerpts of the 6 most recent blog posts. I feel like this should not be an error... anyone know of a solution that will keep the site from being penalized by these pages? Thanks!
Reporting & Analytics | RUNNERagency
-
Spider 404 errors linked to purchased domain
Hi, my client purchased a domain based on the seller promising "lots of traffic". Subsequent investigation showed it was a scam and that the seller had been creative in Photoshop with some GA reports. Nevertheless, my client had redirected the acquired domain to their primary domain (via the domain registrar). From the time the acquired domain was redirected to the point when we removed the redirect, the web log files showed a high volume of spider/bot 404 errors relating to an online pharmacy - viagra, pills, etc. The account does not seem to have been hacked: no additional files are present and the rest of the logs seem normal. As soon as the redirect was removed, the spider 404 errors stopped. Aside from the advice about acquiring domains that promise traffic, which I've already discussed with my client, does anybody have any ideas about how a redirect could cause the 404 errors? Thanks
Reporting & Analytics | bjalc2011