Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not removing the content entirely (many posts will remain viewable), we have locked both new posts and new replies. More details here.
Errors In Search Console
-
Hi All,
I am hoping someone might be able to help with this.
Last week one of my sites dropped from mid first page to the bottom of page 1. We had not been doing any link building as such, and it only seems to have affected a single search term and the ranking page (which happens to be the home page).
When I was going through everything I went to search console and in crawl errors there are 2 errors that showed up as detected 3 days before the drop.
These are:
wp-admin/admin-ajax.php showing as response code 400 and also
xmlrpc.php showing as response code 405
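From what I've read, both of those codes may be normal WordPress behaviour for the bare GET requests a crawler sends: admin-ajax.php returns 400 when called without an 'action' parameter, and xmlrpc.php returns 405 because it only accepts POST. A quick sketch of that assumption (the endpoint table and helper name are mine, not from WordPress or Search Console):

```python
# Hypothetical helper: some Search Console "errors" on stock WordPress
# endpoints are expected responses to plain GET requests.
EXPECTED_WP_STATUS = {
    "/wp-admin/admin-ajax.php": {400},  # 400 on GET without an 'action' parameter
    "/xmlrpc.php": {405},               # POST-only endpoint; GET is rejected
}

def is_expected_crawl_error(path: str, status: int) -> bool:
    """Return True if this status code is normal for the endpoint."""
    return status in EXPECTED_WP_STATUS.get(path, set())

print(is_expected_crawl_error("/wp-admin/admin-ajax.php", 400))  # True
print(is_expected_crawl_error("/xmlrpc.php", 405))               # True
print(is_expected_crawl_error("/contact/", 404))                 # False
```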
robots.txt is as follows:
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
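As a sanity check, rules like those above can be run through Python's standard-library robots.txt parser. One caveat with this sketch: Python's parser applies the first matching rule, while Google uses the most specific (longest) match, so the Allow line is listed first here; for Googlebot the order would not matter.

```python
from urllib.robotparser import RobotFileParser

# Allow comes first because Python's parser is first-match;
# Google's own crawler uses longest-match, so order is irrelevant to it.
rules = """\
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "https://example.com/wp-admin/admin-ajax.php"))  # True
print(rp.can_fetch("*", "https://example.com/wp-admin/options.php"))     # False
```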
Any help with what is wrong here and how to fix it would be greatly appreciated.
Many Thanks
-
Nope, sorry, that site doesn't have Jetpack installed.
-
Checked the xmlrpc.php file and it was identical to a fresh download.
Fetch as Googlebot returns an almost instant error message.
-
Did the error come up again in Search Console? I would recommend resubmitting the site after fixing the errors and then waiting for some time.
-
Unfortunately this did not work.
I am absolutely baffled by it. SiteGround (the hosts) don't have a clue either.
-
Thanks Vijay,
Yes, my thought was that this would be the killer for the site. Thanks for the link, I will check it out.
Thanks
Dale
-
Hi There,
The errors that appeared in Google Search Console might be caused by WordPress itself, but errors like these can make the content and structure of the website look very different to bots, or even unavailable. You should focus on fixing the following error and then re-submit the site:
xmlrpc.php showing as response code 405
A possible solution might be available here https://mobilunity.com/q-n-a/how-to-fix-error-405-in-xmlrpc-php/
Regards,
Vijay
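One way to narrow down where that 405 comes from: xmlrpc.php only accepts HTTP POST, so a plain GET (a browser visit, or some crawler checks) is refused by design. A real client such as Jetpack would POST an XML body like the one this standard-library snippet builds; if a well-formed POST like this still returns 405, something in front of WordPress (a server rule or security plugin) is probably blocking the method. This is a sketch of how to build the request body, not a claim about this particular site.

```python
import xmlrpc.client

# Build the XML body an XML-RPC client would POST to /xmlrpc.php.
# A 405 usually means the request never reached WordPress as a POST.
body = xmlrpc.client.dumps((), methodname="system.listMethods")
print(body)
```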
Related Questions
-
Should search pages be indexed?
Hey guys, I've always believed that search pages should be no-indexed but now I'm wondering if there is an argument to index them? Appreciate any thoughts!
Technical SEO | RebekahVP
-
Desktop & Mobile XML Sitemap Submitted But Only Desktop Sitemap Indexed On Google Search Console
Hi! The problem: we have submitted a sitemap index to GSC. Within that index there are 4 XML sitemaps, including one for the desktop site and one for the mobile site. The desktop sitemap has 3,300 URLs, of which Google has indexed (according to GSC) approximately 3,000. The mobile sitemap has 1,000 URLs, of which Google has indexed 74. The pages are crawlable and the site structure is logical. Performing a landing-page URL search (showing only Google/Organic source/medium) on Google Analytics, I can see that hundreds of those mobile URLs are being landed on. A search on mobile for a longtail keyword from a (randomly selected) page shows a result in the SERPs for the mobile page that, judging by GSC, has not been indexed. Could this be because we have recently added rel=alternate tags on our desktop pages (and of course corresponding canonical ones on mobile)? Would Google then 'not index' rel=alternate page versions? Thanks for any input on this one. PmHmG
Technical SEO | AlisonMills
-
Spam URLs in search results
We built a new website for a client. When I do 'site:clientswebsite.com' in Google it shows some of the real, recently submitted pages. But it also shows many pages of spam URL results, like 'clientswebsite.com/gockumamaso/22753.htm', all of which then go to the site's 404 page. They have page titles and meta descriptions in Chinese or Japanese too. Some of the URLs are of real pages, and link to the correct page, despite having the same Chinese page titles and descriptions in the SERPs. When I went to remove all the spammy URLs in Search Console (it only allowed me to temporarily hide them), a whole load of new ones popped up in the SERPs after a day or two. The site files themselves are all fine, with no errors in the server logs. All the usual stuff (robots.txt, sitemap, etc.) seems OK, and the proper pages have all been requested for indexing and are slowly appearing. The spammy ones continue though. What is going on and how can I fix it?
Technical SEO | Digital-Murph
-
Abnormally high internal link reported in Google Search Console not matching Moz reports
If I'm looking at our internal link count and structure in Google Search Console, some pages are listed as having over a thousand internal links within our site. I've read that having too many internal links on a page devalues that page's PageRank, because the value is divided amongst the pages it links out to. Likewise, I've heard having too many internal links is just bad in general for SEO. Is that true? The problem I'm facing is determining how Google is "discovering" these internal links. If I look at one single page reported with, say, 1,350 links and I'm just looking at the code, it may only have 80 or 90 actual links. Moz confirms this as well. So why would Google Search Console report differently? Should I be concerned about this?
Technical SEO | Closetstogo
-
Increase 404 errors or 301 redirects?
Hi all, I'm working on an e-commerce site that sells products that may only be available for a certain period of time. E.g. a product may only be selling for 1 year and then be permanently out of stock. When a product goes out of stock, the page is removed from the site regardless of any links it may have gotten over time. I am trying to figure out the best way to handle these permanently out-of-stock pages. At the moment, the site is set up to return a 404 page for each of these products. There are currently 600 (and increasing) instances of this appearing in Google Webmasters. I have read that too many 404 errors may have a negative impact on your site, and so thought I might 301 redirect these URLs to a more appropriate page. However, I've also read that too many 301 redirects may have a negative impact on your site. I foresee this being an issue several years down the road, when the site has thousands of expired products resulting in thousands of 404 errors or 301 redirects, depending on which route I take. Which would be the better route? Is there a better solution?
Technical SEO | Oxfordcomma
-
How to search HTML source for an entire website
Is there a way for me to do a "view source" for an entire website without having to right-click every page and select "view source" for each of them?
Technical SEO | SmartWebPros
-
Site disappearing from search for a certain keyword
I was wondering if someone has encountered the same problem as me. I was doing some changes on the front page of one of my clients' websites, especially some redirections, and the site has disappeared from Google for the main keyword on the page. So when I search for that keyword on Google, instead of seeing my page first, I no longer see my page at all. All I did was a 301 redirect from index.html to the domain name. Now, I have changed everything back to how it was before; more precisely, I did that 2 weeks ago. But there has been no change in Google. I checked Bing and Yahoo, and my site appears first when I search for that specific keyword. Any idea how long it will take for Google to see that I am not doing anything wrong with the redirects? Or any ideas at all?
Technical SEO | webmasterles
-
How to stop search bots from crawling through a submit button
On our website http://www.thefutureminders.com/, we have three form fields with three pull-downs for month, day, and year. This is creating duplicate pages while indexing. How do we tell the search bot to index the page but not crawl through the submit button? Thanks Naren
Technical SEO | NarenBansal