Webmaster Tools Data Discrepancies/Anomalies
-
Hi,
I'm looking in GWT for a client of mine and see that in the Index Status area 400+ pages are indexed, so all seems OK there! But then in the sitemaps area 111 pages have been submitted but only 1 is indexed!
Any ideas what's going on here?
Cheers,
Dan
-
The Moz bar says they are 301'd, so I presume so, but I will double-check with dev.
I suppose if all of them are definitely 301'd, then I should just wait and see what happens after the sitemap is updated correctly, and take it from there.
-
It's likely they used 301s, then, if that's what they said. You can check using an HTTP header tool. Uppercase in the Google results is odd as well. You'll want to double-check this stuff and get the sitemap changed to help prompt the use of lowercase instead.
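If you don't have an HTTP header tool handy, one way to see the raw redirect status is to request the URL and simply refuse to follow redirects. Below is a minimal sketch using only Python's standard library; the commented-out URL is a made-up example, so swap in one of the client's uppercase URLs.

```python
from urllib.parse import urlparse
import http.client

def check_redirect(url):
    """Fetch `url` once, WITHOUT following redirects, and return the
    HTTP status code plus the Location header (None if not a redirect)."""
    parts = urlparse(url)
    if parts.scheme == "https":
        conn = http.client.HTTPSConnection(parts.netloc)
    else:
        conn = http.client.HTTPConnection(parts.netloc)
    # HEAD is enough; we only care about the status line and headers.
    conn.request("HEAD", parts.path or "/")
    resp = conn.getresponse()
    conn.close()
    return resp.status, resp.getheader("Location")

# Hypothetical usage:
# status, location = check_redirect("http://www.example.com/Upper-Case-Page")
# 301 = permanent (what you want here); 302 = temporary
```

If this prints 302 rather than 301 for the uppercase URLs, that would explain Google hanging on to the old versions.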
-
Re: 302 vs 301: do you mean there's a possibility dev rewrote via a 302 instead of a 301? (They had told me they used a 301.)
Re: Mozscape not updating: I just checked Google search results, and the uppercase versions are indexed there; the last cache date was 12 March.
-
Possibly due to 302 vs 301 redirects. Those pages still load. Mozscape hasn't updated those pages from its previous crawls yet. Fixing the sitemap will help; further internal and external links pointing to the lowercase URLs rather than the uppercase ones will help too.
-
Great, thanks!!
Sorry, one final question, since I just noticed another related anomaly.
The URLs have been redirected to lowercase for over a month now, but the URLs showing in ranking reports are mainly still the uppercase versions. How can that be?
Cheers,
Dan
-
Yup. Update the sitemap with your current and accurate URLs and you should be on your way.
-
Thanks for confirming that, Ryan!
So use a 301 as the method for 'rewriting' upper to lowercase, i.e. the 301 is the rewrite?
They had told me they had already applied 301s from upper to lowercase, so I guess it's probably just a case of updating the sitemap URLs to resolve this?
All best,
Dan
-
It certainly could. You'd also want dev to use 301 redirection when applying a change like that. The sitemap isn't a hard and fast rule for what you want indexed in the search engines, but rather an aid to help them crawl your site. If it's in error, they'll still be able to crawl and index more than what's listed in the sitemap. Ideally, though, your sitemap is an accurate reflection of the pages of your site.
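In practice dev would implement the upper-to-lower rewrite in the web server config (e.g. Apache or IIS rewrite rules), but to show what "the 301 is the rewrite" means, here's the same logic as a minimal Python WSGI middleware sketch. All names here are hypothetical, not the client's actual setup.

```python
def lowercase_redirect(app):
    """WSGI middleware sketch: answer any request whose path contains
    uppercase letters with a 301 to the all-lowercase equivalent, and
    pass everything else through to the wrapped app unchanged."""
    def middleware(environ, start_response):
        path = environ.get("PATH_INFO", "/")
        if path != path.lower():
            # The 301 *is* the rewrite: one permanent hop to lowercase.
            start_response("301 Moved Permanently",
                           [("Location", path.lower())])
            return [b""]
        return app(environ, start_response)
    return middleware
```

The key point is that it's a single permanent (301) hop, so search engines consolidate signals onto the lowercase URL rather than keeping both versions around.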
-
Thanks, Ryan.
I will do that, but just to confirm, re the contradictory messages: is that because there are 400+ pages indexed (hence that figure in Index Status), BUT in the sitemap report only the home page is showing, because all the other URLs in the sitemap have probably changed?
I had instructed dev to rewrite uppercase URLs to lowercase and amend the sitemap. If they have done one but not the other, could that be the sort of thing causing this discrepancy in the GWT messaging?
All best,
Dan
-
I would double-check the sitemap file to see if it's still accurate, or if the site has changed and the URLs in the sitemap no longer correspond to the structure of the site. My guess is that the root domain is the only one of the 111 that is correctly listed. Cheers!
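One way to sanity-check that guess without waiting on GWT is to pull the `<loc>` entries out of the sitemap XML and flag any that aren't already lowercase: those are the stale entries that now 301 to a different URL, so Google won't count them as indexed. A rough sketch (the sitemap content in the test is invented, not the client's real file):

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def audit_sitemap(xml_text):
    """Parse sitemap XML and return (all <loc> URLs, the subset containing
    uppercase characters). The uppercase ones are the suspects: if the site
    now 301s everything to lowercase, those sitemap entries point at
    redirects rather than final pages, and won't show as indexed."""
    root = ET.fromstring(xml_text)
    locs = [el.text.strip() for el in root.iter(SITEMAP_NS + "loc")]
    suspects = [u for u in locs if u != u.lower()]
    return locs, suspects
```

From there, lowercasing each suspect entry and resubmitting the sitemap should bring the submitted and indexed counts back into line.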