Search Console validation taking a long time?
-
Hello! I did something dumb back at the beginning of September. I updated Yoast and somehow noindexed a whole set of custom taxonomy pages on my site. I fixed this and then asked Google to validate the fixes on September 20. Since then they have gotten through only 5 of the 64 URLs... is this normal? Just want to make sure I'm not missing something that I should be doing.
Thank you! ^_^
-
You're welcome.
We as a community are here to help. If your issue is now fixed, you could mark this question as answered.
Best of luck.
GR -
Cool! Thanks Gaston! I'm glad I asked about this! ^_^
-
What sometimes happens is that when some URLs are marked as noindex, Googlebot reduces its crawling frequency for them, interpreting that you really don't want those pages indexed and that they hold little value.
What you just did is tell Googlebot to crawl that specific page and "force" it to analyze and render it. Googlebot now understands that the noindex is no longer set and that the page should be indexed.
I'd wait a few days so that Googlebot naturally crawls your whole site again and eventually indexes every page that deserves to be indexed. If that doesn't happen in about two weeks, there is a tool in the old Search Console where you can tell Googlebot to crawl a single page and its links, under Crawl -> Fetch as Google. Request a URL to be fetched; after a few minutes a _Request indexing_ button will appear, and there you'll have the option to "Crawl this URL and its direct links". This image might come in handy: https://imgur.com/a/y5DbUVw
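Also, before waiting on Googlebot, it can't hurt to confirm on your end that the noindex really is gone from every one of those taxonomy URLs on the live site. Here's a rough Python sketch of how you could check them in bulk. Treat it as an illustration only: the sitemap URL is a placeholder, it assumes a plain urlset sitemap (for a Yoast sitemap index you'd point it at one of the per-type sitemaps), and it needs the requests and beautifulsoup4 packages.

```python
# Rough sketch only: pull URLs from an XML sitemap and flag any that still serve a
# noindex directive, either in the X-Robots-Tag header or the robots meta tag.
# The sitemap URL below is a placeholder; this assumes a plain <urlset> sitemap,
# not the Yoast sitemap index.
import requests
import xml.etree.ElementTree as ET
from bs4 import BeautifulSoup

SITEMAP = "https://example.com/custom-taxonomy-sitemap.xml"  # placeholder
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(SITEMAP, timeout=10).content)
urls = [loc.text for loc in root.findall("sm:url/sm:loc", NS)]

for url in urls:
    resp = requests.get(url, timeout=10)
    header = resp.headers.get("X-Robots-Tag", "").lower()
    meta = BeautifulSoup(resp.text, "html.parser").find("meta", attrs={"name": "robots"})
    content = (meta.get("content", "") if meta else "").lower()
    status = "NOINDEX" if "noindex" in header or "noindex" in content else "ok"
    print(f"{status:8} {url}")
```

If every URL comes back "ok", the fix is live and it's just a matter of Googlebot getting back around to those pages.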
I'm glad it helped previously and hope this last tip helps you even more.
Best of luck.
GR -
Whoooooooaaaaahhhhhh! That fixed it! What's the deal!? lol. Why is this method instantaneous while the other method Google pointed me to is taking months? ...Do I have to do this with each individual URL?
-
...or maybe that's what it found the last time it was crawled? I clicked the "Request indexing" button... we'll see what happens.
-
Hmmm. It says:
Indexing allowed? No: 'noindex' detected in 'robots' meta tag... but I have the settings in Yoast set up to allow indexing. Do you think maybe changes in Yoast settings aren't applying retroactively?
-
Sorry to hear that.
It's possible that Googlebot still hasn't found out that you've changed the noindex tag.
Would you mind checking what the _Inspect URL_ tool reports?
To find that, go to the new version of Search Console and enter one of the URLs that should be indexed in the textbox.
Then click on the "Test live URL" button. This image could be helpful: https://imgur.com/a/CPvfwif There you might get a hint of what is going on.
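If you want to double-check outside of Search Console, a quick script can print exactly which robots directives a page is serving right now. This is just a rough sketch with a placeholder URL (it assumes the requests and beautifulsoup4 packages are installed):

```python
# Rough sketch only: print the robots directives a single URL currently serves.
# The URL below is a placeholder; swap in one of the taxonomy pages in question.
import requests
from bs4 import BeautifulSoup

url = "https://example.com/some-taxonomy-page/"  # placeholder

resp = requests.get(url, timeout=10)
print("HTTP status:     ", resp.status_code)
print("X-Robots-Tag:    ", resp.headers.get("X-Robots-Tag", "(none)"))

meta = BeautifulSoup(resp.text, "html.parser").find("meta", attrs={"name": "robots"})
print("robots meta tag: ", meta.get("content", "(empty)") if meta else "(none)")
```

If that still shows noindex, the fix hasn't actually reached the live pages; if it's clean, it's just a matter of Googlebot recrawling.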
-
They're in Google Search Console, but I have tried searching for a couple of them and they don't appear to be indexed :-(. I tried the method you suggested and that didn't bring up anything either.
-
Hi Angela,
Those 5 out of 64 URLs... is that a report in Search Console, or do only 5 URLs appear when searching in Google?
Search Console usually takes a little longer to update its reports on index status. Have you tried a site: search? Also try the _inurl:_ operator.
For example: site:domain.com inurl:/category-noindexed/ Hope it helps.
Best of luck.
GR