What should I do after a failed request for validation (error with noindex, nofollow) in the new Google Search Console?
-
Hi guys,
We have the following situation: After an error message in the new Google Search Console for a large number of pages with noindex, nofollow tags, a validation was requested before the problem was fixed (an incredibly stupid decision, taken before asking the SEO team for advice).
Google started the validation, crawled 9 URLs, and changed the status to "Failed". All other URLs are still in "Pending" status. The problem has been fixed for more than 10 days, but apparently Google isn't crawling the pages, and none of the URLs is back in the index. We tried pinging several pages and HTML sitemaps, but with no result.
Do you think we should request re-validation, or wait longer? Is there something more we could do to speed up the process?
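Before clicking "Validate fix" again, it's worth double-checking that the noindex really is gone from the affected URLs — a re-validation that hits even one leftover noindex will fail again. Here's a minimal sketch of such a check in Python (assumptions: pages set the directive via a robots meta tag; note that noindex can also be sent via an `X-Robots-Tag` HTTP header, which this does not inspect, and `urls_to_check` is a placeholder for your own URL list):

```python
import re

def has_noindex(html: str) -> bool:
    """Return True if the page's robots meta tag still contains noindex."""
    # Scan every <meta ...> tag; attribute order may vary between pages.
    for tag in re.findall(r"<meta\s[^>]*>", html, flags=re.IGNORECASE):
        if re.search(r'name\s*=\s*["\']robots["\']', tag, re.IGNORECASE):
            content = re.search(r'content\s*=\s*["\']([^"\']*)["\']', tag, re.IGNORECASE)
            if content and "noindex" in content.group(1).lower():
                return True
    return False

# Sketch of spot-checking live URLs (uncomment and supply your own list):
# import urllib.request
# for url in urls_to_check:
#     html = urllib.request.urlopen(url).read().decode("utf-8", "replace")
#     print(url, "still noindexed" if has_noindex(html) else "ok")
```

Only once every sampled URL comes back clean does requesting re-validation make sense; otherwise Google's sample crawl will likely hit a bad URL and fail the whole batch again.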
Related Questions
-
How to get product info into Google Search Result box
Hi, in the last couple of weeks I've been getting more and more search results with a product and retailer prices below (see sample attached). Are there Schema parameters one could use to have a bigger chance of appearing there? Thanks in advance, Dieter Lang
Intermediate & Advanced SEO | | Storesco1 -
Many New URLs at Once
Hi, I have about 5,000 new URLs to publish. For SEO/Google, should I publish them gradually, or is all at once fine? By the way, all these URLs were already indexed in the past, but were then redirected. Cheers,
Intermediate & Advanced SEO | | viatrading10 -
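For anyone who decides to stage a rollout like this rather than publish everything at once, splitting the list into sitemap-sized batches is straightforward. A rough sketch (the batch size of 500 per day is an arbitrary assumption, not a Google recommendation):

```python
def batches(urls, size):
    """Split a list of URLs into consecutive batches of at most `size` each."""
    return [urls[i:i + size] for i in range(0, len(urls), size)]

def sitemap_xml(urls):
    """Render a minimal XML sitemap for one batch of URLs."""
    entries = "\n".join(f"  <url><loc>{u}</loc></url>" for u in urls)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n"
            "</urlset>")

# Sketch: write one sitemap per batch and submit them on a schedule, e.g.
# for day, batch in enumerate(batches(all_urls, 500)):
#     open(f"sitemap-{day}.xml", "w").write(sitemap_xml(batch))
```

Each generated file can then be submitted in Search Console as it goes live.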
Manual Removal Request Versus Automated Request to Remove Bad Links
Our site has several hundred toxic links. We would prefer that the webmasters remove them rather than submitting a disavow file to Google. Are we better off writing webmasters over and over again to get the links removed? If someone is monitoring the removal and keeps writing the webmasters, will this ultimately get better results than using an automated program like LinkDetox to process the requests? Or is this the type of request that will be ignored no matter what we do and how we ask? I am willing to invest in the manual labor, but only if there is some chance of a favorable outcome. Does anyone have experience with this? Basically, how do you get the highest compliance rate for link removal requests? Thanks, Alan
Intermediate & Advanced SEO | | Kingalan11 -
When I search for my domain name, Google asks "Did you mean" - why?
Hi all, I just noticed something quite odd: if I do a search for my domain name (see: http://goo.gl/LBc1lz), Google shows my domain as the first result, but it also asks "Did you mean" and suggests another website with a very similar name. The other site has far lower PA/DA according to Moz. Any ideas why Google is doing this? And more importantly, how could I stop it? Please advise, James
Intermediate & Advanced SEO | | isntworkdull0 -
Can you nofollow a URL?
Hey Moz Community, my question sounds pretty simple but unfortunately, it isn't. I have a domain name (we'll use example.com for this), http://example.com, which 301 redirects to http://www.example.com. http://example.com has bad links pointing to it and http://www.example.com does not. So essentially, I want to stop negative influences from http://example.com being passed on to http://www.example.com. A 302 redirect sounds like it would work in theory, but is this the best way to go about it? Just so you know, we completed a reconsideration request a long time ago, but I think the bad links are still negatively affecting the website, as it does not rank for its own name, which is bizarre. Actual question: How do I redirect http://example.com to http://www.example.com without passing on the negative SEO attached to http://example.com? Thanks in advance!
Intermediate & Advanced SEO | | RiceMedia0 -
Will Google View Using Google Translate As Duplicate?
If I have a page in English which exists on 100 other websites, we have a case where my website has duplicate content. What if I use Google Translate to translate the page from English to Japanese? As the only website doing this translation, will my page get credit for producing original content? Or will Google view my page as duplicate content, because Google can tell it is translated from an original English page that runs on 100+ different websites, since Google Translate is Google's own software?
Intermediate & Advanced SEO | | khi50 -
What's the best way to check Google search results for all pages NOT linking to a domain?
I need to do a bit of link reclamation for some brand terms. From the little bit of searching I've done, there appear to be several thousand pages that meet the criteria, but I can already tell it's going to be impossible or extremely inefficient to save them all manually. Ideally, I need an exported list of all the pages mentioning brand terms not linking to my domain, and then I'll import them into BuzzStream for a link campaign. Anybody have any ideas about how to do that? Thanks! Jon
Intermediate & Advanced SEO | | JonMorrow0 -
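The filtering step described above — keeping only the mentioning pages that do NOT already link to your domain — is easy to script once you have the pages' HTML exported (e.g. from a crawler). A rough sketch, assuming `pages` is a dict mapping each mentioning URL to its fetched HTML (the BuzzStream import itself is not shown):

```python
import re

def links_to(html: str, domain: str) -> bool:
    """Return True if the HTML contains an href pointing at `domain`."""
    hrefs = re.findall(r'href\s*=\s*["\']([^"\']+)["\']', html, re.IGNORECASE)
    return any(domain in h for h in hrefs)

def unlinked_mentions(pages, domain):
    """From {url: html}, keep pages that do NOT link to `domain`.

    These are the link-reclamation outreach targets.
    """
    return [url for url, html in pages.items() if not links_to(html, domain)]
```

The resulting URL list can then be exported to CSV for the outreach campaign.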
Google Said "Repeat the search with the omitted results included."
We have some pages targeting different countries with near-similar content/products, distinguished only by the country name etc. One of these pages was assigned to me for optimizing. Two or three similar pages rank within the top 50 for the main keyword. I updated some on-page content to make the page more distinct from the others. After some link building, I found that this page still doesn't show in Google results; instead I found the following message on Google: "In order to show you the most relevant results, we have omitted some entries very similar to the 698 already displayed. If you like, you can repeat the search with the omitted results included." I clicked to repeat the search with the omitted results included and found my targeted URL in 450th place on Google (before link building it wasn't there at all). My questions are: Does Google consider this page low quality or duplicate content? Is there any role for internal linking in giving one page importance over another (when they are near-similar)? Can pages like these hurt the whole site's rankings? How should I handle this issue?
Intermediate & Advanced SEO | | alexgray