404 broken URLs coming up in Google
-
When we do a search for our brand, we get the following results in google.com.au (see image attachment).
As outlined in red, there are listings in Google that result in 404 Page Not Found URLs.
What can we do to get Google to recrawl the site, or to ensure that these broken URLs are no longer listed in Google?
Thanks for your help here!
-
Apologies for the delay in responses here. Thanks Andreas and Mike. We ended up doing just that and redirected 404 errors before doing a recrawl. Worked great! Thanks for your help.
-
Agreed. Go to Search Console, see what 404 errors Google is throwing your way, 301 redirect anything from the list that can and should be redirected to its most relevant equivalent on the live site, and then fetch and submit the site for a recrawl.
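For example, if the site runs on Apache, the redirects from that list might look something like this in .htaccess (the paths below are just placeholders; map each 404ing URL to its closest live equivalent):

```apache
# Example 301 redirects in Apache .htaccess.
# Replace the placeholder paths with your actual broken URLs
# and their most relevant live equivalents.
Redirect 301 /old-test-page /products
Redirect 301 /discontinued-promo /current-offers
```

The exact syntax depends on your server (nginx, IIS, etc. each have their own way), so treat this as a rough sketch rather than a copy-paste fix.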
OR (since the links you posted were for a test site), if that test version needs to stay up for internal testing purposes, you can potentially noindex the pages, resubmit them for a crawl so the bots see the noindex on the pages, and then, after they've dropped out of the SERPs, update your robots.txt to disallow the folder those pages sit in. (Not sure if there's a better/quicker way to get them out of the SERPs if you still need the test site live.)
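The noindex-then-disallow route above might look like this, where `/test-site/` is just a placeholder for wherever the test pages actually live:

```html
<!-- Step 1: add this to the <head> of each test page, then
     resubmit them for a crawl so the bots can see it -->
<meta name="robots" content="noindex">
```

```
# Step 2: only AFTER the pages have dropped out of the SERPs,
# block the folder in robots.txt. (If you disallow it first,
# the bots can never recrawl the pages to see the noindex tag.)
User-agent: *
Disallow: /test-site/
```

The ordering is the important part: noindex first, disallow later.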
-
You can 301 redirect them to the page which is now the best answer for the search query. That fixes the problem for the user immediately.
You can then go to Search Console and have the bot crawl the URL again (Fetch as Google). It will see the redirect, and you can submit it to Google.
Make sure the page you redirect to really is the best answer.
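If you have a lot of 404s to work through, a small script can generate the redirect lines from a URL mapping. A minimal sketch, assuming Apache-style `Redirect` directives and made-up paths:

```python
# Build Apache "Redirect 301" lines from a mapping of broken URLs
# to their best live equivalents (all paths below are placeholders).
def build_redirects(mapping):
    lines = []
    # Sort for a stable, readable .htaccess block.
    for old_path, new_path in sorted(mapping.items()):
        lines.append(f"Redirect 301 {old_path} {new_path}")
    return "\n".join(lines)

broken = {
    "/old-test-page": "/products",
    "/summer-sale-2015": "/sale",
}
print(build_redirects(broken))
# Redirect 301 /old-test-page /products
# Redirect 301 /summer-sale-2015 /sale
```

You'd feed it the 404 list exported from Search Console, then paste the output into .htaccess and spot-check each target before going live.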