Homepage was removed from Google and got deranked
-
Hello experts,
I have a problem. The main page of my website has been severely deranked, and I am not sure how to get the rankings back.
It started when I accidentally canonicalized the main page, "https://kv16.dk", to a page that did not exist.
Four months later the page was deranked, and the main page no longer appeared in the search results at all, not even when searching for "kv16.dk".
Then we discovered the canonicalization mistake and fixed it, and the main page came back into the search results when searching for "kv16.dk".
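For anyone troubleshooting something similar, here is a minimal sketch of how a page's canonical target can be spot-checked so a mistake like this is caught earlier. It assumes Python with the requests and beautifulsoup4 packages installed and uses https://kv16.dk as the page to test:

```python
import requests
from bs4 import BeautifulSoup

PAGE_URL = "https://kv16.dk"  # page whose canonical tag we want to verify

def check_canonical(page_url: str) -> None:
    """Fetch a page, read its rel=canonical tag, and verify the target resolves."""
    response = requests.get(page_url, timeout=10)
    response.raise_for_status()

    soup = BeautifulSoup(response.text, "html.parser")
    link = soup.find("link", attrs={"rel": "canonical"})
    if link is None or not link.get("href"):
        print(f"{page_url}: no rel=canonical tag found")
        return

    canonical_url = link["href"]
    print(f"{page_url}: canonical points to {canonical_url}")

    # A canonical target that 404s (or points somewhere else entirely) is a red flag.
    target = requests.get(canonical_url, timeout=10, allow_redirects=True)
    print(f"{canonical_url}: returned HTTP {target.status_code}")
    if target.status_code != 200 or canonical_url.rstrip("/") != page_url.rstrip("/"):
        print("Warning: canonical does not point to a live copy of this page.")

if __name__ == "__main__":
    check_canonical(PAGE_URL)
```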
After we made the correction, some weeks passed and the ranking didn't get better. Google Search Console recommended uploading a sitemap, so we did that. However, this sitemap contained a lot of thin-content pages: one for every WordPress attachment, e.g. one for every image in an article. More exactly, there were 91 of these attachment pages, while the rest of the site consists of only two pages, the main page and an extra landing page.
After that, Google began showing the attachment URLs in some search results. We tried to fix this by redirecting every attachment page to the file itself, e.g. an attachment page for an image now redirects straight to the image.
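As a sanity check on that change, a short sketch like the following can confirm that each attachment page now redirects straight to the underlying file (Python with requests; the attachment URLs shown are placeholders, not the site's real ones):

```python
import requests

# Hypothetical attachment-page URLs; substitute the real ones from the sitemap.
ATTACHMENT_PAGES = [
    "https://kv16.dk/example-image-attachment/",
    "https://kv16.dk/another-image-attachment/",
]

IMAGE_EXTENSIONS = (".jpg", ".jpeg", ".png", ".gif", ".webp")

for url in ATTACHMENT_PAGES:
    # Follow redirects and look at where we finally land.
    response = requests.get(url, timeout=10, allow_redirects=True)
    final_url = response.url
    redirected = bool(response.history)
    lands_on_file = final_url.lower().endswith(IMAGE_EXTENSIONS)
    print(f"{url} -> {final_url} "
          f"(redirected: {redirected}, file target: {lands_on_file}, "
          f"status: {response.status_code})")
```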
Google has not yet removed these attachment pages from its index, so the question is: do you think it will help to remove the attachment URLs via Google Search Console, or will that not help at all?
For example, when we search for "kv16", an attachment URL named "birksø" is one of the first results.
-
Hi Everett,
First of all, I am sorry for the late reply; I was on vacation for the last seven days.
Thank you for your reply. I think you might be right about the "sandbox" thing. The page had a good position in the Google search results, but then we made the mistake of canonicalizing it to a non-existent page for four months. It could be that Google now considers it a "new page", even though it had been indexed for a year.
I appreciate your efforts. I will wait some time to see if it improves by itself; otherwise I will have to work some more on improving the content of the site.
-
Hi Christian,
I don't see any evidence of the site being deindexed now. Here are some things I checked for you, along with a few observations (a short script for re-running the basic checks appears after the list):
- Nothing in the robots.txt file, the robots meta tag, or the X-Robots-Tag HTTP response header would keep these pages from being indexed by Google.
- The rel="canonical" tags appear to be functioning properly.
- The home page is indexed and is not duplicated by other indexed pages.
- Google has about 86 pages indexed from your domain.
- Hreflang tags appear to be implemented properly.
- There are only about 50 links pointing to the domain from other sites; the ones from Moz are the best of the few that aren't just random scraper sites (harmless, but annoying).
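A minimal sketch of how the checks in the first item (robots.txt, robots meta tag, and X-Robots-Tag header) can be re-run, assuming Python with the requests and beautifulsoup4 packages and using https://kv16.dk as the test URL:

```python
import urllib.robotparser
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

PAGE_URL = "https://kv16.dk/"  # page whose indexability we want to confirm

def check_indexability(page_url: str) -> None:
    """Re-run the basic indexability checks: robots.txt, X-Robots-Tag, meta robots."""
    parsed = urlparse(page_url)
    root = f"{parsed.scheme}://{parsed.netloc}"

    # 1. Is the URL blocked by robots.txt for Googlebot?
    parser = urllib.robotparser.RobotFileParser()
    parser.set_url(urljoin(root, "/robots.txt"))
    parser.read()
    print("Allowed by robots.txt:", parser.can_fetch("Googlebot", page_url))

    response = requests.get(page_url, timeout=10)

    # 2. Does the X-Robots-Tag HTTP header carry a noindex directive?
    x_robots = response.headers.get("X-Robots-Tag", "")
    print("X-Robots-Tag header:", x_robots or "(none)")

    # 3. Does the robots meta tag carry a noindex directive?
    soup = BeautifulSoup(response.text, "html.parser")
    meta = soup.find("meta", attrs={"name": "robots"})
    content = meta.get("content", "") if meta else ""
    print("Robots meta tag:", content or "(none)")

    if "noindex" in (x_robots + " " + content).lower():
        print("Warning: a noindex directive is present.")

if __name__ == "__main__":
    check_indexability(PAGE_URL)
```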
Sometimes Google ranks a brand higher when it first comes out because it's a chicken-and-egg situation: how else can they collect data for their machine to chew on unless some traffic is sent to a new site? We used to call this phenomenon "the Google sandbox" a long time ago, and in its effect it is essentially the same thing. We do it ourselves with A/B testing and paid advertising: you have to spend some budget to gather enough data to know what's working and what isn't.
I don't think you have a technical SEO problem here. I think you need to continue building a brand and producing useful, rich content. Good luck!
-
Hi Ross,
Thank you for all the recommendations. I will get those things done and report back here with the results.
Currently I have the Yoast SEO plugin and have already generated and uploaded to GSC a sitemap containing only the pages we want ranked, but I will make sure there is a link to it in the footer as well.
I also set up 301 redirects on all of these "attachment" pages, but I will change them to return a 410 instead.
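Once that change is in place, a quick check like the one below can confirm that the attachment pages actually return 410 Gone rather than a redirect or a 404 (Python with requests; the URLs are placeholders for the real attachment pages):

```python
import requests

# Hypothetical attachment-page URLs; replace with the real ones being retired.
RETIRED_URLS = [
    "https://kv16.dk/example-image-attachment/",
    "https://kv16.dk/another-image-attachment/",
]

for url in RETIRED_URLS:
    # Don't follow redirects: we want the status code of the URL itself.
    response = requests.get(url, timeout=10, allow_redirects=False)
    verdict = "OK (gone)" if response.status_code == 410 else "check configuration"
    print(f"{url}: HTTP {response.status_code} -> {verdict}")
```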
-
Hi Christian,
Try updating your home page by adding additional content or rewriting the existing content, and do not forget to request reindexing of the home page after the update. In addition, you should install the Rank Math SEO plugin and regenerate the sitemap; once you have the new sitemap, resubmit it to GSC. Keep only the page URLs in the sitemap, with no images. If you want to remove some of your pages from the index, set those pages to return 410 Gone instead of a 404 status. Also, I do not see a link to your sitemap on your home page; you should add a link to the sitemap in your footer.
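An SEO plugin such as Rank Math or Yoast will generate the sitemap automatically, but for illustration, here is roughly what the pared-down sitemap boils down to when built by hand with only the Python standard library. The landing-page slug is a placeholder, since the real URL isn't given in the thread:

```python
from datetime import date
from xml.etree import ElementTree as ET

# Only the pages we actually want ranked; the landing-page slug is a placeholder.
PAGE_URLS = [
    "https://kv16.dk/",
    "https://kv16.dk/landing-page/",
]

def build_sitemap(urls: list[str], filename: str = "sitemap.xml") -> None:
    """Write a minimal sitemap.xml containing only the given page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
        ET.SubElement(entry, "lastmod").text = date.today().isoformat()
    ET.ElementTree(urlset).write(filename, encoding="utf-8", xml_declaration=True)

if __name__ == "__main__":
    build_sitemap(PAGE_URLS)
```

Whatever generates the file, the point is the same: only the pages you want ranked should be listed, with no attachment or image URLs.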
Ross