Indexed Link Removal request in GWT, good idea?
-
Hello,
I used a plugin to noindex a lot of pages on my website, and it's been a couple of months and they never disappeared from the SERPs, so I used the Google Webmaster Tools "Remove URLs" feature to ask Google to remove them. Is that a good idea? Or does it look bad in Google's eyes? Any thoughts would help a lot.
Thanks
-
Hello Vlad,
You are using the tool correctly - as Mat said, this is what the tool is for.
There seems to be something of a consensus that removing an entire link profile (due to extended poor linking practices) is a red flag and might be assessed negatively. That said, if you are removing a couple of bad pages from your profile in order to provide a better experience (or as a response to negative SEO), this is in line with best practices and will not be penalized. If you have already gone to the trouble of noindexing them, you should also have them removed, as they might hurt you in the future if they are not addressed.
Of course, you could stick with the noindex for now and see if it impacts your rankings negatively; if so, move to the removal. If not, there's no risk in doing nothing.
Cheers!
Rob
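For reference, a noindex plugin normally works by emitting one of the two signals below on each excluded page (the header variant is for server-side setups and non-HTML files; the exact markup here is illustrative, not taken from the thread):

```html
<!-- Signal 1: robots meta tag in the page's <head> -->
<meta name="robots" content="noindex">

<!-- Signal 2: equivalent HTTP response header, sent by the server -->
<!-- X-Robots-Tag: noindex -->
```

One caveat worth checking: Google must recrawl a page before either signal takes effect, and if the same pages are also blocked in robots.txt, Google may never recrawl them and so never sees the noindex at all. That can explain why noindexed pages linger in the SERPs for months, and why the removal tool acts faster.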
-
If you have set them up for removal and no longer want them in the index then go for it. That is what the tool is for.
There was some suggestion from John Mueller a while back that a large number of removals could be seen as an issue, but that appears to be the exception.
Related Questions
-
Why are my pages de-indexed?
Hello all, I am very new to SEO. For some reason many of the pages on my site were de-indexed, specifically the ones linked from this page: http://www.lawyerconnection.ca/practice-areas/car-accident-injury-lawyers/ However, the pages linked from these two pages were not de-indexed: http://www.lawyerconnection.ca/practice-areas/slip-and-fall-lawyers/ and http://www.lawyerconnection.ca/podcastresources/ The first page itself was not de-indexed, just the pages it links to. It just happened today, so maybe I am jumping the gun, but I doubt it. When I enter one of the child pages into Google Webmaster Tools and press Fetch, it re-indexes. What could be the problem here? I had someone rewrite the content for every city, but I have a feeling there are fewer differences between the car accident pages. Do you think this is considered duplicate content? Am I making some other mistake I can't think of? Is it just a one-day blip (I doubt it)? Let me know, thanks.
On-Page Optimization | RafeTLouis
-
What to do with removed pages and 404 error
I recently removed about 600 'thin' pages from my site, which are now showing as 404 errors in WMT, as expected. As I understand it, I should just let these pages 404 and eventually they'll be dropped from the index. There are no inbound links pointing at them, so I don't need to 301 them. They keep appearing in WMT as 404s, though, so should I just 'mark as fixed' until they stop appearing? Is there any other action I need to take?
On-Page Optimization | SamCUK
-
Locating broken links on site?
Hey guys, I'm using Screaming Frog to help locate some broken links on a client's site and I've managed to pick up two. However, I can't seem to find whereabouts they're located on the site in order to fix them! Is there a way I can do this? Cheers!
On-Page Optimization | Webrevolve
-
Issues with Product Pages Getting Indexed in Google
I just started working here the other week, and one of the big issues is that a lot of the product pages are not getting indexed in Google. We have an xml.gz sitemap that was submitted a long time ago. My guess is it might be something to do with not enough content on the pages. Here are a few examples of pages that are not getting indexed in Google: http://www.rockymountainatvmc.com/p/43/-/439/716/-/33097/Alpinestars-Dual-Motorcycle-Gloves http://www.rockymountainatvmc.com/p/47/-/201/803/-/28948/Camelbak-Blowfish-2013 http://www.rockymountainatvmc.com/p/46/-/203/836/-/6996/MSR-Head-Case http://www.rockymountainatvmc.com/p/44/54/208/764/80/1220/Galfer-Brake-Pad-Sintered-Metal There are hundreds that are not indexed; just trying to figure out what we need to do! We are working on new content for them all, but we have over 5,000 products, so it will take a long time. We also have reviews on the pages and are looking at starting an on-page Q&A to help get more unique content.
On-Page Optimization | DoRM
-
Blocking Google seeing outbound links?
Apart from rewriting the outbound URL to look like a folder ('abc.co.uk/out/link1') and blocking the 'out' folder in the robots.txt file, along with nofollowing the links, is there anything else you can do?
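The setup described in the question can be sketched as follows (the '/out/' folder name comes from the question; everything else here is illustrative):

```
# robots.txt at the site root: keep crawlers out of the redirect folder
User-agent: *
Disallow: /out/
```

The rewritten links themselves (e.g. an anchor pointing at /out/link1, which then redirects server-side to the external destination) would additionally carry rel="nofollow" as a second layer, as the question describes.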
On-Page Optimization | activitysuper
-
Too Many Links Explode Upon Upgrade
We upgraded our CMS in September and then had an explosion of new errors appearing in SEOmoz. Most concerning is the 'too many links' warning. Our main site is www.thenorrisgroup.com. Many of the links on the page are set up as nofollows, and nothing else changed, so I don't understand why it's tracking differently all of a sudden. Any ideas?
On-Page Optimization | thenorrisgroup
-
Time taken for inclusion in index
After a page is crawled, how much time does it take to be included in Google's index? Immediately? After a few days?
On-Page Optimization | seoug_2005
-
Tag clouds: good for internal linking and increase of keyword relevant pages?
As Matt Cutts explained, tag clouds are OK if you're not engaged in keyword stuffing (http://www.youtube.com/watch?v=bYPX_ZmhLqg), i.e. if you're not putting in 500 tags. I'm currently creating tags for an online bookseller; just like Amazon, this e-commerce site has potentially a couple of million books. Tag clouds will be added to each book detail page in order to enrich each of these pages with relevant keywords, both for search engines and for users (who get a quick overview of the main topics of the book and can navigate the site to find other books associated with each tag). Each of these book-specific tag clouds will hold 50 tags max, typically in the range of 10-20. From an SEO perspective, my question is twofold: 1. Does the site benefit from these tag clouds by improving the internal linking structure? 2. Does the site benefit from creating lots of additional tag-specific pages (up to 200k different tags), or can these pages become a problem, as they don't contain much rich content as such but rather lists of books associated with each tag? Thanks in advance!
On-Page Optimization | semantopic