Getting rid of low quality
-
If I wanted to get rid of a batch of low-quality pages from the index, is the best practice to let them 404 and remove them from the sitemap files?
Thanks
-
Thanks, Wayne, I never thought about link juice flowing to those pages, I'll have to check that out before making a decision. All the pages I want to remove are in the same directory, so would adding the text below to robots.txt remove all the pages in that directory from the index?
User-agent: *
Disallow: /directory/
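As a quick sanity check before deploying a rule like this, it can be tested with Python's standard-library robots.txt parser (the example.com URLs below are placeholders). One caveat: Disallow blocks crawling, but pages already in the index may linger there until Google drops them.

```python
from urllib import robotparser

# Parse the proposed robots.txt rules directly (no network fetch needed)
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /directory/",
])

# Everything under /directory/ is blocked for all user agents...
print(rp.can_fetch("*", "https://example.com/directory/page.html"))  # False
# ...while the rest of the site remains crawlable.
print(rp.can_fetch("*", "https://example.com/other/page.html"))      # True
```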
-
Hi Peter,
Great question considering the latest Panda update. A lot of people have been scrambling to remove content that Google might deem "shallow" or of no value to users. We implemented a couple of practices to see which worked best with regard to removing content:
A: We simply added a robots.txt Disallow rule. This prevents Google from crawling the content (though note that blocking crawling does not, by itself, remove pages that are already indexed).
B: If you have the luxury of moving it to an entirely different domain, that could also be a choice. We found this to be the better of the two in terms of aesthetics. We simply didn't want to gunk up our site with a lot of "shallow" content. It also seemed that the engines responded better to this approach.
Your 404 is another option if you simply want to remove the pages from the indexes. However, I'd be sure to check that no link juice is flowing through those pages; if it is, a 301 redirect might be more appropriate. Depending on your intentions, each of the three could serve your purpose!
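If the 301 route makes sense, and assuming the site runs on Apache with mod_alias enabled (the paths below are placeholders, not from the original question), the redirects can go in .htaccess:

```apache
# Permanently redirect one retired page to its closest replacement
Redirect 301 /old-thin-page.html /better-page.html

# Or sweep an entire low-quality directory to a single category page
RedirectMatch 301 ^/directory/ /category/
```

Pointing each redirect at the most relevant surviving page, rather than everything at the homepage, is what lets any remaining link juice carry over.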
Let me know if I've confused you, or if you need additional opinion!
Best of luck
W
Related Questions
-
I'm struggling to understand (and fix) why I'm getting a 404 error. The URL includes "%5Bnull%20id=43484%5D", but I cannot find that string anywhere in the referring URL. Does anyone know why? Thanks
Can you help with how to fix this 404 error, please? There appears to be a redirect from one page to another; the referring page URL works, but it links to another URL with the code %5Bnull%20id=43484%5D at the end, which I'm struggling to find and fix. Thanks
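The %5B...%5D fragment is just URL percent-encoding; decoding it (here with Python's standard library) shows what the broken link actually contains, which can make the offending markup easier to search for in templates or the database:

```python
from urllib.parse import unquote

# The mystery fragment from the 404 URL
encoded = "%5Bnull%20id=43484%5D"

# Percent-decoding reveals the raw text: %5B = [, %20 = space, %5D = ]
decoded = unquote(encoded)
print(decoded)  # [null id=43484]
```

The decoded value looks like a shortcode or template tag that was never rendered, so grepping the site's templates for "null id=43484" may locate the broken link (that interpretation is a guess, not confirmed by the question).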
If I get spammy backlinks removed is it still necessary to disavow?
Now there are some conflicting beliefs here, and I want to know what you think. If I got a high-spam website to remove my backlink, is a disavow through Search Console still necessary? Keep in mind, if it helps even in the slightest to improve rankings, I'm for it!
Getting Errors on Server Connectivity?
Hi guys, I am getting massive crawl errors in Google Webmaster Tools, stating there are over 2,162 connect-timeout errors. Does anyone know where I can see exactly where the timeouts are coming from? I have browsed through my site and have not seen a connect timeout occur. Thanks, Cary
Help! Getting 5XX error
I keep getting a 5XX error and my site is obviously losing ranking. I asked the host, but nobody seems to know what is wrong. The site is www.monteverdetours.com. I know this is probably an obvious problem and easy to fix, but I don't know how to do it! Any comments will be greatly appreciated.
Do SEOmozers recommend sitemap.xml or not? I'm thoroughly confused now. The more I read, the more conflicted I get.
I realize I'm probably opening a can of worms, but here we go. Do you or do you not add a sitemap.xml to a clients site?
Why is the number of crawled pages so low?
Hi, my website is www.theprinterdepo.com and I have been on SEOmoz Pro for 2 months. When it started, it crawled 10,000 pages; then I modified robots.txt to disallow some specific parameters in the pages to be crawled. We have about 3,500 products, so the number of crawled pages should be close to that number. The last crawl shows only 1,700. What should I do?
404 Errors - How to get rid of them?
Hi, I am starting an SEO job on an academic site that has been completely redone. The SEOmoz crawl detected three 404 errors pointing to pages that cannot be found anywhere in either Joomla or on the server. What can I do to solve this? Thanks!!
How can I get unimportant pages out of Google?
Hi guys, I have a (newbie) question. Until recently I didn't have my robots.txt written properly, so Google indexed around 1,900 pages of my site, but only 380 pages are real pages; the rest are all /tag/ or /comment/ pages from my blog. I have now set up the sitemap and the robots.txt properly, but how can I get the other pages out of Google? Is there a trick, or will it just take a little time for Google to take the pages out? Thanks! Ramon
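One common approach (a sketch, not specific to this blog's setup): rather than blocking the /tag/ and /comment/ pages in robots.txt, let Google keep crawling them and serve a noindex directive, since Google has to be able to recrawl a page to see the tag and drop it from the index:

```html
<!-- On each /tag/ and /comment/ page: allow crawling, but ask engines not to index -->
<meta name="robots" content="noindex, follow">
```

Once the pages have dropped out of the index, the robots.txt Disallow can be restored to save crawl budget.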