Deleting 30,000 pages all at once - good idea or bad idea?
-
We have 30,000 pages that we want to get rid of. Each product in our database has its own page, and these particular 30,000 products are no longer relevant. They have very little content and are essentially the same exact page apart from a few title changes.
We no longer want them weighing down our database so we are going to delete them.
My question is: should we get rid of them in smaller batches, say 2,000 pages at a time, or is it better to get rid of all of them in one fell swoop? Which is least likely to raise a flag with Google? Does anyone have experience with this?
-
Hi
This happened to a company I was working with recently. They deleted thousands of pages without notifying the SEO team, which led to a lot of work in Webmaster Tools and a lot of digging around to figure out where we had lost links. If you've got a load of inbound links pointing at these pages and nothing has been done (301s, etc.), then watch your Domain Authority fall, as happened to this company. Ouch. This is what happens when Tech doesn't "like" Marketing.
-
To make it look organic to Google, I would say it depends on how large the site is. If your site only has another 30,000 pages besides these, it would look really odd for half the site to be removed at once. To be safe, do a batch of, say, 10,000 and let Google reindex before doing the next batch. If it seems like you've been penalized, do a smaller batch the next time.
Then again, if you are already being penalized and are desperate for a change, already falling out of control into the black hole of the Google SERPs, then do them all at once. After all, you're already doing poorly; any corrective action would be better than nothing.
Just my thoughts; there are probably better experts here than me.
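If you do go the batch route, the mechanics are simple. A minimal sketch (the `example.com` URLs and batch size are hypothetical, and assume you can export the full URL list from your database):

```python
def chunk(urls, batch_size):
    """Split a flat list of URLs into fixed-size batches for staged removal."""
    return [urls[i:i + batch_size] for i in range(0, len(urls), batch_size)]

# Hypothetical example: 30,000 product URLs, removed 10,000 at a time.
urls = [f"https://example.com/product/{i}" for i in range(30000)]
batches = chunk(urls, 10000)
print(len(batches))      # 3
print(len(batches[-1]))  # 10000
```

You would delete one batch, wait for Google to recrawl and reindex, check for any ranking fallout, then move on to the next.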
-
I would delete all of them at once. BAM!
-
Not a problem. Any advice on deleting all at once or deleting in bits and pieces?
-
In that case, I agree with EGOL. Just drop them all.
-
Sorry, I edited after you posted... I agree. If the 301s are more work than they're worth, there's no need to do them.
Good luck.
Good luck.
-
This is more of a theoretical question, but if we're not getting traffic to these pages and there isn't a way for people to reach them, do we need to 301?
Adding 301s will increase the work we'll need to do on the dev side, and I'm not sure it's worth the effort. Any ideas?
-
I would delete these pages as soon as possible, since you have determined that they are not of value. It is possible that all of these pages are dead weight on your site.
Chop chop.
-
Visitors can't actually get to these pages, so it wouldn't be an issue for them. We've also researched this and have seen no traffic to any of these pages for over two years, so we're not worried about it from a user-facing standpoint.
We're planning on 404ing them, because the likelihood of these pages having any backlinks is about as close to zero as possible. I'm just wondering if it's better to 404 them in batches or all at once.
-
I suppose the most important question is: what will be replacing them?
You don't necessarily want 30k 404 pages appearing overnight and causing issues for visitors. Personally, I'd go through all the pages to determine the most relevant alternative for each one, then 301 redirect the old pages to the new ones. That's a lot of work for 30,000 pages, but it's probably the best way to avoid inadvertently losing traffic, backlinks and positive link equity.
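Mapping 30,000 URLs by hand isn't realistic, but the redirect rules themselves can be generated. As a rough sketch (the file format, URL paths, and variable names here are all hypothetical), a script could turn a spreadsheet of old-URL/best-alternative pairs into an nginx-style `map` block of 301 targets:

```python
import csv
import io

def build_redirect_map(csv_text):
    """Turn 'old_url,new_url' CSV rows into an nginx-style map of 301 targets."""
    entries = [f"    {old} {new};" for old, new in csv.reader(io.StringIO(csv_text))]
    return "map $uri $redirect_target {\n" + "\n".join(entries) + "\n}"

# Hypothetical mapping: each retired product URL points at its closest category page.
sample = "/product/old-widget,/category/widgets\n/product/old-gadget,/category/gadgets"
print(build_redirect_map(sample))
```

The server would still need a rule that returns a 301 whenever the mapped target is non-empty; the point is just that the per-URL mapping can be generated from data rather than written by hand.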