Deleting 30,000 pages all at once - good idea or bad idea?
-
We have 30,000 pages that we want to get rid of. Each product in our database has its own page, and these particular 30,000 products are no longer relevant. They have very little content and are essentially the same page with only minor title changes.
We no longer want them weighing down our database, so we are going to delete them.
My question is: should we get rid of them in smaller batches, say 2,000 pages at a time, or is it better to get rid of all of them in one fell swoop? Which is less likely to raise a flag with Google? Does anyone have experience with this?
-
Hi
This happened to a company I was working with recently. They deleted thousands of pages without notifying the SEO team, which led to a lot of work in Webmaster Tools and a lot of digging around to figure out where we had lost links. If you have a load of inbound links pointing at these pages and nothing has been done (301 redirects, etc.), watch your Domain Authority fall, as happened to this company. Ouch. This is what happens when Tech doesn't "like" marketing.
-
To make it look organic to Google, I would say it depends on how large the site is. If your site only has another 30,000 pages, it would look really odd for half the site to be removed at once. To be safe, do a batch of, say, 10,000 and let Google re-index before doing the next batch. If it seems like you've been penalized, do a smaller batch the next time.
Then again, if you are already being penalized and are desperate for a change, already falling out of control into the black hole of the Google SERPs, then do them all at once. After all, you're already doing poorly; any corrective action would be better than nothing.
Just my thoughts. There are probably better experts here than me.
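The staged removal described above is easy to script. A minimal sketch in Python, assuming the stale URLs follow a simple pattern (the URL shape and batch size here are invented for illustration):

```python
# Split a list of stale URLs into fixed-size batches for staged removal.
def batches(urls, size):
    """Yield successive chunks of `size` URLs."""
    for i in range(0, len(urls), size):
        yield urls[i:i + size]

# Hypothetical example: 30,000 stale product URLs removed 10,000 at a time.
stale_urls = [f"/product/{n}" for n in range(30_000)]
waves = list(batches(stale_urls, 10_000))
print(len(waves))  # number of removal waves
```

Each wave could then be deleted and left alone until Google has re-crawled the site before moving on to the next.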
-
I would delete all of them at once. BAM!
-
Not a problem. Any advice on deleting all at once or deleting in bits and pieces?
-
In that case, I agree with EGOL. Just drop them all.
-
Sorry, I edited after you posted... I agree. If the 301s are more work than they're worth, there's no need to do them.
Good luck.
-
This is more of a theoretical question, but if we're not getting traffic to these pages and there isn't a way for people to get to them, do we need to 301?
Adding 301s will increase the work we'll need to do on the dev side, and I'm not sure it's worth the effort. Any ideas?
-
I would delete these pages as soon as possible, since you have determined that they are not of value. It is possible that all of these pages are dead weight on your site.
Chop chop.
-
Visitors can't actually get to these pages, so it wouldn't be an issue for them. We've also researched this and have had no traffic to any of these pages in over two years, so we're not worried about it from a user-facing standpoint.
We're planning on 404ing them because the likelihood of any backlinks is as close to zero as possible. I'm just wondering whether it's better to 404 them in batches or all at once.
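If the pages really have no backlinks and no traffic, the 404 rules themselves can be generated from the URL list rather than handled page by page. A minimal sketch, assuming an nginx server and invented paths:

```python
# Emit nginx location rules that return 404 for each retired URL.
def not_found_rules(paths):
    return "\n".join(f"location = {p} {{ return 404; }}" for p in paths)

rules = not_found_rules(["/product/1001", "/product/1002"])
print(rules)
```

The generated block can be included in the server config, so all 30,000 URLs go away in one deploy instead of 30,000 individual deletions at the application layer.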
-
I suppose the most important question is: what will be replacing them?
You don't necessarily want 30,000 404 pages appearing overnight and causing issues for visitors. Personally, I'd go through the pages to determine the most relevant alternative page for each, then 301 redirect the old pages to the new ones. That's a lot of work for 30,000 pages, but it's probably the best way to keep from inadvertently losing traffic, backlinks and positive link equity.
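Mapping old pages to their closest alternatives doesn't have to be done by hand for all 30,000. A minimal sketch of a category-based redirect map with a home-page fallback (the URL shapes and category data are hypothetical):

```python
# Build a 301 redirect map: each retired product URL points at its
# category page, or the home page when no category is known.
def redirect_map(old_urls, category_of):
    return {url: category_of.get(url, "/") for url in old_urls}

old = ["/product/widget-a", "/product/widget-b", "/product/orphan"]
cats = {
    "/product/widget-a": "/category/widgets",
    "/product/widget-b": "/category/widgets",
}
mapping = redirect_map(old, cats)
print(mapping)
```

The resulting map can then be turned into server-level 301 rules in one pass, which keeps the dev effort closer to one script than 30,000 manual redirects.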