Removing a large number of unnecessary pages from a site
-
Hi all,
I have a big problem with my website. It has a lot of pages, duplicate pages generated from various combinations of select filters, and because of all this duplicate content we were hit by a Panda update two years ago.
I don't want to add new content to all of these pages, about 3,000,000, because most of them are unnecessary. Google has indexed all 3,000,000 of them, and I want to redirect the pages I no longer need to the most important ones.
My question: is there any problem with how Google will see this change, given that only 5,000-6,000 relevant pages will remain afterwards?
-
I can't see this causing you problems. I've noindexed huge numbers of pages many times, mostly for sites with Panda issues, and in several cases we've seen big increases in traffic at a subsequent Panda refresh.
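If noindexing ends up being part of your approach, one way to apply it at scale is the X-Robots-Tag response header rather than editing millions of templates. Here's a minimal sketch, assuming a Flask app and assuming the low-value URLs share a recognisable path pattern (both are assumptions, adjust to your own stack):

```python
# Sketch: send "noindex" for low-value filter-combination URLs via the
# X-Robots-Tag header, so individual templates don't need to be touched.
from flask import Flask, request

app = Flask(__name__)

# Hypothetical path prefixes that identify the auto-generated filter pages.
LOW_VALUE_PREFIXES = ("/search/", "/filter/")

@app.after_request
def add_noindex_header(response):
    # Apply noindex only to the unwanted combinations; leave the
    # 5,000-6,000 relevant pages untouched.
    if request.path.startswith(LOW_VALUE_PREFIXES):
        response.headers["X-Robots-Tag"] = "noindex"
    return response
```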
-
No problem with doing this at all. I have also worked with a client who had a large number of pages; they came down from about 170,000 to just under 2,000, and the result was a good one. That, too, was to combat Panda.
However, when redirecting, take care not to just redirect every one of the 3m pages to the same place. If you can't find a good, relevant match for a 301, just let the page 404; it is much better to have a 404 than a poor redirect. I would also review your 404 page to make sure it is user friendly and helps people navigate to something better.
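As a rough illustration of that logic (just a sketch; the file names and the matching rule are assumptions, so plug in whatever relevance criteria make sense for your site), you could build the redirect map offline and let everything without a good target fall through to a 404:

```python
# Sketch: build a 301 map only for old URLs that have a genuinely relevant
# target; everything else is left out so it simply returns a 404.
import csv

def best_match(old_url, targets):
    """Return a relevant target URL, or None if nothing is a good fit.
    This is a naive prefix match - replace with your own relevance rules."""
    for target in targets:
        if old_url.startswith(target.rstrip("/") + "/"):
            return target
    return None

with open("all_old_urls.txt") as f:          # the ~3m indexed URLs
    old_urls = [line.strip() for line in f if line.strip()]

with open("keeper_urls.txt") as f:           # the 5,000-6,000 pages you keep
    keepers = [line.strip() for line in f if line.strip()]

with open("redirect_map.csv", "w", newline="") as out:
    writer = csv.writer(out)
    for url in old_urls:
        target = best_match(url, keepers)
        if target and target != url:
            writer.writerow([url, target])   # becomes a 301 rule
        # no row written = no redirect = the URL 404s, as intended
```

The point is simply that each 301 should earn its place; anything that can't be matched convincingly is better off returning a 404 (or 410).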
Best of luck.
-Andy
-
Ouch, Google will definitely notice. But if you have already taken a big hit on traffic, then this is probably a wise choice. What I would check first is how much traffic currently goes to the pages you plan to remove, so you know upfront how much traffic you could lose by taking them down.
With luck, though, the pages that remain will gain more authority as they pick up better links from across the domain.
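If it helps, here is a rough sketch of that pre-check, assuming you can export a landing-page traffic report as CSV (the file names and column headers are assumptions; match them to whatever your analytics tool actually exports):

```python
# Sketch: estimate how much traffic currently lands on the pages you
# plan to remove, before pulling the trigger.
import csv

with open("keeper_urls.txt") as f:
    keepers = {line.strip() for line in f if line.strip()}

total_sessions = 0
at_risk_sessions = 0

with open("landing_pages_export.csv", newline="") as f:
    for row in csv.DictReader(f):  # expects "landing_page" and "sessions" columns
        sessions = int(row["sessions"])
        total_sessions += sessions
        if row["landing_page"] not in keepers:
            at_risk_sessions += sessions

print(f"{at_risk_sessions} of {total_sessions} sessions "
      f"({at_risk_sessions / max(total_sessions, 1):.1%}) land on pages slated for removal")
```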