Impact of simplifying website and removing 80% of site's content
-
We're thinking of simplifying our website, which has grown to a very large size, by removing all the content that hardly ever gets visited.
The plan is to remove this content and make the changes over time in small chunks so that we can monitor the impact on SEO. My gut feeling is that this is okay as long as we redirect the old pages and make sure that the pages we remove aren't getting any traffic. From my research online, it seems that more content is not necessarily a good thing if that content is ineffective, and that simplifying a site can improve conversions and usability.
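To make the "hardly ever gets visited" call more than a gut feeling, a rough sketch like the one below could be used to flag candidates: it reads a pageview export from an analytics tool and lists URLs that fall under a traffic threshold. The file name, column names, and threshold are placeholders rather than anything tied to a specific analytics product.

```python
import csv

# Placeholder threshold: pageviews over the reporting period below which
# a page becomes a removal candidate. Tune to your own traffic levels.
THRESHOLD = 10

def low_traffic_pages(export_path, threshold=THRESHOLD):
    """Return URLs whose recorded pageviews fall below the threshold.

    Assumes a CSV export with 'page' and 'pageviews' columns; adjust the
    column names to whatever your analytics tool actually produces.
    """
    candidates = []
    with open(export_path, newline="") as f:
        for row in csv.DictReader(f):
            if int(row["pageviews"]) < threshold:
                candidates.append(row["page"])
    return candidates

if __name__ == "__main__":
    for url in low_traffic_pages("analytics_export.csv"):
        print(url)
```

Anything a check like this flags would still be reviewed by hand before it is removed or redirected.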
Could I get people's thoughts on this please? Are there any risks that we should look out for, or any alternatives to this approach? At the moment I'm struggling to combine the needs of SEO with making the website more effective.
-
I have to agree with you on making this move. Content that doesn't contribute to the quality of your site and receives minimal traffic should be removed. Besides ensuring the redirects are set properly, you can evaluate whether this old content would actually make good material for future writing. It would be a waste to just delete it without any second thoughts. Some snippets of this old content can still prove useful and be spun into new articles once you elaborate on them.
-
Great answers guys - thanks. It's good to know that my gut feeling was close to the mark!
-
Quality over quantity is definitely the order of the day, but before you drop any content completely, take a look at it and see if there is useful information in it that could be consolidated into the content you are actually retaining. Overall, though, a proper content audit is worthwhile even if it means dropping some pages. Here's a useful article regarding content audits which is well worth taking a look at.
-
Sounds like a good idea to me. Make sure all the redirects are in place, so that when people try to visit the old content they're redirected to the new content. Also monitor the rest of your site's SEO traffic to make sure you don't fall into a hidden trap.
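If it helps, here is a rough sketch of how the redirects could be spot-checked once they're live. The old-to-new URL mapping is a made-up example, and it assumes the server answers HEAD requests and issues permanent (301/308) redirects; swap in your real URLs.

```python
import requests

# Hypothetical mapping of removed URLs to their replacements.
REDIRECT_MAP = {
    "https://www.example.com/old-page": "https://www.example.com/new-page",
}

def check_redirects(mapping):
    """Verify each old URL permanently redirects to the expected destination."""
    for old_url, expected in mapping.items():
        resp = requests.head(old_url, allow_redirects=False, timeout=10)
        location = resp.headers.get("Location", "")
        # Loose comparison: ignores a trailing slash, nothing more.
        if resp.status_code in (301, 308) and location.rstrip("/") == expected.rstrip("/"):
            print(f"OK     {old_url} -> {location}")
        else:
            print(f"CHECK  {old_url}: status {resp.status_code}, Location: {location or 'missing'}")

if __name__ == "__main__":
    check_redirects(REDIRECT_MAP)
```

Running something like this after each batch of removals gives an early warning if a redirect is missing or pointing at the wrong page.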
-
I think this pruning process makes sense. Although it will potentially reduce the keywords you cover, it will streamline the navigation around the content that is actually getting traffic. This should provide a better flow and potentially a lower bounce rate. Staging these cuts and monitoring the changes seems like a good way to manage your risk.
Related Questions
-
Removing indexed internal search pages from Google when they're driving lots of traffic?
Hi, I'm working on an e-commerce site and the internal search results page is our 3rd most popular landing page. I've also seen that Google has often used this page as a "Google-selected canonical" in Search Console on a few pages, and it has thousands of these search pages indexed. Hoping you can help with the below:
1 - To remove these results, is it as simple as adding "noindex/follow" to search pages? (See the spot-check sketch below.)
2 - Should I do it incrementally? There are parameters (brand, colour, size, etc.) in the indexed results, and maybe I should block each one of them over time.
3 - Will there be an initial negative impact on results that I should warn others about?
Thanks!
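As a rough way to check point 1, a sketch along these lines could fetch a sample of the indexed search URLs and report whether a noindex directive is already being served, either in an X-Robots-Tag header or a meta robots tag. The sample URL is a made-up placeholder, and the meta-tag check is a simplified regular expression rather than a full HTML parse.

```python
import re
import requests

# Placeholder sample of indexed internal search URLs to spot-check.
SEARCH_URLS = [
    "https://www.example.com/search?q=widgets&colour=blue",
]

# Simplified check: assumes the name attribute appears before content in the tag.
META_ROBOTS = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)["\']',
    re.IGNORECASE,
)

def robots_directives(url):
    """Return robots directives found in the X-Robots-Tag header or meta robots tag."""
    resp = requests.get(url, timeout=10)
    directives = []
    header = resp.headers.get("X-Robots-Tag")
    if header:
        directives.append(f"header: {header}")
    match = META_ROBOTS.search(resp.text)
    if match:
        directives.append(f"meta: {match.group(1)}")
    return directives or ["no robots directives found"]

if __name__ == "__main__":
    for url in SEARCH_URLS:
        print(url, "->", "; ".join(robots_directives(url)))
```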
Intermediate & Advanced SEO | Frankie-BTDublin
-
Different URL of some other site is shown by Google in cached copy of our site's page
Hi, when I check the cached copy of a URL on my site, http://goo.gl/BZw2Zz, the URL in the cached copy shown by Google is of some other third-party site. Why is Google showing a third-party URL in our site's cached copy? Have any of you guys faced such an issue? Regards,
Intermediate & Advanced SEO | vivekrathore
-
Keyword research when the site's subject is low volume
Hey guys, what do you do when you're planning a new website and doing keyword research for a site where the average search volumes are relatively low? We set up and run contact centres for UK charities, including voice, webchat, SMS, email, response fulfilment, etc. It seems that people aren't really searching that often for this 'sexy subject'. Average volumes for searches with some intent/qualifier range between 10-100 monthly searches. What sort of strategies would you adopt in this scenario? Do you optimise for what you can and then put a larger focus on other digital marketing tactics such as content marketing, social media, and email marketing? Thanks for your time, guys. Leo
Intermediate & Advanced SEO | Leo_Woodhead
-
User generated content (Comments) - What impact do they have?
Hello MOZ stars! I have a question regarding user comments on article pages. I know that user-generated content is good for SEO, but how much impact does it really have? For your information:
1 - All comments appear in the source code and are crawled by spiders.
2 - A visitor can comment on a page for up to 60 days.
3 - The number of comments depends on the topic; we usually get between 3-40 comments.
My questions:
1 - If we were to remove comments completely, what impact would it have from an SEO perspective? (I know you can't be certain, but please make an educated guess if possible.)
2 - If it has a negative and/or positive impact, please specify why! 🙂
If anything is unclear or you want certain information, don't hesitate to ask and I'll try to specify. Best regards,
Danne
Intermediate & Advanced SEO | idg-sweden
-
Where is the best place to put a sitemap for a site with local content?
I have a simple site that has cities as subdirectories (so the URL is root/cityname). All of my content is localized for the city. My "root" page simply links to other cities. I very specifically want to rank for "topic" pages for each city, and I'm trying to figure out where to put the sitemap so Google crawls everything most efficiently. I'm debating the following options; which one is better?
1 - Put the sitemap on the footer of "root" and link to all popular pages across cities. The advantage here is obviously that the links are one less click away from root.
2 - Put the sitemap on the footer of "city root" (e.g. root/cityname) and include all topics for that city. This is how Yelp does it. The advantage here is that the content is "localized", but the disadvantage is it's further away from the root.
3 - Put the sitemap on the footer of "city root" and include all topics across all cities. That way, wherever Google comes into the site, they'll be close to all the topics I want to rank for.
Thoughts? Thanks!
Intermediate & Advanced SEO | jcgoodrich
-
Two sites with same content in different countries. How does it affect SEO?
Let's say, for example, that we have two sites, example.com and example.co.uk. The sites have the same content in the same language. Can each site rank well in its own country? Of course, all the content could be rewritten, but that is very time-consuming. Any suggestions? Has anyone done this before, or does anyone know a site which has?
Intermediate & Advanced SEO | fredrikahlen
-
How to get around Google Removal tool not removing redirected and 404 pages? Or if you don't know the anchor text?
Hello! I can't get squat for an answer in the GWT forums. I should have brought this problem here first… The Google Removal Tool doesn't work when the original page you're trying to get recached redirects to another site. Google still reads the site as being okay, so there is no way for me to get the cache reset, since I don't know what text was previously on the page. For example, this:
http://0creditbalancetransfer.com/article375451_influencial_search_results_for_.htm
redirects to this:
http://abacusmortgageloans.com/GuaranteedPersonaLoanCKBK.htm?hop=duc01996
I don't even know what was on the first page. And when it redirects, I have no way of telling Google to recache the page. It's almost as if the site got deindexed and they put in a redirect. Then there is crap like this:
http://aniga.x90x.net/index.php?q=Recuperacion+Discos+Fujitsu+www.articulo.org/articulo/182/recuperacion_de_disco_duro_recuperar_datos_discos_duros_ii.html
No links to my site are on there, yet Google's indexed links say that the page is linking to me. It isn't, but because I don't know HOW the page changed text-wise, I can't get the page recached. The tool also doesn't work when a page 404s: Google still reads the page as being active, but it isn't. What are my options? I literally have hundreds of such URLs. Thanks!
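A rough sketch of one way to audit the reported referrers in bulk (the domain and the URL list are placeholders): it records each page's final URL after any redirects, its status code, and whether the domain still appears anywhere in the response body, which helps separate the redirect cases from pages that simply no longer link out.

```python
import requests

MY_DOMAIN = "example.com"  # placeholder; replace with your own domain

# Placeholder list of referring URLs reported in Google's link data.
REFERRING_URLS = [
    "http://www.example.org/some-old-article.htm",
]

def audit_referrers(urls, domain):
    """Report each referrer's final URL, status code, and whether it still mentions the domain."""
    for url in urls:
        try:
            resp = requests.get(url, timeout=10, allow_redirects=True)
        except requests.RequestException as exc:
            print(f"{url}: request failed ({exc})")
            continue
        mentions = domain in resp.text
        print(url)
        print(f"  final URL: {resp.url}")
        print(f"  status: {resp.status_code}")
        print(f"  mentions {domain}: {mentions}")

if __name__ == "__main__":
    audit_referrers(REFERRING_URLS, MY_DOMAIN)
```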
Intermediate & Advanced SEO | SeanGodier
-
Don't want to lose page rank, what's the best way to restructure a url other than a 301 redirect?
I'm currently in the process of redesigning a site. What I want to know is: what is the best way for me to restructure a URL without it losing its value (PageRank), other than a 301 redirect?
Intermediate & Advanced SEO | marig