Need help with best practices on eliminating old thin content blogs.
-
We have about 100 really old blog posts that are nothing more than short trip reviews with images. Consequently, these pages are poor quality. Would best practice be to combine them into one "review page" per trip, reducing from 100 to about 10 better pages, and implement redirects? Or is having more pages better, with fewer redirects? We only have about 700 pages total. Thanks for any input!
-
If you have thin content pages on your website, we recommend replacing them with high-quality, well-written content instead.
After doing this for our company, which sells garden offices in Bath, we rewrote all of the pages and added high-quality, white-hat, evergreen content marketing. This significantly increased our organic visibility in Bath and, as a result, we've sold many more garden rooms in the area.
-
Thank you Paddy! That helps a lot...
-
Hi there,
I'd say that your first solution is likely to be the best one. It's far better to have a small number of high-quality pages than a large number of low or questionable quality pages. I'd also recommend thinking about whether it makes sense to combine the content from the user's point of view. If the trip reviews would all make sense on one page and add value, that's another reason to consolidate. If they aren't really relevant to each other, you may want to be very selective about which ones you combine, and possibly add some new, fresh content to the pages at the same time.
Also think about keyword targeting for these pages: review how much traffic they already get, and use Google Search Console to understand which keywords drive it. Whilst the pages may be low quality, if they drive some decent traffic, you may not want to lose that. So if the keywords that are sending traffic are the right ones for you, try to carry them over to the new, consolidated pages where you can.
Hope that helps!
Paddy
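For the mechanical side of Paddy's suggestion, the consolidation is usually implemented as one 301 per old post, all pointing at the page that absorbed it. A minimal sketch in Python that generates Apache-style rules from an old-to-new mapping (all paths here are hypothetical examples, not your real URLs):

```python
# Sketch: map ~100 old trip-review posts onto ~10 consolidated review pages,
# then emit one 301 rule per removed URL. Paths are invented for illustration.

OLD_TO_NEW = {
    "/blog/2011/rome-day-1/": "/reviews/rome-trip/",
    "/blog/2011/rome-day-2/": "/reviews/rome-trip/",
    "/blog/2012/paris-photos/": "/reviews/paris-trip/",
}

def redirect_rules(mapping):
    """Emit Apache-style 'Redirect 301 old new' lines, one per removed page."""
    return [f"Redirect 301 {old} {new}" for old, new in sorted(mapping.items())]

for rule in redirect_rules(OLD_TO_NEW):
    print(rule)
```

Mapping many-to-one like this keeps each old URL pointing at the single page that replaced it, rather than dumping everything on the homepage.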
-
First, you need to review the entire content set and check for keyword cannibalization. Consolidate articles with similar keywords and content, then organize them sensibly into categories and hub pages (for example, my own blog is split up this way). You will have to spend a lot of time researching keyword cannibalization thoroughly.
Hope that helps!
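As a rough way to find cannibalization candidates, you can group a Search Console performance export by query and flag queries where more than one URL earns clicks. A sketch assuming rows of (query, page, clicks) — the data below is invented for illustration:

```python
from collections import defaultdict

# Hypothetical rows, as from a Search Console performance export:
# (query, page, clicks)
ROWS = [
    ("garden offices", "/blog/post-a/", 40),
    ("garden offices", "/blog/post-b/", 12),
    ("garden rooms", "/garden-rooms/", 90),
]

def cannibalized_queries(rows):
    """Return {query: {page: clicks}} for queries where 2+ pages get clicks."""
    by_query = defaultdict(dict)
    for query, page, clicks in rows:
        by_query[query][page] = by_query[query].get(page, 0) + clicks
    return {q: pages for q, pages in by_query.items() if len(pages) > 1}

print(cannibalized_queries(ROWS))
```

Queries flagged this way are candidates for consolidation into a single hub page; queries served by one URL are left alone.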
Related Questions
-
Best Practice Approaches to Canonicals vs. Indexing in Google Sitemap vs. No Follow Tags
Hi There, I am working on the following website: https://wave.com.au/ I have become aware that there are different pages competing for the same keywords. For example, I just started to update a core category page - Anaesthetics (https://wave.com.au/job-specialties/anaesthetics/) - to focus mainly on the keyword ‘Anaesthetist Jobs’. But I have recognized that there are existing landing pages that contain pretty similar content: https://wave.com.au/anaesthetists/ https://wave.com.au/asa/ We want to direct organic traffic to our core pages, e.g. https://wave.com.au/job-specialties/anaesthetics/. This leads me to have to deal with the duplicate pages, either with a canonical link (content manageable) or, alternatively, by adding a no-follow tag or updating the robots.txt. Our resident developer also suggested it might be good to use the sitemap to tell Google that these pages are of less value. What is the best approach? Should I add a canonical link to the landing pages pointing to the category page? Should I use the sitemap? Or even another approach? Any advice would be greatly appreciated. Thanks!
Intermediate & Advanced SEO | Wavelength_International
-
Best practice to 301 NON-WWW pages?
Hi Guys, Have a site which has 302 redirects installed for pages like: https://domain.com.au/ to https://www.domain.com.au/ (302 redirect) Is it worth changing the redirect to a 301? This is a large site, like 10,000 pages. Also does anyone know how can this be done via Magento? Cheers
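Assuming the site runs on Apache (as a stock Magento install typically does), the 301 is usually a rewrite rule near the top of the root .htaccess. A sketch using the hostname from the question:

```apache
# Sketch: permanently (301) redirect the bare domain to the www host.
# Assumes Apache with mod_rewrite enabled, as in a typical Magento .htaccess.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^domain\.com\.au$ [NC]
RewriteRule ^(.*)$ https://www.domain.com.au/$1 [R=301,L]
```

Magento's base URL configuration should also point at the www host, so the application itself doesn't keep generating bare-domain links that bounce through the redirect.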
Intermediate & Advanced SEO | bridhard8
-
IFrames and Thin Content Worries
Hi everyone, I've read a lot about the impact of iFrames on SEO lately -- articles like http://www.visibilitymagazine.com/how-do-iframes-affect-your-seo/ for example. I understand that iFrames don't cause duplicate content or cloaked content issues, but what about thin content concerns? Here's my scenario: our partner marketing team would like to use an iFrame to pull content detailing how Partner A and my company collaborate from a portal the partners have access to. This would allow the partners to help manage their presence on our site directly. The end result would be that Partner A's portal content would be added to Partner A's page on our website via an iFrame. This would happen across at least 100 URLs. Currently we have traditional partner pages with unique HTML content. There's a little standalone value for queries involving the bigger partners' names + use case terms, but only in less than 10% of cases. So I'm concerned about those pages, but I'm more worried about the domain overall. My main concern is that in the eyes of Google I'd be stripping a lot of content off the domain all at once, and then replacing it with shell pages containing nothing (in terms of SEO) but meta, a headline, navigation links, and an iFrame. If that's the case, would Google view those URLs as having thin content? And could that potentially impact the whole domain negatively? Or would Google understand that the pages don't have content because of the iFrames and give us a pass? Thoughts? Thanks, Andrew
Intermediate & Advanced SEO | SafeNet_Interactive_Marketing
-
Moving blog to a subdomain, how can I help it rank?
Hi all, We recently moved our blog to a sub-domain where it is hosted on Wordpress. It was very recent and we're actively working on the SEO, but any pointers on getting the subdomain to rank higher than the old blog posts would be terrific. Thanks!
Intermediate & Advanced SEO | DigitalMoz
-
Is it a good idea to remove old blogs?
So I have a site right now that isn't ranking well, and we are trying everything to help it out. One of my areas of concern is we have A LOT of old blogs that were not well written and honestly are not overly relevant. None of them rank for anything, and could be causing a lot of duplicate content issues. Our newer blogs are doing better and written in a more Q&A type format and it seems to be doing better. So my thought is basically wipe out all the blogs from 2010-2012 -- probably 450+ blog posts. What do you guys think?
Intermediate & Advanced SEO | netviper
-
Redirecting thin content city pages to the state page, 404s or 301s?
I have a large number of thin content city-level pages (possibly 20,000+) that I recently removed from a site. Currently, I have it set up to send a 404 header when any of these removed city-level pages are accessed. But I'm not sending the visitor (or search engine) to a site-wide 404 page. Instead, I'm using PHP to redirect the visitor to the corresponding state-level page for that removed city-level page. Something like:

    if (this city page should be removed) {
        header("HTTP/1.0 404 Not Found");
        header("Location: http://example.com/state-level-page");
        exit();
    }

Is it problematic to send a 404 header and still redirect to a category-level page like this? By doing this, I'm sending any visitors to removed pages to the next most relevant page. Does it make more sense to 301 all the removed city-level pages to the state-level page? Also, these removed city-level pages collectively have little to no inbound links from other sites. I suspect that any inbound links to these removed pages are from low-quality scraper-type sites anyway. Thanks in advance!
Intermediate & Advanced SEO | rriot
-
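For comparison, the 301 alternative asked about here is the same Location target with a different status code, and it avoids the contradiction of a 404 status (which says "gone") paired with a Location header (which says "moved"). A small sketch of the lookup in Python, with made-up paths:

```python
# Sketch: for a removed city-level path, return the status and Location the
# server should send. A 301 tells crawlers the move is permanent, so any link
# equity consolidates on the state page. Paths are hypothetical examples.

CITY_TO_STATE = {
    "/cities/tx/austin/": "/states/tx/",
    "/cities/tx/dallas/": "/states/tx/",
    "/cities/ca/fresno/": "/states/ca/",
}

def response_for(path):
    """Return (status, location) for a removed city page, or None if it's live."""
    state_page = CITY_TO_STATE.get(path)
    if state_page is None:
        return None  # not a removed page; serve it normally
    return (301, state_page)
```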
Best way to handle traffic from links brought in from old domain.
I've seen many versions of answers to this question both in the forum and throughout the internet. However, none of them seem to specifically address this particular situation. Here goes: I work for a company that has a website (www.example.com) but has also operated under a few different names in the past. I discovered that a friend of the company was still holding onto one of the domains that belonged to an older version of the company (www.asample.com), and he was kind enough to transfer it into our account. My first reaction was to simply 301 redirect the older domain to the newer one. After I did this, I discovered that there were still quite a few active and very relevant links to that domain. Upon reporting this to the company owners, they were suddenly concerned that a customer might feel misdirected by clicking www.asample.com and having www.example.com pop up. So I constructed a single page on the old domain that explained that www.asample.com was now called www.example.com and provided a link. We recently did a little housecleaning and moved all of our online holdings "under one roof," so to speak, and when our rep was going over things with the owners, he exclaimed that this was a horrible idea, and that the domain should instead be given its own hosting account, with WordPress (or some other CMS) installed and a few pages of content about the companies/subject posted. So the question: which one of these is the most beneficial to the site and business currently operating (www.example.com)? I don't see a real problem with any of these answers, but I do see a potentially unneeded expense in the third solution if a simple 301 will bring about the most value. Anyone else dealt with a situation like this?
Intermediate & Advanced SEO | modulusman
-
Advice needed on how to handle alleged duplicate content and titles
Hi, I wonder if anyone can advise on something that's got me scratching my head. The following are examples of URLs which are deemed to have duplicate content and title tags. This causes around 8,000 errors, which (for the most part) are valid URLs because they provide different views on market data: e.g. #1 is the summary, while #2 is 'Holdings and sector weightings'. #3 is odd because it's crawling the anchored link; I didn't think hashes were crawled? I'd like some advice on how best to handle these because, really, they're just queries against a master URL, and I'd like to remove the noise around duplicate errors so that I can focus on some other true duplicate URL issues we have. Here are some example URLs on the same page which are deemed duplicates:
1) http://markets.ft.com/Research/Markets/Tearsheets/Summary?s=IVPM:LSE
2) http://markets.ft.com/Research/Markets/Tearsheets/Holdings-and-sectors-weighting?s=IVPM:LSE
3) http://markets.ft.com/Research/Markets/Tearsheets/Summary?s=IVPM:LSE&widgets=1
What's the best way to handle this?
Intermediate & Advanced SEO | SearchPM
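For what it's worth, parameter variants like #3 (the &widgets=1 view of the same summary) are the classic case for rel=canonical: the variant declares the base view as its canonical URL. A sketch of the markup, using the tearsheet URL from the question (exact placement depends on your templates):

```html
<!-- Sketch: placed in the <head> of the &widgets=1 variant, this tells
     crawlers that the base summary URL is the one to index. -->
<link rel="canonical"
      href="http://markets.ft.com/Research/Markets/Tearsheets/Summary?s=IVPM:LSE" />
```

Genuinely different views (like the holdings page) would instead keep their own self-referencing canonicals, since they carry distinct content.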