Need help with best practices on eliminating old thin content blogs.
-
We have about 100 really old blog posts that are nothing more than a short trip review with images, so these pages are poor quality. Would best practice be to combine them into one "review page" per trip, reducing from 100 to about 10 better pages, and implement redirects? Or is it better to have more pages and fewer redirects? We only have about 700 pages total. Thanks for any input!
-
If you have thin-content pages on your website, we recommend replacing them with high-quality, well-written content.
After doing this for our company, which sells garden offices in Bath, we rewrote all of the pages and added high-quality, white-hat, evergreen content. This significantly improved our organic rankings in Bath and, as a helpful result, brought in more enquiries for garden rooms in the area.
-
Also link the consolidated pages back up to their parent category. My site is structured similarly.
-
Thank you Paddy! That helps a lot...
-
Hi there,
I'd say that your first solution is likely to be the best one. It's far better to have a small number of high-quality pages than a large number of low or questionable-quality pages. I'd also recommend thinking about whether combining the content makes sense for the user. If the trip reviews would all make sense on one page and add value, that's another reason to consolidate. If they aren't really relevant to each other, you may want to be very selective about which ones you combine, and possibly add some new, fresh content to the pages at the same time.
Also think about keyword targeting for these pages: review how much traffic they already get, and use Google Search Console to understand which keywords drive that traffic. While the pages may be low quality, if they drive some decent traffic, you may not want to lose it. So if the keywords that are sending traffic are the right ones for you, try to carry them over to the new, consolidated pages where you can.
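To make the consolidation concrete, here is a minimal sketch (all URLs hypothetical) of a redirect map from old trip-review posts to their consolidated review pages, plus a helper that emits Apache-style 301 rules you could adapt for your server:

```python
# Hypothetical mapping of old thin trip-review posts to their consolidated
# review pages. Every old URL should answer with an HTTP 301 to its target.
REDIRECTS = {
    "/blog/rome-day-1": "/reviews/rome-trip",
    "/blog/rome-day-2": "/reviews/rome-trip",
    "/blog/paris-highlights": "/reviews/paris-trip",
}

def resolve(path):
    """Return the consolidated target for an old URL, or None if none exists."""
    return REDIRECTS.get(path)

def apache_rules(redirects):
    """Emit Apache-style 'Redirect 301' lines for an .htaccess file."""
    return [f"Redirect 301 {old} {new}" for old, new in sorted(redirects.items())]
```

Keeping the map many-to-few like this (several old posts pointing at one consolidated page) is the pattern that carries existing traffic over to the new pages.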
Hope that helps!
Paddy
-
First, review all of the content and look for keyword cannibalization. Consolidate articles with similar keywords and content, then organize what remains into sensible categories and hub pages (my own blog is split up this way, for example).
You will need to spend time researching keyword cannibalization thoroughly.
Hope that helps!
Related Questions
-
What are the best practices for geo-targeting by sub-folders?
My domain currently targets the US, but I'm building out sub-folders that will geo-target France, England, and Spain. Each country will have its own sub-folder with professionally translated content (domain.com/france). Other than hreflang tags, what other best practices can I implement? Can Google Webmaster Tools geo-target by subfolder? Any suggestions would be appreciated. Thanks, Justin
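As a quick sketch of the hreflang side of this (the locale codes and subfolder names below are hypothetical examples), each localized page should reference every alternate version of itself, including its own. A small helper that emits the tags:

```python
# Sketch only: the domain, locale codes, and subfolder names are hypothetical.
LOCALES = {
    "en-us": "https://domain.com/",
    "fr-fr": "https://domain.com/france/",
    "en-gb": "https://domain.com/england/",
    "es-es": "https://domain.com/spain/",
}

def hreflang_tags(path=""):
    """Each localized page must list every alternate version, itself included."""
    return [
        f'<link rel="alternate" hreflang="{lang}" href="{base}{path}" />'
        for lang, base in LOCALES.items()
    ]
```

The same full set of tags goes into the head of each localized version of the page; asymmetric hreflang annotations are ignored.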
Intermediate & Advanced SEO | Rhythm_Agency
-
What are the best practices with website redesign & redirects?
I have a website that is not very pretty but has great rankings. I want to redesign it, losing as few rankings as possible while still cleaning up the navigation. What are the best practices? Thanks in advance.
Intermediate & Advanced SEO | JHSpecialty
-
Best practice for H1 on site without H1 - Alternative methods?
I have recently set up a men's style blog. The site is made up of articles pulled in from a CMS, and I want to keep the design as clean as possible, with no text other than the articles. This makes it hard to get an H1 tag onto the page. Are there any solutions or alternatives that would be good for SEO? The site is http://www.iamtheconnoisseur.com/ Thanks
Intermediate & Advanced SEO | SWD.Advertising
-
Just found a wordpress blog duplicating main website blog - what to do?
Hello Mozzers, I am working on a website and found that the social media agency employed by the website owner was running a parallel WordPress blog that duplicates the content on the main website's blog. About 200 pages of this duplicate WordPress blog are indexed; the duplication is exact apart from around 60 non-blog-content pages (category and date pages, homepage, etc.). I am planning to 301 redirect the WordPress blog posts to the equivalent pages on the website blog, and then 301 redirect the homepage, category, and date pages to the website blog homepage, so all the blog pages redirect somewhere on the main website. _Does this make sense, or should I only worry about redirecting the blog content pages?_ Also, the main website is new and there are already redirects coming in to its pages from an old website. _Is there anything to be cautious about when redirecting to a main website from multiple old websites?_ Thanks in advance, Luke
Intermediate & Advanced SEO | McTaggart
-
Retail Store Detail Page and Local SEO Best Practices
We are working with a large retailer that has specific pages for each store they run, and we want to leverage the best practices that are out there for local search. Our current issue is URL design for the store pages themselves. Currently, store URLs look like /store/12584. The number is a GUID-like identifier that means nothing to search engines or, frankly, humans. Is there a better way to model this URL for increased relevancy in local retail search? For example, adding the store name:
www.domain.com/store/1st-and-denny-new-york-city/23421
(example: http://www.apple.com/retail/universityvillage/)
Or a fully explicit URI:
www.domain.com/store/us/new-york/new-york-city/10027/bronx/23421
(example: http://www.patagonia.com/us/patagonia-san-diego-2185-san-elijo-avenue-cardiff-by-the-sea-california-92007?assetid=5172)
The idea with the second version is that a richer, more detailed URL structure might help for local search. Would there be a best practice or recommendation for how we should model this URL? We are also working on on-page optimization, but we're specifically interested in local SEO strategy and URL design.
Intermediate & Advanced SEO | mongillo
-
Moving some content to a new domain - best practices to avoid duplicate content?
Hi, we are setting up a new domain to focus on a specific product, and we want to move some of the content from the original domain to the new site, removing it from the original. The content is appropriate for the new domain, will be irrelevant to the original domain, and we want to avoid creating completely new content. There will be a link between the two domains. What is the best practice here to avoid duplicate content and a potential Panda penalty?
Intermediate & Advanced SEO | Citybase
-
Best practice for removing indexed internal search pages from Google?
Hi Mozzers, I know that it's best practice to block Google from indexing internal search pages, but what's best practice when "the damage is done"? I have a project where a substantial part of our visitors and income (about 3%) lands on internal search pages, because Google has indexed them. I would like to block Google from indexing the search pages via the meta noindex,follow tag because:
- Google's guidelines say: "Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines." http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35769
- They are a bad user experience
- The search pages are (probably) stealing rankings from our real landing pages
- We received the Webmaster notification "Googlebot found an extremely high number of URLs on your site" with links to our internal search results
I want to use the meta tag to keep the link juice flowing. Do you recommend using robots.txt instead? If yes, why? Should we just go dark on the internal search pages, or how should we proceed with blocking them? I'm looking forward to your answer! Edit: Google has currently indexed several million of our internal search pages.
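One detail worth making explicit: to deindex pages that are already indexed, the pages must stay crawlable so Googlebot can fetch them and see the noindex directive; disallowing them in robots.txt first would prevent that. A minimal sketch (function names are just for illustration) of the two equivalent forms of the directive:

```python
# Sketch: the pages must stay crawlable (NOT disallowed in robots.txt),
# otherwise Googlebot can never fetch them and see the noindex directive.
def robots_meta(is_internal_search):
    """Meta tag for the page head: noindex drops the page from the index,
    follow keeps passing link equity through its links."""
    if is_internal_search:
        return '<meta name="robots" content="noindex, follow">'
    return '<meta name="robots" content="index, follow">'

def x_robots_header(is_internal_search):
    """Equivalent HTTP response header, useful for non-HTML responses."""
    value = "noindex, follow" if is_internal_search else "index, follow"
    return ("X-Robots-Tag", value)
```

Once the pages have dropped out of the index, a robots.txt disallow can then be added to stop the crawl-budget waste going forward.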
Intermediate & Advanced SEO | HrThomsen
-
Duplicate content
Is manual intervention required for a site that has been flagged for duplicate content to get back to its original rankings once the duplicated content has been removed? Background: our site recently experienced a significant drop in traffic around the time that a chunk of content duplicated from other sites went live. While it was not an exact replica of the pages on other sites, there was quite a bit of overlap. That content has since been removed, but our traffic hasn't improved. What else can we do to improve our rankings?
Intermediate & Advanced SEO | jamesti