Transferring old articles to a new site even if they are written poorly
-
OK my question for today is...
If you currently have 300 articles on your site but are building a new one, would you transfer all of the articles over, or focus on quality and rewrite only the articles that get traffic? For example, we have about 300 articles on our website, 60 of which actually get traffic. We rewrote those 60 to make sure they were written well. Someone suggested it would be best to simply transfer the other 240 articles over and rewrite them later, to avoid 404 errors. I would like your feedback on how you would approach this. Please be as detailed as possible in explaining your thought process.
Thanks!
-
I would do the following:
-
Redirect every URL from the old site to the appropriate page on the new site.
-
Rewrite any content that is below standards prior to moving it over to the new site.
The old idea that more pages mean more internal PageRank went out the window years ago with the Panda update.
-
In the end, it will be best to have all of the pages redirect. You will have a higher page count, more internal links, better overall site authority, and more chances to rank for more queries.
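The "redirect every URL to the appropriate page" step above amounts to an exact-match lookup from old paths to new ones. A minimal sketch, with hypothetical paths:

```python
# Exact-match 301 redirect map (the paths here are made-up examples).
OLD_TO_NEW = {
    "/blog/old-article-title": "/articles/new-article-title",
    "/blog/another-old-post": "/articles/another-old-post",
}

def resolve(old_path):
    """Return (status, location) for a request to an old-site path."""
    new_path = OLD_TO_NEW.get(old_path)
    if new_path is not None:
        return 301, new_path   # permanent redirect to the new URL
    return 404, None           # unmapped path: would surface as a 404
```

The point of the exact match is that it forces you to account for every old URL explicitly; anything missing from the map shows up as a 404 you can fix.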
The first thing I would do is check how much traffic the poorly written articles get and whether it makes sense to rewrite them. What do you mean by "poorly written"? Is that a subjective opinion, or is there data and feedback showing the articles are bad? If an article has had zero visitors over a three-month span, I would dig into why.
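Finding the zero-traffic articles is easy to script once you export pageviews from analytics. A small sketch, with made-up numbers for illustration:

```python
# Sketch: flag articles with no visits over a period (hypothetical export).
pageviews = {
    "/articles/good-post": 412,
    "/articles/ok-post": 37,
    "/articles/dead-post": 0,
}

def zero_traffic(views, threshold=0):
    """Return paths whose pageview count is at or below the threshold."""
    return sorted(path for path, count in views.items() if count <= threshold)
```

Running `zero_traffic(pageviews)` on this data would surface only `/articles/dead-post` as a rewrite candidate.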
Second, look at why the better-written articles do so well. Is it the way they are written? The way the page is visually laid out (prettier is better)? Is the topic more heavily searched for? Did you share or market those posts differently than the others? Spend some time in analytics and webmaster tools analyzing the lower-quality pages, and you might discover a trend.
If you choose to redirect, make sure to test all the redirects when you are done, as Biron29 stated. The slightest variation in the URL will cause a redirect to fail. Screaming Frog SEO Spider is a great tool for finding 404 errors, and it's free up to a certain number of URLs.
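Testing the finished redirects can also be automated. This sketch takes a `fetch` function (for example, a thin wrapper around an HTTP HEAD request that does not follow redirects) so the checking logic itself is self-contained; the URL pairs are hypothetical:

```python
def check_redirects(pairs, fetch):
    """For each (old_url, expected_new_url) pair, verify the old URL
    returns a 301 pointing at the expected target. `fetch` must return
    (status_code, location_header) without following redirects.
    Returns a list of human-readable failure descriptions."""
    failures = []
    for old, expected in pairs:
        status, location = fetch(old)
        if status != 301:
            failures.append(f"{old}: expected 301, got {status}")
        elif location != expected:
            failures.append(f"{old}: redirects to {location}, not {expected}")
    return failures
```

With the `requests` library, `fetch` could call `requests.head(url, allow_redirects=False)` and return the status code plus the `Location` header; any HTTP client works.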
-
In my mind, the first step is a really solid 301 redirect strategy; Moz has a great article on redirects. I say this because the slightest change in a page's name or location can cause a 404 error. Second, I would focus on carrying over the most valuable content first. Third, I would evaluate the quality and relevance of the articles not receiving traffic and find out why, so you can take calculated steps to generate traffic for those articles as well. So, the long answer is yes: carry all the content over, use 301 redirects and canonical URLs, and adjust the content as needed to increase its value. Hope this helps!
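One way to keep a 301 strategy manageable is to maintain a single old-to-new mapping and generate the server rules from it. A sketch that emits Apache `Redirect 301` directives (mod_alias); the host and paths are hypothetical, and the same mapping could just as easily drive Nginx rules:

```python
def apache_redirect_lines(mapping, new_host):
    """Emit one Apache `Redirect 301` directive per old->new path pair."""
    return [
        f"Redirect 301 {old_path} https://{new_host}{new_path}"
        for old_path, new_path in sorted(mapping.items())
    ]
```

Generating the rules from one source of truth means the redirect file, your testing script, and your canonical URLs all stay in sync.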