Is this the best way to get rid of low quality content?
-
Hi there, after getting hit by the Panda bear (a 30% loss in traffic) I've been researching ways to get rid of low quality content. The best advice I could find was to use Google Analytics to find your worst-performing pages (go to Traffic Sources > Google Organic > view by Landing Page). Any page that hasn't been viewed more than 100 times in 18 months should be a candidate for deletion. Out of over 5,000 pages, this report identified over 3,000 low quality pages, which I've begun exporting to Excel for further examination.
However, starting with the worst pages (according to Analytics), I'm noticing some of our most popular pages showing up here. For example, /countries/Panama shows zero views, but the correct version (with the trailing slash), /countries/Panama/, shows over 600 views. I'm not sure how Google even found the former version of the URL, and I'm even less sure how to proceed now (the webmaster was going to put a nofollow on any crap pages, but this is making him nervous about the whole process).
Some advice on how to proceed from here would be fantastico, and danke!
-
Hi! I've asked another associate with more Panda experience than I have to come in and comment on this question.
Byork, knowing a little more about your trailing slash issue could help out. Do you have trailing slash redirects in place for all of your pages? Were they put in at a particular time, where you might be able to look at data from just after that date?
If the trailing slash redirects are in place correctly and always have been, and it's just a weird artifact of GA causing these pages to show up with 0 visits, can you ignore the pages without the trailing slash and focus on the metrics for the pages with it?
-
rel=canonical is better suited to cases where your URLs carry parameters you can't really do anything about. When you know one URL is being served but should be another, use a 301 redirect. So in this case, pick which URL you like better, either with or without the trailing slash, and redirect one to the other. Google treats these as two completely separate pages, which is why you're seeing views on one and not the other. If you can't configure the redirect, then you could fall back to rel=canonical.
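As a concrete sketch: if the site runs on Apache with mod_rewrite enabled (an assumption, since the original poster hasn't said what the server is) and the trailing-slash version is chosen as canonical, a rule like this in .htaccess would 301 the slashless URLs across:

```apache
RewriteEngine On
# If the request isn't an existing file, permanently redirect
# any path missing its trailing slash to the trailing-slash version.
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*[^/])$ /$1/ [L,R=301]
```

With that in place, /countries/Panama answers with a 301 to /countries/Panama/, and Google will consolidate the two entries over time. On nginx or another server the equivalent rewrite would look different, so treat this purely as an illustration of the approach.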
If you have pages with similar content but not a lot of views, then 301 redirecting each of those pages to another page with more views would be fine. That passes their PageRank along, and it's better for people who find the original URL later, because they'll land on an actual page instead of your 404 page.
-
Great question.
I'd appreciate a pro SEO's opinion on this, but here's what I'm doing on our website.
To rel=canonical or 301? That is the question for /countries/Panama vs. /countries/Panama/ and the other examples like that.
On the other pages, what about moving the best of the content from a low-view page to a similar, higher-view page, and then 301ing the old page to the better one?