Forum software penalties
-
I'm hoping to solicit some feedback on what people feel are SEO best practices for message board/forum software. Specifically, while healthy message boards can generate tons of unique content, they can also generate their fair share of thin content pages.
These pages include...
- Calendar pages, which can have a page for each day of each month for 10 years (that's roughly 3,650 pages of nothing but links).
- User profile pages, which, depending on your setup, can tend to be thin. The board I work with has 20k registered members, hence 20k user profile pages.
- User lists, which can run to several hundred pages.
I believe Google is pretty good at recognizing message board content, but there is still a good chance that one could be penalized for these harmless pages. Do people feel the above pages should be noindexed?
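For reference, the usual way to keep page types like these out of the index while still letting crawlers follow their links is a robots meta tag in the relevant templates (a sketch of a typical forum template change, not specific to any one forum package):

```html
<!-- In the <head> of the calendar, profile, and user-list templates: -->
<!-- "noindex" keeps the page out of search results; "follow" still lets -->
<!-- crawlers pass through the links on the page to the real content. -->
<meta name="robots" content="noindex, follow">
```

Note that blocking these URLs in robots.txt instead would stop Google from crawling them at all, so it would never see the noindex directive and already-indexed URLs could linger in results.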
Another issue is that of unrelated content. Many forums have their off-topic areas (the Pub or Hangout or whatever). On our forum up to 40% of the content is off-topic (by content I mean number of posts, not raw word count).
What are the advantages and disadvantages of such content? On one hand, it expands the keywords you can rank for. On the other hand, it may generate Google organic traffic you might not want because of its high bounce rate.
Does having too much indexable unique content dilute your good content?
-
If you do a bit of it on a well-established site with many highly trusted links, it may have a positive effect. If you do lots of it on a brand-new site, it may do more harm than good.
Related Questions
-
What happens when we delete all the outgoing links from a forum at once?
Hi all, our forum is filled with spammy content and external nofollow links. I am wondering if we can turn all these links from hyperlinks into plain text, which would technically delete all the external links. Will this have a negative or positive impact in Google's eyes? Please share your ideas. Thanks
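The conversion the poster describes could be sketched like this (a rough stdlib-only sketch; a real forum would do this in its templating or database layer, and regex-based HTML handling is only adequate for simple, well-formed markup):

```python
import re

def strip_links(html: str) -> str:
    """Replace each <a ...>text</a> with just its plain anchor text."""
    return re.sub(r'<a\b[^>]*>(.*?)</a>',
                  r'\1',
                  html,
                  flags=re.IGNORECASE | re.DOTALL)

sample = 'See <a href="https://spam.example" rel="nofollow">this site</a> for details.'
print(strip_links(sample))  # -> See this site for details.
```

Run in bulk over stored posts, this leaves the visible text intact while removing every outgoing hyperlink.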
Algorithm Updates | vtmoz
Should I move my forum to a subdomain?
My forum causes a lot of 403, 404, soft 404 and 522 errors. I worry about this dragging down the value of my domain and wonder if I should move it to a subdomain: forum.domain.com. I was forced to do this with a very similar site, and it seems not to have suffered any Google penalty (I implemented a 301 redirect from each page to its corresponding page on the subdomain).
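The per-page 301s mentioned in the parenthetical can usually be done with a single pattern rule rather than one redirect per page (a sketch assuming Apache with mod_rewrite, the hypothetical names domain.com / forum.domain.com, and a forum that previously lived under /forum/ on the main domain):

```apache
# In the main domain's vhost config or .htaccess:
RewriteEngine On
# Send every /forum/... URL to the same path on the subdomain,
# preserving the rest of the path in $1.
RewriteRule ^forum/(.*)$ https://forum.domain.com/$1 [R=301,L]
```

Each old URL then maps one-to-one onto its new subdomain equivalent, which is what lets the redirect pass signals page by page.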
Algorithm Updates | 321Chat
Old school SEO tools / software / websites
Hey Mozzers, I am doing some research and wonder if you can help me out. Before Moz, HubSpot, Majestic, Screaming Frog and all the other awesome SEO tools we use today, what tools / software / websites were used for aiding SEO? I guess we can add the recently closed Yahoo! Directory for starters! Thanks!
Algorithm Updates | RikkiD22
Ranking Software
Hello, can anyone recommend the best rank-tracking software - something that will track up to position 500 in the SERPs? Thanks
Algorithm Updates | webguru2014
Content Caching Memory & Removal of 301 Redirect for Relieving Links Penalty
Hi, A client site has a very poor link legacy stretching back over 5 years. I started the campaign a year ago, providing valuable, good-quality links. Link removals and a disavow file submitted to Google have been done, however after months and months of waiting nothing has happened. If anything, after the recent Penguin update, results have been further affected. A 301 redirect was undertaken last year, consequently associating those bad links with the new site structure. I have since removed the 301 redirect in an attempt to detach this legacy, however with little success. I have read up on this and not many people appear to agree on whether this will work. Therefore, my new decision is to start afresh using a new domain, switching from the .com to the .co.uk version, removing all legacy and all association with the spam-ridden .com. However, my main concern with this is whether Google will forever cache content from the spammy .com and remember it, because the content on the new .co.uk site will be exactly the same (content of great quality, receiving hundreds of visitors each month from the blog section alone). The problem is definitely link related and NOT content related, as I imagine people may first suspect. This could then cause duplicate content, given that this content pre-existed on another domain - I will implement a robots.txt file blocking all of the .com site, as well as a noindex, nofollow - and I understand you can submit a site removal request to Google within Webmaster Tools to help fast-track the deindexation of the spammy .com - then once it has been deindexed, the new .co.uk site will go live with the exact same content. So my question is whether Google will then completely forget that this content ever existed, allowing me to use exactly the same content on the new .co.uk domain without the threat of a duplicate content issue? Also, any insights or experience with the removal of a 301 redirect, detaching legacy, and its success would be very helpful!
Thank you, Denver
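One caveat on the deindexation plan above: a robots.txt Disallow stops Googlebot from crawling, so it would never see a noindex on those pages, and already-indexed URLs can linger. A sketch of the alternative, assuming the old .com runs on Apache with mod_headers enabled:

```apache
# Hypothetical config for the old .com vhost: send a noindex header
# site-wide so Google can still crawl the pages and drop them from
# the index. Leave robots.txt open until deindexation completes.
<IfModule mod_headers.c>
    Header set X-Robots-Tag "noindex, nofollow"
</IfModule>
```

Only after the old pages have fallen out of the index is it safe to block crawling entirely.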
Algorithm Updates | ProdoDigital
Google penalty for one keyword?
Is it possible to get penalized by Google for a specific keyword and essentially disappear from the SERPs for that keyword but keep position for the brand (#1) and some other keywords (#4 and #7)? And how would you find out that this is what happened if there is no GWT message?
Algorithm Updates | gfiedel
Could we run into issues with duplicate content penalties if we were to borrow product descriptions?
Hello, I work for an online retailer that has the opportunity to add a lot of SKUs to our site in a relatively short amount of time by borrowing content from another site (with their permission). There are a lot of positives for us to do this, but one big question we have is what the borrowed content will do to our search rankings (we normally write our own original content in house for a couple thousand SKUs). Organic search traffic brings in a significant chunk of our business and we definitely don't want to do something that would jeopardize our rankings. Could we run into issues with duplicate content penalties if we were to use the borrowed product descriptions? Is there a rule of thumb for what proportion of the site should be original content vs. duplicate content without running into issues with our search rankings? Thank you for your help!
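There is no published rule-of-thumb proportion, but you can at least measure how close a borrowed description is to its source before publishing. A minimal sketch using word shingles and Jaccard similarity (the texts and any threshold you pick are illustrative, not anything Google has published):

```python
def shingles(text: str, k: int = 3) -> set:
    """Break text into overlapping k-word shingles."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a: str, b: str, k: int = 3) -> float:
    """Jaccard similarity of the two texts' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

original = "lightweight waterproof jacket with taped seams and a packable hood"
rewritten = "packable rain jacket with taped seams and an adjustable hood"
print(round(similarity(original, rewritten), 2))  # -> 0.23
```

An unmodified copy scores 1.0; the more of each description you rewrite in-house, the lower the score against the donor site's version.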
Algorithm Updates | airnwater
Duplicate Content & www.3quarksdaily.com, why no penalty?
Does anyone have a theory as to why this site does not get hit with a DC penalty? The site is great, and the information is good but I just cannot understand the reason that this site does not get hit with a duplicate content penalty as all articles are posted elsewhere. Any theories would be greatly appreciated!
Algorithm Updates | KMack