Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
FORUM (user generated content) SEO best practices
-
Hello Moz folks!
For the very first time I'm dealing with a massive community that relies on UGC (user generated content). Their forum has a great deal of duplicate content, broken links, duplicate titles, and other on-site issues. I have advanced SEO knowledge related to ecommerce and blogging, but I'm new to forums and UGC.
I would really love to learn, or get links to resources, that would help me understand SEO best practices for forums. Any help is greatly appreciated.
Best,
Yan
-
Should logged-in users and anonymous users get the same behavior?
For the most part, yes, though it depends on the forum you are running. The important piece to understand is that whatever is hidden behind a login wall remains hidden to search engines. So you have to weigh that factor when deciding which content to display to everyone versus only to logged-in users.
How do you suggest handling canonicals in a UGC world?
Canonicalization isn't too hard to manage. Your forum software should include canonical URLs; if not, you will want them implemented in the template as soon as possible. The use of the rel=prev and rel=next tags is also highly recommended. This lets you keep the main forum thread as the canonical URL while telling Google that the subsequent pages are related to the main page and how they add value.
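As a sketch of the approach described above, the template-level markup for page 2 of a paginated thread might look like this (the URL scheme is hypothetical):

```html
<!-- On https://example.com/forum/thread-123?page=2 (hypothetical URLs) -->
<link rel="canonical" href="https://example.com/forum/thread-123" />
<link rel="prev" href="https://example.com/forum/thread-123" />
<link rel="next" href="https://example.com/forum/thread-123?page=3" />
```

Here every page of the thread points its canonical at the main thread URL, while rel=prev/next describe the pagination sequence.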
Do you have specific editorial guidelines enforced on UGC?
Again, that's up to you and your community. What works editorially for one forum may not be desirable for another (e.g. the use of profanity). As long as the content being added is of value, I consider it good content. With forums, you can be a lot looser with the guidelines and allow users to interact as they desire.
Don't let your forum become infested with spam or obvious self-promotional threads, and make sure all user-submitted links are nofollowed. Many forums restrict users' ability to post links, allowing it only once users have proven themselves. Link and spam management are very important for forums.
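For the nofollow piece, the anchor markup your template renders for user-submitted links might look like this (rel="ugc" is a newer attribute Google also recognizes for user generated content; the URL is made up):

```html
<!-- A user-submitted link, as rendered by the forum template -->
<a href="https://example.com/some-user-link" rel="nofollow ugc">anchor text</a>
```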
-
Thanks Ray-PP,
Are there any specifics? For example, should logged-in users and anonymous users get the same behavior? How do you suggest handling canonicals in a UGC world? Do you have specific editorial guidelines enforced on UGC? For example, should we noindex a post with a three-word question and an image?
Cheers,
Yan
-
Hello Yan,
Fortunately, on-site SEO for UGC is not very different from the on-site SEO of other forms of content. We can still apply the same best practices to forums and the UGC within them.
Duplicate content / on-page factors
- Make sure the forum is using proper canonicalization
- Use rel=prev/next for paginated threads
- Apply semantic SEO where appropriate
- Make sure all on-page SEO factors are optimized (title, headings, image optimization, etc.)
Broken links
- Use Moz or a tool like Screaming Frog SEO Spider to identify 404 pages. Redirect any important dead pages with a 301 to their nearest related pages (reserve the redirects for the important dead pages to conserve SEO authority).
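As a rough sketch (not from the thread): once a crawler export gives you lists of dead and live URLs, a first-draft redirect map can be built by string similarity and then reviewed by hand. The URLs below are made up for illustration:

```python
from difflib import SequenceMatcher

def build_redirect_map(dead_urls, live_urls):
    """Pair each dead URL with the most similar live URL.

    This is only a first draft: every suggested 301 target should be
    reviewed by a human before it goes into the redirect rules.
    """
    redirect_map = {}
    for dead in dead_urls:
        # Pick the live URL whose path most closely resembles the dead one
        best = max(live_urls,
                   key=lambda live: SequenceMatcher(None, dead, live).ratio())
        redirect_map[dead] = best
    return redirect_map

dead = ["/forum/inline-skate-buying-guide-2012"]
live = ["/forum/inline-skate-buying-guide", "/forum/hockey-skates"]
print(build_redirect_map(dead, live))
# {'/forum/inline-skate-buying-guide-2012': '/forum/inline-skate-buying-guide'}
```

This only drafts candidate targets; pages with no sensible related page are usually better left as 404s than redirected somewhere irrelevant.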
Are there more specific issues you are experiencing with the forum?