Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Forum (user-generated content) SEO best practices
-
Hello Moz folks!
For the very first time I'm dealing with a massive community that relies on UGC (user-generated content). Their forum has a great deal of duplicate content, broken links, duplicate titles, and other on-site issues. I have advanced SEO knowledge related to ecommerce and blogging, but I'm new to forums and UGC.
I would really love to learn, or get resource links that would allow me to see and understand, the best practices in terms of SEO. Any help is greatly appreciated.
Best,
Yan
-
Should logged-in users and anonymous users get the same behavior?
For the most part, yes; however, it depends on the forum you are running. The important piece to understand is that whatever is hidden behind a login wall remains hidden from the search engines. So you have to weigh that factor when deciding which content to display to everyone versus the content to display only to logged-in users.
How do you suggest handling canonicals in a UGC world?
Canonicalization isn't too hard to manage. Your forum software should include canonical URLs, but if not, you will want those implemented into the template as soon as possible. The use of the rel=prev and rel=next tags is also highly recommended. This allows you to keep the main forum thread as the canonical URL, and Google understands that the subsequent pages are related to the main page and how they add value.
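As a rough illustration of the approach described above, here is a minimal sketch of what the head of the second page of a paginated thread might contain. The thread URL and page structure are hypothetical, and your forum software's templates may expose these tags differently:

```html
<!-- Hypothetical: page 2 of a paginated forum thread at /threads/example-topic/ -->
<head>
  <title>Example Topic - Page 2 | Example Forum</title>

  <!-- Point the canonical at the main thread page, as described in the answer above -->
  <link rel="canonical" href="https://www.example.com/threads/example-topic/">

  <!-- Declare this page's position in the paginated series -->
  <link rel="prev" href="https://www.example.com/threads/example-topic/">
  <link rel="next" href="https://www.example.com/threads/example-topic/page-3/">
</head>
```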
Do you have specific editorial guidelines enforced on UGC?
Again, that's up to you and your community. What works editorially for one forum may not be the most desirable for another (e.g. the use of profanity). As long as the content being added is of value, I consider it good content. With forums, you can be a lot looser with the guidelines and allow users to interact as they desire.
Don't let your forum become infested with spam or obvious self-promotional threads, and make sure all user-submitted links are nofollowed. Many forums restrict links for new users and only allow them to add links to their posts once they have proven themselves. Link and spam management are very important for forums.
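For reference, a minimal sketch of how a forum template might render a user-submitted link so that it doesn't pass link equity; the URL is a placeholder, and the rel="ugc" value is an optional, newer addition alongside nofollow:

```html
<!-- User-posted link rendered by the forum template (hypothetical URL) -->
<a href="https://www.example-user-site.com/" rel="nofollow ugc">Check out my site</a>
```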
-
Thanks Ray-PP,
Is there anything specific? For example, should logged-in users and anonymous users get the same behavior? How do you suggest handling canonicals in a UGC world? Do you have specific editorial guidelines enforced on UGC? For example, should we noindex a post with a three-word question and an image?
Cheers,
Yan
-
Hello Yan,
Fortunately, the on-site SEO for UGC is not very different from the on-site SEO for other forms of content. We can still apply those best practices to forums and the UGC they contain.
Duplicate content / on-page factors
- Make sure the forum is using proper canonicalization
- Use rel=prev/next for paginated threads
- Semantic SEO where appropriate
- Make sure all on-page SEO factors are optimized (title, headings, images, etc.); see the sketch below
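Purely as a hypothetical illustration of those on-page basics applied to a single forum thread page (the URLs, titles, and text are placeholders):

```html
<!-- Hypothetical forum thread page with basic on-page elements in place -->
<head>
  <title>How do I transplant roses? | Example Gardening Forum</title>
  <meta name="description" content="Community answers on the best way to transplant established rose bushes.">
</head>
<body>
  <h1>How do I transplant roses?</h1>
  <article>
    <h2>Answer from user GreenThumb</h2>
    <!-- Descriptive alt text and explicit dimensions on user-uploaded images -->
    <img src="/uploads/rose-transplant.jpg" alt="Freshly transplanted rose bush with mulched soil" width="800" height="600">
    <p>Dig a hole twice the width of the root ball...</p>
  </article>
</body>
```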
Broken links
- Use Moz or a tool like Screaming Frog SEO Spider to identify 404 pages. Redirect any important dead pages with a 301 to their nearest related pages (to preserve SEO authority from the important dead pages), as sketched below.
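If the forum happens to run on Apache, a 301 for an important dead thread might look like the following .htaccess snippet; the paths are hypothetical, and other servers (nginx, IIS) or the forum software itself may offer their own redirect mechanisms:

```apache
# Hypothetical: permanently redirect a removed thread to its closest related thread
Redirect 301 /threads/old-deleted-thread/ https://www.example.com/threads/closest-related-thread/
```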
Are there more specific issues you are experiencing with the forum?
Related Questions
-
How will changing my website's page content affect SEO?
Intermediate & Advanced SEO | Bankable1
Our company is looking to update the content on our existing web pages, and I am curious what the best way to roll out these changes is in order to maintain good SEO rankings for certain pages. The infrastructure of the site will not be modified except for maybe adding a couple of new pages, but existing domains will stay the same. If the domains are staying the same, does it really matter if I just update one page every week or so, versus updating them all at once? Just looking for some insight into how freshening up the content on the back-end pages could potentially hurt SEO rankings initially. Thanks!
-
Submitting Same Press Release Content to Multiple PR Sites - Good or Bad Practice?
Intermediate & Advanced SEO | KaranX0
I see some PR (press release) sites that distribute the same content on many different sites and, at the end, give the source link. Is that good SEO practice or bad? If it is good practice, how do Google Panda or other algorithms consider it?
-
Mega Menu Navigation Best Practice
Intermediate & Advanced SEO | shannmg1
First off, I'm a landscape/nature/travel photographer. I mainly sell prints of my work. I'm in the process of redesigning my website, and I'm trying to decide whether to keep the navigation extremely simple or leave the drop-down menu for galleries. Currently, my navigation is something like this:
Galleries
> Gallery for State or Country (example: California)
> Sub-region in State or Country (example: San Francisco)
Blog
Prints
About
Contact
Selling prints is the top priority of the website, as that's what runs the business. I have lots of blog content, and I'm starting to build some good travel advice, etc., but in reality the galleries, which then filter down to individual pages for each photo with a cart system, are the most important. What I'm struggling to decide is whether to leave the sort of "mega menu" for the galleries, or to do away with it and have the user go to the overall galleries page to navigate further into the site. Leaving the mega menu intact, the galleries page becomes a lot less important, and it takes out a step to get to the shopping cart. However, I'm wondering if the amount of galleries in the drop-down menu is giving TOO many choices up front as well. I also wonder how changing this will affect search. Any thoughts on which is better, or is it really just a matter of preference?
-
SEO time
Intermediate & Advanced SEO | CarlosZambrana0
I want to be at the top of the Google search results. I am using a lot of SEO tools, but... I have only been doing this for one month. Do I have to wait longer?
-
Membership/subscriber (/customer) only content and SEO best practice
Intermediate & Advanced SEO | McTaggart0
Hello Mozzers, I was wondering whether there's any best practice guidance out there re: how to deal with membership/subscriber (existing customer) only content on a website, from an SEO perspective - what is best practice? A few SEOs have told me to make some of the content visible to Google, for SEO purposes, yet I'm really not sure whether this is acceptable/manipulative, and I don't want to upset Google (or users for that matter!) Thanks in advance, Luke
-
Avoiding Duplicate Content with Used Car Listings Database: Robots.txt vs Noindex vs Hash URLs (Help!)
Intermediate & Advanced SEO | browndoginteractive
Hi guys, we have developed a plugin that allows us to display used vehicle listings from a centralized, third-party database. The functionality works similarly to autotrader.com or cargurus.com, and there are two primary components:
1. Vehicle Listings Pages: the page where the user can use various filters to narrow the vehicle listings and find the vehicle they want.
2. Vehicle Details Pages: the page where the user actually views the details about said vehicle. It is served up via Ajax, in a dialog box on the Vehicle Listings Pages. Example functionality: http://screencast.com/t/kArKm4tBo
The Vehicle Listings pages (#1) we do want indexed and to rank. These pages have additional content besides the vehicle listings themselves, and those results are randomized or sliced/diced in different and unique ways. They're also updated twice per day.
We do not want to index #2, the Vehicle Details pages, as these pages appear and disappear all the time based on dealer inventory, and don't have much value in the SERPs. Additionally, other sites such as autotrader.com, Yahoo Autos, and others draw from this same database, so we're worried about duplicate content. For instance, entering a snippet of dealer-provided content for one specific listing that Google indexed yielded 8,200+ results: Example Google query. We did not originally think that Google would even be able to index these pages, as they are served up via Ajax. However, it seems we were wrong, as Google has already begun indexing them. Not only is duplicate content an issue, but these pages are not meant for visitors to navigate to directly! If a user were to navigate to the URL directly from the SERPs, they would see a page that isn't styled right.
Now we have to determine the right solution to keep these pages out of the index: robots.txt, noindex meta tags, or hash (#) internal links.
Robots.txt advantages:
- Super easy to implement
- Conserves crawl budget for large sites
- Ensures the crawler doesn't get stuck. After all, if our website only has 500 pages that we really want indexed and ranked, and vehicle details pages constitute another 1,000,000,000 pages, it doesn't seem to make sense to make Googlebot crawl all of those pages.
Robots.txt disadvantages:
- Doesn't prevent pages from being indexed, as we've seen, probably because there are internal links to these pages. We could nofollow these internal links, thereby minimizing indexation, but this would lead to 10-25 nofollowed internal links on each Vehicle Listings page (will Google think we're PageRank sculpting?).
Noindex advantages:
- Does prevent Vehicle Details pages from being indexed
- Allows ALL pages to be crawled (advantage?)
Noindex disadvantages:
- Difficult to implement: the Vehicle Details pages are served via Ajax, so they have no head of their own. The solution would have to involve an X-Robots-Tag HTTP header and Apache, sending a noindex directive based on querystring variables, similar to this Stack Overflow solution (see the sketch after this question). This means the plugin functionality is no longer self-contained, and some hosts may not allow these types of Apache rewrites (as I understand it).
- Forces (or rather allows) Googlebot to crawl hundreds of thousands of noindexed pages. I say "forces" because of the crawl budget required. The crawler could get stuck/lost in so many pages, and may not like crawling a site with 1,000,000,000 pages, 99.9% of which are noindexed.
- Cannot be used in conjunction with robots.txt. After all, the crawler never reads the noindex meta tag if it is blocked by robots.txt.
Hash (#) URL advantages:
- By using hash (#) URLs for links from Vehicle Listings pages to Vehicle Details pages (such as "Contact Seller" buttons), coupled with JavaScript, the crawler won't be able to follow/crawl these links.
- Best of both worlds: crawl budget isn't overtaxed by thousands of noindexed pages, and the internal links used to index robots.txt-disallowed pages are gone.
- Accomplishes the same thing as "nofollowing" these links, but without looking like PageRank sculpting (?)
- Does not require complex Apache stuff
Hash (#) URL disadvantages:
- Is Google suspicious of sites with (some) internal links structured like this, since it can't crawl/follow them?
Initially, we implemented robots.txt, the "sledgehammer solution." We figured that we'd have a happier crawler this way, as it wouldn't have to crawl zillions of partially duplicate Vehicle Details pages, and we wanted it to be like these pages didn't even exist. However, Google seems to be indexing many of these pages anyway, probably based on internal links pointing to them. We could nofollow the links pointing to these pages, but we don't want it to look like we're PageRank sculpting or something like that.
If we implement noindex on these pages (and doing so is a difficult task in itself), then we will be certain these pages aren't indexed. However, to do so we will have to remove the robots.txt disallow in order to let the crawler read the noindex tag on these pages. Intuitively, it doesn't make sense to me to make Googlebot crawl zillions of Vehicle Details pages, all of which are noindexed, and it could easily get stuck/lost. It seems like a waste of resources, and in some shadowy way bad for SEO.
My developers are pushing for the third solution: using the hash URLs. This works on all hosts and keeps all functionality in the plugin self-contained (unlike noindex), and conserves crawl budget while keeping Vehicle Details pages out of the index (unlike robots.txt). But I don't want Google to slap us 6-12 months from now because it doesn't like links like these.
Any thoughts or advice you guys have would be hugely appreciated, as I've been going in circles, circles, circles on this for a couple of days now. Also, I can provide a test site URL if you'd like to see the functionality in action.
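For what it's worth, one way to send a noindex signal for Ajax-served responses like these is an X-Robots-Tag response header set at the server level. This is only a rough sketch assuming Apache 2.4 with mod_setenvif and mod_headers enabled, and the URL pattern is hypothetical:

```apache
# Hypothetical: mark Ajax-served vehicle details responses as noindex, follow
# Requires mod_setenvif and mod_headers; adjust the URL pattern to the real endpoint
SetEnvIf Request_URI "^/vehicle-details/" VEHICLE_DETAILS
Header set X-Robots-Tag "noindex, follow" env=VEHICLE_DETAILS
```
-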
Best Practice For Company/Client Logo Endorsement
Intermediate & Advanced SEO | Mark_Ch
Article: http://searchengineland.com/homepage-sliders-are-bad-for-seo-usability-163496
I came across the above article and somewhat agree with the author's summary. I find sliders a distraction to B2B users, and overall they offer no SEO benefits.
Scenario
As a service provider, over time I have worked with many high-profile blue-chip companies. As part of my site redesign, I'm looking to show users my client achievements. My initial thoughts are to carry out the following: on the home page, incorporate some high-profile company logos (similar to http://www.semrush.com) with a "more customers" hyperlink to the right of the logo caption. The link will take the user to a dedicated page (www.mydomain.co.uk/customer) showing a comprehensive list of company logos (see the sketch below).
Questions
#1 Is the above practice good or bad?
#2 Is there a better way to achieve the above?
Any other practical advice on user experience, social engagement, website speed, etc. would be much appreciated. Thanks, Mark
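Purely as a hypothetical illustration of the structure described above (the logo files, alt text, and class names are placeholders; only the /customer URL comes from the question):

```html
<!-- Hypothetical home page client-logo strip with a link to the full customer list -->
<section class="client-logos">
  <h2>Trusted by</h2>
  <img src="/images/logos/client-a.png" alt="Client A logo">
  <img src="/images/logos/client-b.png" alt="Client B logo">
  <img src="/images/logos/client-c.png" alt="Client C logo">
  <a href="https://www.mydomain.co.uk/customer">More customers</a>
</section>
```
-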
Best practice for removing indexed internal search pages from Google?
Intermediate & Advanced SEO | HrThomsen0
Hi Mozzers,
I know that it's best practice to block Google from indexing internal search pages, but what's best practice when "the damage is done"? I have a project where a substantial part of our visitors and income lands on an internal search page, because Google has indexed them (about 3%). I would like to block Google from indexing the search pages via the meta noindex,follow tag (sketched below) because:
- Google Guidelines: "Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines." http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35769
- Bad user experience
- The search pages are (probably) stealing rankings from our real landing pages
- Webmaster notification: "Googlebot found an extremely high number of URLs on your site", with links to our internal search results
I want to use the meta tag to keep the link juice flowing. Do you recommend using robots.txt instead? If yes, why? Should we just go dark on the internal search pages, or how should we proceed with blocking them? I'm looking forward to your answer!
Edit: Google has currently indexed several million of our internal search pages.
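For illustration only, a sketch of the two mechanisms being weighed in this question; the /search path is a placeholder. The meta tag approach puts this on each internal search result page:

```html
<!-- On each internal search result page (hypothetical URL pattern /search?q=...) -->
<meta name="robots" content="noindex, follow">
```

The robots.txt alternative blocks crawling of those URLs, but it also prevents Google from ever seeing a noindex tag on them:

```
User-agent: *
Disallow: /search
```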