Preparing a DotNetNuke Active Forums site for SEO push
-
I'm in the process of buying and running an existing forum that runs on DotNetNuke 5.2.0 and Active Forums 4.1. As part of the transfer, I'm asking that the site be upgraded to the latest version of DNN and AF 4.3. AF 4.3 has SEO-friendly URLs instead of the current long, ugly default URLs, and I'm looking forward to implementing that feature.
My specific question is:
What would you do to prepare for this upgrade in terms of the content, especially related to the URL changes?
I've gone into Google Analytics, downloaded content by page title, exported the first 1,000 results, and put those titles into Word to correct spelling errors, so the new URLs will be based on correct spellings.
General background:
- The site is not currently monetized, and there will not be an initial focus on monetization and likely only smaller efforts (affiliate Amazon links in a resource section) in the future.
- The site is free for users.
- I'm fine with taking a hit in organic traffic in the short term. About 1/3 of the traffic is from search engines right now, and less than 30% of the visitors are new visits.
- The site is going to continue much the same as it has until now. Same moderators, same purpose, same skin, etc.
- I have access to GA, site is verified in GWT, need to verify in Bing, and I do have root access to the server.
- I've already started working on image file sizes, both of user-submitted images and site-related images like the header.
- Until now, I have had no experience with DNN, AF, or any of the extensions (and am appalled at the price and lack of features of some of those extensions compared to what I'm used to with WordPress).
More general questions:
In terms of SEO, I intend to treat the upgrade of the forum to friendly URLs as a re-launch. I want clean URLs, a sitemap, a non-www to www redirect, etc. When I start making the changes, submitting the sitemap, and generally drawing Google's attention, I want Google to like what it sees, with as much optimized as possible when Googlebot comes around. My goal is to draw more targeted visitors from search who are interested in the site's content.
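Since DNN runs on IIS, the non-www to www fix is usually done with a 301 redirect rule in web.config. A minimal sketch, assuming the IIS URL Rewrite module is installed and using this site's hostname:

```xml
<!-- Inside <system.webServer> in the site's web.config.
     Permanently redirects any request for the bare domain to the www host. -->
<rewrite>
  <rules>
    <rule name="Canonical www redirect" stopProcessing="true">
      <match url="(.*)" />
      <conditions>
        <add input="{HTTP_HOST}" pattern="^rcnavalcombat\.com$" />
      </conditions>
      <action type="Redirect" url="http://www.rcnavalcombat.com/{R:1}" redirectType="Permanent" />
    </rule>
  </rules>
</rewrite>
```

Note that DNN also maintains its own portal alias list, so the www hostname needs to be a registered alias there or DNN may bounce visitors back.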
What other suggestions do you have for the site prep, both from being a forum in general and specifically on DNN/AF?
I'm not putting the URL out just yet, as we haven't announced to the users the change of ownership is taking place.
Thanks everyone!
-
We've announced the transition, so I can post the URL here. It's http://rcnavalcombat.com/.
We're running out of HDD space on the current virtual server, so I'm shopping for a larger package. I'll set up the new server with the latest DNN and AF, tweak things the way I want, and import the existing site (rinse and repeat about five times to work the bugs out, I imagine), then go live with the upgrades on the new server.
-
I haven't done so myself; I have only end-user experience with this. However, I think it would be a good idea, especially if the topic is a good one or receives several replies. A small update to the policies should clear the way to allow this.
I cannot recommend many modules, but I can say buyer beware. I would stay with DNN offerings like http://www.snowcovered.com/snowcovered2/Default.aspx?tabid=242&PackageID=18835. Third-party modules can crash your system, remove data tables, or simply not work as described, and you may end up spending more money to hire that company to troubleshoot their own product.
-
I've used that sitemap strategy before (a separate sitemap for the different major areas of the site) and had good success with that. I'm going through GA and taking a look at the pages with the most visits and making sure those pages don't have glaring errors for a human or bot visitor, such as missing/mis-sized images, horrid misspellings, etc.
Do you have any advice regarding modules to use or avoid?
More of a general forum question than a DNN question: what are the schools of thought on changing or cleaning up thread titles? Do you ever go back and make titles clearer, such as changing "a noob question" to "a noob question about [subject]", or standardize thread titles (in a product review section, for example)?
Does anyone have resources or forums where forum managers can go to answer this type of question (about any changes made to UGC)? I'm wondering where other forum mods hang out.
-
I run our sites on DNN and am going through this same transition. Based on the conversation Rand had with the head Binger during the Mozinar about sitemaps, I have decided to create a few of them, each based around a specific category, e.g. Services.xml.
This, from my understanding, will allow me to monitor which pages of each sitemap have been indexed and help along those that haven't yet been crawled. Then I will keep a sitemap of the sitemaps. With all that being said, make sure each sitemap is free from errors and DNN default tabid URLs, and contains your best pages.
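For reference, the "sitemap of the sitemaps" is just a standard sitemap index file as defined by the sitemaps.org protocol; you submit this one file to Google/Bing webmaster tools and it points to each per-category sitemap. A minimal sketch (the file names and domain are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Sitemap index: one <sitemap> entry per category-specific sitemap file. -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemaps/services.xml</loc>
    <lastmod>2012-06-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemaps/forum-topics.xml</loc>
    <lastmod>2012-06-01</lastmod>
  </sitemap>
</sitemapindex>
```

Each child sitemap has to stay under the protocol's limit of 50,000 URLs per file, which is another reason splitting by category works well.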
I think that as a forum, this can be great for you if your primary topics contain several ongoing subtopics that get a lot of attention. Your first steps of capturing what you already have seem spot on. Let us know how this works out.