SEO Best practice for competitions
-
I am considering running a competition and wanted to get some feedback on SEO best practice.
We will have a unique competition URL; once the competition ends it will be 301-redirected to the home page.
Every entrant will be given a unique URL for the competition to share, and if someone enters using their URL they get an extra ticket. This means we will create a large number of new unique URLs over a short period of time, but the pages will all have the same content. Is this potentially bad for duplicate content? Any advice? Perhaps a canonical tag on all unique competition entrant URLs?
Any other considerations?
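One way to address the duplicate-content concern raised above: each entrant URL could point a canonical tag at the main competition page, signaling to search engines that all the entrant variants are the same document. A minimal sketch (the URLs are placeholders, not from the original question):

```html
<!-- In the <head> of every entrant URL, e.g. site.com/competition/jane123 -->
<link rel="canonical" href="https://site.com/competition/" />
```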
-
Agreed, I'd approach this as a customer acquisition and brand awareness project as well, especially given that the website would have a short lifespan and SEO is a long-term investment.
If this were me, I'd optimize the website for the competition name and use social networking to build awareness (e.g. Facebook, Twitter, Pinterest).
-
Robert
I completely understand what you are trying to do. It can and will work. However, I'd suggest you look into your current backlink profile and check whether it contains diverse, natural links. Also, don't overdo any one link-building tactic. This is not a 100% bait-and-switch, but I'll ask you this: if SEO did not exist, would you still do it? Think of this as a customer acquisition and brand awareness technique that also helps your SEO.
-
How about creating rewritten dynamic links that all redirect to the one main page and just swap variables?
For example:
Link shared = site.com/username --> which is actually site.com/user.php?u=username --> sets a cookie with "referral = username" --> redirects to the main "Registration" (or other) page --> read the cookie and apply variables as needed.
That would solve the duplicate content issue while keeping track of referrals.
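The redirect-and-cookie flow above can be sketched as a small handler. This is a minimal illustration in Python using only the standard library; the function name and `/register` target are hypothetical, and in practice this logic would live in whatever handles `user.php?u=username` on the server:

```python
from http.cookies import SimpleCookie

REGISTRATION_URL = "/register"  # hypothetical main registration page

def handle_referral(username):
    """Build a 302 redirect to the single registration page that records
    the referring entrant in a cookie, so site.com/<username> never
    renders its own (duplicate) page."""
    cookie = SimpleCookie()
    cookie["referral"] = username
    cookie["referral"]["path"] = "/"
    cookie["referral"]["max-age"] = 60 * 60 * 24 * 30  # remember for 30 days

    # The entrant URL returns only headers: a redirect plus the tracking
    # cookie. The registration page later reads the cookie to award tickets.
    return {
        "status": "302 Found",
        "headers": [
            ("Location", REGISTRATION_URL),
            ("Set-Cookie", cookie["referral"].OutputString()),
        ],
    }
```

Because the entrant URLs only ever answer with a redirect, search engines see one registration page rather than thousands of identical ones.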
Cheers,
Oleg