Using a subdomain for ticketing on a client website (how to solve SEO problems caused by the subdomain/domain relationship)
-
We have a client in need of a ticketing solution for their domain (let's call it www.domain.com), which is on WordPress, as is our custom ticketing solution. However, we want to have full control of the ticketing, since we manage it for them, so we do not want to build it inside their original WordPress install.
Our proposed solution is to build it on tickets.domain.com. This subdomain will exist only for selling and issuing tickets.
The question is, is there a way to do this without damaging their bounce rate and SEO scores?
Since customers will come to www.domain.com, then click the ticketing tab and land on tickets.domain.com, Google will see this as a bounce. In reality, customers will not notice the difference, as we will clone the look and feel of domain.com.
Should we perhaps have the canonical URL of tickets.domain.com point to www.domain.com? And also, can we install Webmaster Tools for tickets.domain.com and set the preferred domain as www.domain.com?
Are these possible solutions to the problem, or not - and if not, does anyone else have a viable solution?
Thank you so much for the help.
-
Hi Adam,
Are the ticket pages on the subdomain the same as the event pages on the main domain, except with the ticketing system included? If so, it would make more sense to canonicalise each event ticketing page back to its matching event page, so: tickets.domain.com/event1 -> domain.com/event1.
If the ticketing pages are not meant to be indexed at all, then I would also put the robots noindex tag on them (or a robots.txt file on the whole subdomain) and keep an eye on GWT to make sure none of them creep in. Canonical tags are a 'recommendation', not a rule, so if your plan is for these pages not to be indexed at all, it's best to ensure that as completely as possible.
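For illustration, a minimal sketch of the head tags each ticketing page could carry under this approach. The URL structure and the noindex directive are assumptions to adapt to your setup, not the actual WordPress implementation:

```typescript
// Sketch: the <head> tags each ticketing page would carry, assuming the
// subdomain event slugs mirror the main-domain event slugs. Not the actual
// WordPress implementation - adapt to however the ticketing pages are built.

function ticketingHeadTags(eventSlug: string): string {
  // Matching event page on the main domain (assumed URL structure).
  const canonicalUrl = `https://www.domain.com/${eventSlug}`;
  return [
    `<link rel="canonical" href="${canonicalUrl}">`,  // point search engines at the main-domain event page
    `<meta name="robots" content="noindex, follow">`, // keep the ticketing copy itself out of the index
  ].join("\n");
}

// e.g. tags for tickets.domain.com/event1
console.log(ticketingHeadTags("event1"));
```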
-
Hey Leo!
Thanks for taking the time to answer me. I am going to set this up exactly as you recommend.
1. I will install the same GA code from domain.com on tickets.domain.com
2. Do you think I need to set the canonical URL on the various ticketing pages all back to the main domain?
e.g. tickets.domain.com ---> canonical to domain.com
e.g. tickets.domain.com/event1 ---> canonical to domain.com
e.g. tickets.domain.com/event2 ---> canonical to domain.com
e.g. tickets.domain.com/event3 ---> canonical to domain.com
and so on?
3. We did make all the header links of tickets.domain.com point straight back to their counterparts on domain.com.
Does this seem like I basically got it all correct?
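For reference, a rough sketch (assuming Node 18+ for the built-in fetch, and the example event slugs above) of how we could spot-check that each ticketing page carries the per-event canonical you describe:

```typescript
// Rough sketch (not part of the live setup): spot-check that each ticketing
// page on the subdomain carries a canonical tag pointing at its matching
// event page on the main domain. Assumes Node 18+ (built-in fetch); the
// event slugs are the example ones from this thread.

const eventSlugs = ["event1", "event2", "event3"];

async function checkCanonicals(): Promise<void> {
  for (const slug of eventSlugs) {
    const response = await fetch(`https://tickets.domain.com/${slug}`);
    const html = await response.text();
    // Crude regex extraction of the canonical href - fine for a quick check,
    // assuming rel appears before href in the tag.
    const match = html.match(/<link[^>]+rel=["']canonical["'][^>]+href=["']([^"']+)["']/i);
    const expected = `https://www.domain.com/${slug}`;
    const found = match?.[1] ?? "none";
    console.log(`${slug}: ${found === expected ? "OK" : `unexpected canonical (${found})`}`);
  }
}

checkCanonicals().catch(console.error);
```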
Thanks again
Adam
-
Hi,
If technically that is the best solution for your needs, then here are a couple of things to keep in mind:
1. If you are using Universal Analytics, subdomain tracking is included by default, so if you put the same analytics code on your subdomain pages you should not be seeing any 'bounces' - Google should be able to figure this out as well (see the sketch at the end of this reply).
2. You can install GWT for the subdomain as well. I don't think you can set the preferred domain for a subdomain setup, but you can use GWT to monitor issues and make sure that duplicate pages on the subdomain are not getting indexed.
3. To avoid indexing of the subdomain pages (which I assume you don't want indexed), you could canonical them to their equivalents on the www domain. You could also meta robots noindex them all. If they creep in anyway, you can use GWT to remove them.
If the subdomain is a complete clone and the experience is seamless, then why not make all links on the subdomain go back to the www domain pages? That way the only pages available on the subdomain would be the ticketing pages, and the rest would be on the www as normal.
Hope it helps.
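To illustrate point 1, a minimal sketch of the Universal Analytics (analytics.js) tracker creation, assuming the standard loader snippet is already on both hostnames; UA-XXXXXX-Y is a placeholder property ID:

```typescript
// Minimal sketch of cross-subdomain tracking with Universal Analytics
// (analytics.js). Assumes the standard analytics.js loader snippet has
// already defined the ga() command queue; UA-XXXXXX-Y is a placeholder
// property ID.

declare function ga(...args: unknown[]): void; // provided by the analytics.js snippet

// 'auto' places the _ga cookie on the top-level domain (domain.com), so
// visits moving between www.domain.com and tickets.domain.com stay in one
// session instead of registering as bounces or new sessions.
ga("create", "UA-XXXXXX-Y", "auto");
ga("send", "pageview");
```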