Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not removing the content entirely (many posts will still be viewable), we have locked both new posts and new replies.
Subdomain for ticketing on a client website (how to solve SEO problems caused by the subdomain/domain relationship)
-
We have a client in need of a ticketing solution for their domain (let's call it www.domain.com), which is on WordPress, as is our custom ticketing solution. However, we want full control of the ticketing, since we manage it for them, so we do not want to build it inside their original WordPress install.
Our proposed solution is to build it on tickets.domain.com. This will exist only for selling and issuing the tickets.
The question is, is there a way to do this without damaging their bounce rate and SEO scores?
Since customers will come to www.domain.com, then click the ticketing tab and land on tickets.domain.com, Google will see this as a bounce. In reality, customers will not notice the difference, as we will clone the look and feel of domain.com.
Should we perhaps have the canonical URL of tickets.domain.com point to www.domain.com? And can we install Webmaster Tools for tickets.domain.com and set the preferred domain as www.domain.com?
Are these possible solutions to the problem, or not - and if not, does anyone else have a viable solution?
Thank you so much for the help.
-
Hi Adam,
Are the ticket pages on the subdomain the same as the event pages on the main domain, except with the ticketing system included? If so, it would make more sense to canonicalise each event ticketing page back to its matching event page, so: tickets.domain.com/event1 -> domain.com/event1.
If the ticketing pages are not meant to be indexed at all, then I would also put the robots noindex tag on them (or a robots file on the whole subdomain) and keep an eye on GWT (Google Webmaster Tools) to make sure none of them creep in. Canonical tags are a 'recommendation', not a rule, so if these pages are not meant to be indexed at all, it's best to enforce that as completely as possible.
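For example, the head of an event ticketing page could carry both signals at once. This is only a sketch; the URLs are illustrative:

```html
<!-- On tickets.domain.com/event1 (illustrative URL): point the canonical
     at the matching event page on the main domain... -->
<link rel="canonical" href="https://www.domain.com/event1">
<!-- ...and ask search engines to keep this page out of the index
     while still following its links -->
<meta name="robots" content="noindex, follow">
```

Alternatively, a robots.txt at the subdomain root with `Disallow: /` blocks crawling of the whole subdomain, though pages blocked from crawling can still be indexed from external links, so the meta tag is the more reliable noindex signal.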
-
Hey Leo!
Thanks for taking the time to answer me. I am going to set this up exactly as you recommend.
1. I will install the same GA code from domain.com on tickets.domain.com
2. Do you think I need to set the canonical URL on the various ticketing pages all back to the main domain?
e.g. tickets.domain.com ---> canonical to domain.com
e.g. tickets.domain.com/event1 ---> canonical to domain.com
e.g. tickets.domain.com/event2 ---> canonical to domain.com
e.g. tickets.domain.com/event3 ---> canonical to domain.com
and so on?
3. We also made all the header links of tickets.domain.com point straight back to their counterparts on domain.com.
Does this seem like I basically got it all correct?
Thanks again
Adam
-
Hi,
If that is technically the best solution for your needs, then here are a couple of things to keep in mind:
1. If you are using Universal Analytics, subdomain tracking is included by default, so if you put the same analytics code on your subdomain pages you should not see any 'bounces'; Google should be able to figure this out as well.
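For reference, the Universal Analytics (analytics.js) page tag of that era only needs the cookie domain set to 'auto' for this to work. The property ID below is a placeholder:

```javascript
// Universal Analytics (analytics.js) page tag; UA-XXXXX-Y is a placeholder.
// With the cookie domain set to 'auto', the tracking cookie is written at
// the highest possible domain level (.domain.com), so www.domain.com and
// tickets.domain.com share one session and the cross-subdomain click is
// not counted as a bounce.
ga('create', 'UA-XXXXX-Y', 'auto');
ga('send', 'pageview');
```

This snippet assumes the standard analytics.js loader is already included on the page.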
2. You can install GWT for the subdomain as well. I don't think you can set the preferred domain for a subdomain setup, but you can use GWT to monitor issues and make sure that duplicate pages from the subdomain are not getting indexed.
3. To avoid indexing of the subdomain pages (which I assume you don't want), you could canonicalise them to their equivalents on the www domain. You could also meta-robots noindex them all. If they creep in anyway, you can use GWT to remove them.
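If you want to double-check this outside of GWT, a quick script can confirm that each ticketing page carries the expected tags. A minimal sketch using only Python's standard library; fetching the pages (e.g. with urllib) is left out, and the markup shown is illustrative:

```python
from html.parser import HTMLParser


class SEOTagParser(HTMLParser):
    """Collects the rel=canonical href and meta robots content from a page."""

    def __init__(self):
        super().__init__()
        self.canonical = None
        self.robots = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel", "").lower() == "canonical":
            self.canonical = a.get("href")
        elif tag == "meta" and a.get("name", "").lower() == "robots":
            self.robots = (a.get("content") or "").lower()


def audit_page(html):
    """Return (canonical_url, is_noindexed) for one page's HTML source."""
    parser = SEOTagParser()
    parser.feed(html)
    noindex = parser.robots is not None and "noindex" in parser.robots
    return parser.canonical, noindex
```

Run `audit_page` over the downloaded HTML of each tickets.domain.com page and flag any page missing the canonical or the noindex tag.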
If the subdomain is a complete clone and the experience is seamless, why not make all links on the subdomain point back to the www domain pages? That way the only pages available on the subdomain would be the ticketing pages, and the rest would be on the www domain as normal.
Hope it helps.