Subdomain for ticketing on a client website (how to solve the SEO problems caused by the subdomain/domain relationship)
-
We have a client in need of a ticketing solution for their domain (let's call it www.domain.com), which runs on WordPress - as does our custom ticketing solution. However, we want full control of the ticketing, since we manage it for them, so we do not want to build it inside their original WordPress install.
Our proposed solution is to build it on tickets.domain.com. This will exist only for selling and issuing the tickets.
The question is, is there a way to do this without damaging their bounce rate and SEO scores?
Since customers will come to www.domain.com, then click the ticketing tab and land on tickets.domain.com, Google will see this as a bounce. In reality, customers will not notice the difference, as we will clone the look and feel of domain.com. Should we perhaps have the canonical URL of tickets.domain.com point to www.domain.com? And can we install Webmaster Tools for tickets.domain.com and set the preferred domain to www.domain.com?
Are these possible solutions to the problem, or not - and if not, does anyone else have a viable solution?
Thank you so much for the help.
-
Hi Adam,
Are the ticket pages on the subdomain the same as the event pages on the main domain, except with the ticketing system included? If so, it would make more sense to canonicalise each event ticketing page back to the matching event page, so: tickets.domain.com/event1 -> domain.com/event1.
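For illustration, using the placeholder URLs from this thread (and assuming https), that canonical would be a link element in the head of each subdomain ticketing page pointing at its matching event page on the main domain:

<!-- In the <head> of https://tickets.domain.com/event1 -->
<link rel="canonical" href="https://www.domain.com/event1" />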
If the ticketing pages are not meant to be indexed at all, then I would also put a robots noindex tag on them (or a robots.txt rule covering the whole subdomain) and keep an eye on GWT to make sure none of them creep in. Canonical tags are a 'recommendation', not a rule, so if your plan is for these pages not to be indexed at all, it's best to ensure that as completely as possible.
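As a sketch, the page-level noindex option would be a meta tag in the head of each subdomain page (the alternative mentioned above is a Disallow rule in the subdomain's robots.txt; note that if the whole subdomain is blocked in robots.txt, crawlers can't see meta tags or canonicals on those pages, so pick one approach):

<!-- In the <head> of any tickets.domain.com page that should stay out of the index -->
<meta name="robots" content="noindex, follow" />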
-
Hey Leo!
Thanks for taking the time to answer me. I am going to set this up exactly as you recommend.
1. I will install the same GA code from domain.com on tickets.domain.com
2. Do you think I need to set the canonical URL on the various ticketing pages all back to the main domain?
e.g. tickets.domain.com ---> canonical to domain.com
e.g. tickets.domain.com/event1 ---> canonical to domain.com
e.g. tickets.domain.com/event2 ---> canonical to domain.com
e.g. tickets.domain.com/event3 ---> canonical to domain.com
and so on?
3. We did make all the header links on tickets.domain.com point straight back to their counterparts on domain.com.
Does this seem like I basically got it all correct?
Thanks again
Adam -
Hi,
If, technically, that is the best solution for your needs, then there are a couple of things to keep in mind:
1. If you are using Universal Analytics, subdomain tracking is included by default, so if you put the same analytics code on your subdomain pages you should not see any 'bounces' - Google should be able to figure this out as well (see the snippet after this list).
2. You can install GWT for the subdomain as well. I don't think you can set a preferred domain for a subdomain setup, but you can use GWT to monitor issues and make sure that duplicate pages from the subdomain are not getting indexed.
3. To avoid indexing of the subdomain pages (which I assume you don't want indexed), you could canonicalise them to their equivalents on the www domain. You could also meta-robots noindex them all. If they creep in anyway, you can use GWT to remove them.
If the subdomain is a complete clone and the experience is seamless, then why not make all links on the subdomain go back to the www domain pages? That way the only pages available on the subdomain would be the ticketing pages, and everything else would sit on the www domain as normal.
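For reference, a minimal sketch of the standard Universal Analytics (analytics.js) snippet point 1 refers to. UA-XXXXXXX-Y is a placeholder for the real property ID, and the 'auto' cookie domain is what writes the _ga cookie at the domain.com level so it is shared across the www and tickets subdomains:

<!-- Same snippet on both www.domain.com and tickets.domain.com -->
<script>
(function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
(i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
})(window,document,'script','https://www.google-analytics.com/analytics.js','ga');

ga('create', 'UA-XXXXXXX-Y', 'auto'); // 'auto' sets the _ga cookie on the top-level domain
ga('send', 'pageview');
</script>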
Hope it helps.