Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Subdomain for ticketing of a client website (how to solve SEO problems caused by the subdomain/domain relationship)
-
We have a client in need of a ticketing solution for their domain (let's call it www.domain.com), which is on WordPress - as is our custom ticketing solution. However, we want full control of the ticketing, since we manage it for them - so we do not want to build it inside their original WordPress install.
Our proposed solution is to build it on tickets.domain.com. This will exist only for selling and issuing the tickets.
The question is, is there a way to do this without damaging their bounce rate and SEO scores?
Since customers will come to www.domain.com, then click the ticketing tab and land on tickets.domain.com, Google will see this as a bounce. In reality, customers will not notice the difference as we will clone the look and feel of domain.com.
Should we perhaps have the canonical URL of tickets.domain.com point to www.domain.com? And can we install Webmaster Tools for tickets.domain.com and set the preferred domain as www.domain.com?
Are these possible solutions to the problem, or not - and if not, does anyone else have a viable solution?
Thank you so much for the help.
-
Hi Adam,
Are the ticket pages on the subdomain the same as the event pages on the main domain, except with the ticketing system included? If so, it would make more sense to canonicalize each event ticketing page back to its matching event page, so: tickets.domain.com/event1 -> domain.com/event1.
If the ticketing pages are not meant to be indexed at all, I would also put the robots noindex tag on them (or a robots file on the whole subdomain) and keep an eye on GWT to make sure none of them creep in. Canonical tags are a 'recommendation', not a rule, so if these pages are not meant to be indexed at all, it's best to enforce that as completely as possible.
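For reference, here's a minimal sketch of what those two tags could look like in the `<head>` of a ticketing page (the URLs are placeholders for this hypothetical setup):

```html
<!-- On tickets.domain.com/event1: point the canonical at the matching
     event page on the main domain -->
<link rel="canonical" href="https://www.domain.com/event1" />

<!-- Or, if the page should stay out of the index entirely -->
<meta name="robots" content="noindex, follow" />
```

Note that Google can only see a noindex tag on pages it is allowed to crawl, so don't combine it with a robots.txt block on the same URLs.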
-
Hey Leo!
Thanks for taking the time to answer me. I am going to set this up exactly as you recommend.
1. I will install the same GA code from domain.com on tickets.domain.com
2. Do you think I need to set the canonical URL on the various ticketing pages all back to the main domain?
e.g. tickets.domain.com ---> canonical to domain.com
e.g. tickets.domain.com/event1 ---> canonical to domain.com
e.g. tickets.domain.com/event2 ---> canonical to domain.com
e.g. tickets.domain.com/event3 ---> canonical to domain.com
and so on?
3. We made all the header links of tickets.domain.com point straight back to their counterparts on domain.com.
Does this seem like I basically got it all correct?
Thanks again
Adam -
Hi,
If that is technically the best solution for your needs, then there are a couple of things to keep in mind:
1. If you are using Universal Analytics, subdomain tracking is included by default, so if you put the same analytics code on your subdomain pages you should not see any 'bounces' - Google should be able to figure this out as well.
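As a sketch, the standard Universal Analytics create call with the cookie domain set to 'auto' is what makes this work (the property ID below is a placeholder):

```javascript
// Hypothetical property ID. 'auto' tells analytics.js to set its cookie on
// the highest domain level it can write to (domain.com), so visits crossing
// from www.domain.com to tickets.domain.com stay in one session instead of
// registering as an exit/bounce. No cross-domain linker is needed for
// subdomains of the same domain.
ga('create', 'UA-XXXXXX-1', 'auto');
ga('send', 'pageview');
```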
2. You can also install GWT for the subdomain. I don't think you can set a preferred domain for a subdomain setup, but you can use GWT to monitor issues and make sure that duplicate pages on the subdomain are not getting indexed.
3. To avoid indexing of the subdomain pages (which I assume you don't want), you could canonicalize them to their equivalents on the www domain. You could also meta robots noindex them all. If they creep in anyway, you can use GWT to remove them.
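If you'd rather block the whole subdomain at once, a robots.txt served at the root of tickets.domain.com could be as simple as this sketch:

```
User-agent: *
Disallow: /
```

One caveat: robots.txt blocks crawling, not indexing - and if Google can't crawl the pages, it also can't see their canonical or noindex tags. So pick one approach rather than stacking all three on the same URLs.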
If the subdomain is a complete clone and the experience is seamless, then why not make all links on the subdomain point back to the www domain pages? That way the only pages available on the subdomain would be the ticketing pages, and the rest would be on the www as normal.
Hope it helps.