Best Website Structure / SEO Strategy for an online travel agency?
-
Dear Experts!
I need your help pointing me in the right direction. So far I have found scattered tips around the internet, but it's hard to build a full picture from all these bits and pieces of information without professional advice. My primary goal is to understand how I should structure my online travel agency website (https://qualistay.com) so that I target my keywords on the correct pages and do not create duplicate content.
In my particular case I have very similar properties in similar locations in Tenerife. Many of them are located in the same villa or apartment complex, so it is very hard to come up with a unique description for each of them, not to mention the amenities and pricing blocks, which are standard and almost identical (I don't know whether Google treats this as duplicate content).
From what I have read so far, it’s better to target archive pages rather than every single property. At the moment my archive pages are:
- all properties (includes all property types and locations),
- a page for each location (includes all property types).
Does it make sense to add archive pages by property type in addition to, or instead of, the location ones if I, for instance, target separate keywords like 'villas costa adeje' and 'apartments costa adeje'? At the moment the title of the respective archive page, "Properties to rent in Costa Adeje: villas, apartments", in principle targets both keywords...
Does using the same keyword on a single property listing cannibalize the ranking of the archive page it links back to? Or does it not, unless Google specifically flags it as duplicate content (which one can see in Google Search Console under HTML Improvements) and/or the archive page has more incoming links than a single property?
If I target only archive pages, how should I optimize them so that they stay user-friendly? I have created (though not yet fully optimized) descriptions for each archive page just below the main header, but I have them partially hidden (collapsible) using JavaScript in order to keep visitors' focus on the properties. I know that Google does not rank hidden content highly, at least at the moment, but with mobile-first indexing coming in the near future, Google has said it will not penalize mobile sites for collapsible content and will use the mobile version to evaluate the desktop one. Does this mean I should not worry about hidden content anymore, or should I move the description to the bottom of the page and make it fully visible?
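For reference, the collapsible block works roughly like the sketch below (a simplified illustration, not my exact markup or script; the element IDs and class name are made up). The full description is in the initial HTML, and the script only toggles a CSS class that visually collapses it:

// Simplified sketch (hypothetical element IDs and class name): the full description
// text is rendered in the initial HTML, and this script only toggles a CSS class
// that visually collapses it, so crawlers still receive the content.
function initCollapsible(sectionId: string, buttonId: string): void {
  const section = document.getElementById(sectionId);
  const button = document.getElementById(buttonId);
  if (!section || !button) return;

  section.classList.add("is-collapsed"); // CSS caps the height; the text stays in the DOM
  button.addEventListener("click", () => {
    const collapsed = section.classList.toggle("is-collapsed");
    button.textContent = collapsed ? "Read more" : "Read less";
  });
}

initCollapsible("archive-description", "archive-description-toggle");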
Your feedback will be highly appreciated!
Thank you!
Dmitry
-
For an online travel agency, a robust website structure and SEO strategy are vital. Implement a user-friendly interface with intuitive navigation, making it easy for visitors to search and book travel options. Optimize website content with relevant keywords, meta tags, and descriptive URLs to improve search engine visibility. Incorporate high-quality images, engaging travel guides, and customer reviews to enhance user experience and encourage longer site engagement. Utilize responsive design for seamless browsing across devices, and prioritize mobile optimization for on-the-go travelers.
-
This structure and strategy will help your online travel agency stand out in a competitive market and provide a superior experience for your customers.
Below is a response to the query about the best website architecture and SEO strategy for an online travel agency:
- Website Structure:
• Homepage: introduce your agency, highlight your main services, and make the site easy to navigate.
• Destinations: give each travel destination its own page with rich content and appealing pictures.
• Tours/Packages: provide a dedicated tours/packages section with prices, itineraries, and availability.
• Booking/Contact: provide an easy-to-use booking form together with several contact options (phone, chat, and email) so customers can ask questions or get help.
• Blog: keep posts concise but full of travel tips, destination information, and agency news; this engages visitors and supports SEO.
• Reviews: show customer reviews and testimonials to build trust and give visitors confidence in your services.
• About: tell clients more about who you are, your history, and what you want to accomplish as a team, to build a personal connection.
- SEO Strategy:
• Keyword research comes first: identify the phrases travelers actually use when searching for travel services like yours.
• On-page optimization: optimize title tags, meta descriptions, headers, and image alt text with your target keywords (a rough sketch of a simple check is shown after this list).
• Content creation: publish high-quality, informative blog content that naturally incorporates your target keywords.
• Internal linking: link strategically between pages and posts on your site to improve both navigation and SEO.
• Link building: earn quality backlinks from reputable travel-related websites to boost your site's authority.
• Mobile optimization: optimize your website for mobile devices, both for user experience and for rankings.
In a competitive marketplace, this structure and strategy can make your online travel site stand out while offering an enhanced experience for clients.
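As a rough illustration of the on-page checks above, here is a minimal sketch that fetches a couple of pages and flags missing or overly long title tags and meta descriptions. The URLs are placeholders, the regexes are deliberately simple, and the length thresholds are just common rules of thumb, so treat it as a starting point rather than a finished audit tool:

// Minimal sketch, assuming Node 18+ (built-in fetch). The URLs are placeholders
// and the regexes are intentionally simple; a real audit would use an HTML parser.
const pagesToAudit: string[] = [
  "https://qualistay.com/",
  "https://qualistay.com/properties/tenerife/",
];

async function auditPage(url: string): Promise<void> {
  const html = await (await fetch(url)).text();
  const title = /<title[^>]*>([\s\S]*?)<\/title>/i.exec(html)?.[1]?.trim() ?? "";
  const description =
    /<meta[^>]+name=["']description["'][^>]+content=["']([^"']*)["']/i.exec(html)?.[1] ?? "";

  console.log(url);
  console.log(`  title (${title.length} chars): ${title || "MISSING"}`);
  console.log(`  description (${description.length} chars): ${description || "MISSING"}`);
  if (title.length > 60) console.log("  note: title may be truncated in search results");
  if (description.length > 160) console.log("  note: description may be truncated");
}

(async () => {
  for (const url of pagesToAudit) {
    await auditPage(url);
  }
})();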
Yours truly,
Sanskar Gupta
B Two Holidays -
Hello,
We at Donutz Digital, a digital marketing agency, have some clients in the travel niche, so I believe we can help you or others in a similar situation.
Why don't you send us an inquiry directly? We will answer ASAP with possible options and maybe an offer, so you can hand off technical tasks like these and focus on the ones you feel more comfortable with.
-
I am working on my website (https://www.dejourneys.com) and found that some websites like yelp.com and other similar ones require a US phone number and address.
How can I get a strong link from those websites and are there any other ones that can help me get strong backlinks for my travel agency?
Regards,
Raheel. -
Hi,
Cool question! I previously ran a startup that was essentially an aggregator, something similar to an OTA, but we were aggregating classes instead of properties/homestays. I found that the best way to structure the site was something like this:
1. Home (Targeting the biggest, baddest keyword you can find)
https://qualistay.com/
1.2 Category pages
Broad keywords in each category (in your case, 'tenerife south apartments for rent' etc)
You currently have this as https://qualistay.com/properties/tenerife/
I'd have gone with creating multiple 'category' pages like
https://qualistay.com/tenerife-south/apartments
https://qualistay.com/tenerife-south/villas
https://qualistay.com/tenerife-north/apartments
https://qualistay.com/tenerife-north/villas
1.2.1 Sub-Category pages
Still relatively broad, but more specific keywords
You didn't choose to sub-categorize these pages even more, but here's what I would have done:
https://qualistay.com/tenerife-south/apartments/adeje
https://qualistay.com/tenerife-south/villas/adeje
https://qualistay.com/tenerife-south/apartments/arico
https://qualistay.com/tenerife-south/villas/arico
https://qualistay.com/tenerife-south/apartments/granadilla-de-abona
1.2.1.1 Property pages
Specific keywords
https://qualistay.com/tenerife-south/villas/playa-de-las-americas/villa-victoria
These pages would tend to target the so-called 'brand keyword' of each individual property.
Structuring your site this way enables you to include the targeted keywords in your URLs and to rank almost every single page efficiently based purely on the location of each property. In this manner, you would be able to rank for the top-tier keywords, which I'm guessing are 'tenerife villas' and 'tenerife apartments', the 2nd-tier keywords, which would be 'tenerife south villas for rent' and 'tenerife south apartments for rent', and the 3rd-tier keywords, which would be 'playa de las americas villas for rent'. You also get the benefit of ranking for each individual property's 'brand name', like 'villa victoria tenerife south'.
If several properties happen to be in the same building, then you can sub-categorize even further, like
https://qualistay.com/tenerife-south/villas/playa-de-las-americas/villa-victoria/level-1
https://qualistay.com/tenerife-south/villas/playa-de-las-americas/villa-victoria/level-2
Hope this helps!
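To make the mapping concrete, here is a rough sketch of how property attributes could be turned into that URL hierarchy. The interface, field names, and example values are hypothetical, not taken from your site:

// Sketch: build hierarchical URL paths from property attributes, mirroring the
// /area/type/municipality/property structure above. Fields and values are examples.
interface Property {
  area: string;         // e.g. "Tenerife South"
  type: "villas" | "apartments";
  municipality: string; // e.g. "Playa de las Américas"
  name: string;         // e.g. "Villa Victoria"
}

const slugify = (text: string): string =>
  text
    .toLowerCase()
    .normalize("NFD")
    .replace(/[\u0300-\u036f]/g, "") // strip accents (é -> e)
    .replace(/[^a-z0-9]+/g, "-")
    .replace(/(^-|-$)/g, "");

function propertyPath(p: Property): string {
  return `/${slugify(p.area)}/${p.type}/${slugify(p.municipality)}/${slugify(p.name)}`;
}

const example: Property = {
  area: "Tenerife South",
  type: "villas",
  municipality: "Playa de las Américas",
  name: "Villa Victoria",
};

console.log(propertyPath(example));
// -> /tenerife-south/villas/playa-de-las-americas/villa-victoria

Keeping each path segment generated from a controlled list of areas, types, and municipalities is what lets every level of the URL double as a keyword-targeted category page.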
Related Questions
-
Tools/Software that can crawl all image URLs in a site
Excluding Screaming Frog, what other tools/software can be used to crawl all image URLs on a site? In Screaming Frog, image URLs that are not under the site domain don't get crawled. Example of an image URL outside the client site: http://cdn.shopify.com/images/this-is-just-a-sample.png. If the client is http://www.example.com, Screaming Frog only crawls images under it, like http://www.example.com/images/this-is-just-a-sample.png.
Technical SEO | | jayoliverwright -
How can I block incoming links from a bad website?
Hello all, We got a new client recently who had a warning in Google Webmaster Tools for a manual soft penalty. I did a lot of searching and found one particular site that sends roughly 100k links to one page and is potentially a high-risk site. I wish to block those links from coming into my site, but their webmaster is nowhere to be seen and I do not want to use the disavow tool. Is there a way I can add code to our .htaccess file, or any other method? Would appreciate anyone's immediate response. Kind Regards
Technical SEO | | artdivision -
Correct linking to the /index of a site and subfolders: what's the best practice? Link to domain.com/ or domain.com/index.html?
Dear all, starting with my .htaccess file:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www.inlinear.com$ [NC]
RewriteRule ^(.*)$ http://inlinear.com/$1 [R=301,L]
RewriteCond %{THE_REQUEST} ^.*/index.html
RewriteRule ^(.*)index.html$ http://inlinear.com/ [R=301,L]
1. I redirect all URL requests with www. to the non-www version.
2. All requests for "index.html" are redirected to "domain.com/".
My questions are:
A) When linking from a page to my front page (home), the best practice is "http://domain.com/" and NOT "http://domain.com/index.php", right?
B) When linking to the index of a subfolder such as "http://domain.com/products/index.php", I should also link to "http://domain.com/products/" and not include the index.php, right?
C) When I define the canonical URL, should I define it simply as "http://domain.com/products/", or should I in this case point to the actual file "http://domain.com/products/index.php"?
Are A) and B) best practice? And C)? Thanks for all replies! 🙂
Holger
Technical SEO | | inlinear -
Best URL-structure for ecommerce store?
What structure would you recommend for the product pages? Let's take an example with the keyword "Luxim FZ200".
With category in the URL:
www.myelectronicshop.com/digital-cameras/luxim-FZ200.html
With a /product prefix:
www.myelectronicshop.com/product/luxim-FZ200.html
Without category in the URL:
www.myelectronicshop.com/luxim-FZ200.html
I have read in a blog post that Paddy Moogan recommends /luxim-FZ200.html, and I think I prefer this version too. But I can see that many of the bigger ecommerce stores use a /product prefix before the product name. What is the reason for this, and what is best practice?
Technical SEO | | gojesper -
Does posting an article on multiple sites hurt SEO?
A client of mine creates thought leadership articles and pitches multiple sites to host the article to reach different audiences. The sites that pick it up are places such as AdAge and MarketingProfs, and we do get link juice from these sources most of the time. Does having the same article on these sites as well as your own hurt your SEO efforts in any way? Could it be recognized as duplicate content? I know the links are great; I'm just wondering if there are any other side effects, especially when no links are provided. Thank you!
Technical SEO | | Scratch_MM -
How should I structure a site with multiple addresses to optimize for local search?
Here's the setup: we have a website, www.laptopmd.com, and we're ranking quite well in our geographic target area. The site is chock-full of local keywords and has the address properly marked up (HTML5 and schema.org compliant, near the top of the page, etc.). It's all working quite well, but we're looking to expand to two more locations, and we're terrified that adding more addresses and playing with our current setup will wreak havoc with our local search results, which quite frankly currently rock. My questions are: 1) When it comes time to do sub-pages for the new locations, should we strip the location information from the main site and put up local pages for each location in subfolders? 1a) Should we use subdomains instead of subfolders to keep Google from becoming confused? 2) Should we consider simply starting identically branded pages for the individual locations and hope that exact-match, location-based URLs will make up for the duplicate-content hit and overcome the difficulty of building a brand from multiple pages? I've tried to look for examples of businesses that have tried to do what we're doing, but all the advice has been about organic search, which I already have the answer to. I haven't been able to find a good example of a small business with multiple locations AND good rankings for each location. Should this serve as a warning to me?
Technical SEO | | LMDNYC -
Best geotargeting strategy: Subdomains or subfolders or country specific domain
How have the relatively recent changes in how Google perceives subdomains changed the best route to on-site geotargeting, i.e. not building out new country-specific sites on country-specific, locally hosted domains, and instead developing sub-domains or sub-folders and geotargeting those via Webmaster Tools? In other words, given the recent change in Google's perception, are sub-domains now a better option than sub-folders, or is there not much in it? Also, if a client has a .co.uk and they want to geotarget, say, France, is the sub-domain/sub-folder route still an option, or is the .co.uk still too UK-specific, meaning these options would only work with a .com? In other words, can sites on country-specific domains (.co.uk, .fr, .de, etc.) use sub-folders or sub-domains to geotarget other countries, or do they have no option other than to develop new country-specific (domain/hosting/language) websites? Any thoughts regarding current best practice in this regard much appreciated. I have seen last Feb's Whiteboard Friday, which covers geotargeting in depth, but the way Google perceives subdomains has changed since then. Many thanks, Dan
Technical SEO | | Dan-Lawrence -
What is the best method to block a sub-domain, e.g. staging.domain.com/ from getting indexed?
Now that Google considers subdomains part of the TLD, I'm a little leery of testing robots.txt on staging.domain.com with something like:
User-agent: *
Disallow: /
for fear it might get www.domain.com blocked as well. Has anyone had any success using robots.txt to block sub-domains? I know I could add a meta robots tag to the staging.domain.com pages, but that would require a lot more work.
Technical SEO | | fthead9