Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Best URL structure for SEO for Malaysian/Singapore site on .com.au domain
-
Hi there
I know ideally I need a .my or .sg domain; however, I don't have time to do this in the interim, so what would be the best way to host Malaysian content on a www.domainname.com.au website?
www.domainname.com.au/en-MY
www.domainname.com.au/MY
domainname.com.au/malaysia
malaysia.domainname.com.au
my.domainname.com.au
I'm assuming this can't make the .com.au site look spammy, but I thought I'd ask just to be safe?
Thanks in advance!
-
Google has stated it is now better at relating subdomain content to the root domain, but you're probably still better off using a subfolder. If you do go with a subdomain, make sure to link the two hostnames in your Google Analytics configuration for better reporting.
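If you do end up tracking a subdomain alongside the main site, one way to keep reporting stitched together is the linker setting. A minimal sketch, assuming a current gtag.js setup rather than whatever analytics snippet the site actually runs; the measurement ID and hostnames are placeholders:

```typescript
// Minimal sketch: configure gtag.js so the root domain and the Malaysian
// subdomain are measured together. Measurement ID and hostnames are
// placeholders, not the poster's real values.
declare function gtag(...args: unknown[]): void;

gtag('config', 'G-XXXXXXXXXX', {
  linker: {
    // Link decoration keeps sessions intact across these hostnames.
    domains: ['www.domainname.com.au', 'my.domainname.com.au'],
  },
});
```

Note that subdomains sharing a root domain usually share the analytics cookie already, so this mainly matters if sessions look fragmented in reports.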
There are so many languages spoken in Malaysia that the domain question alone is really not going to help visitors to your site. If you're going for consistency, you'd need subfolders or subdomains for every supported language. If you're only publishing in English, I wouldn't even consider that change. Simply use www.domainname.com.au/malaysia/ to host content relevant to this market until you can get your country-code TLD. This sends the clearest signal to both search engines and site visitors that the content is targeted at Malaysia.
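To make the country targeting explicit once a /malaysia/ section exists, hreflang annotations are the usual complement to this structure (hreflang isn't mentioned above, so treat it as an added suggestion). A rough sketch that builds the alternate link tags, with hypothetical paths:

```typescript
// Rough sketch: generate hreflang link tags pairing an Australian page with
// its Malaysian counterpart under a /malaysia/ subfolder. URLs are
// hypothetical placeholders, not the poster's real structure.
interface Alternate {
  hreflang: string; // language-region code, e.g. "en-au" or "en-my"
  href: string;     // absolute URL of that variant
}

function hreflangTags(alternates: Alternate[]): string {
  return alternates
    .map((a) => `<link rel="alternate" hreflang="${a.hreflang}" href="${a.href}" />`)
    .join('\n');
}

console.log(
  hreflangTags([
    { hreflang: 'en-au', href: 'https://www.domainname.com.au/services/' },
    { hreflang: 'en-my', href: 'https://www.domainname.com.au/malaysia/services/' },
    { hreflang: 'x-default', href: 'https://www.domainname.com.au/services/' },
  ]),
);
```

Each variant needs to list the full set of alternates (including itself), and the tags go in the head of every page in the set.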
-
I'd choose one of the top three options listed, as subdirectories are going to be better associated with the root domain than subdomains. Moz has done several tests of this, with one of the latest recaps here: http://moz.com/blog/subdomains-vs-subfolders-rel-canonical-vs-301-how-to-structure-links-optimally-for-seo-whiteboard-friday. From Rand's WBF:
You're asking, "Should I put my content on a subdomain, or should I put it in a subfolder?" Subdomains can be kind of interesting sometimes because there's a lot less technical hurdles a lot of the time. You don't need to get your engineering staff or development staff involved in putting those on there. From a technical operations perspective, some things might be easier, but from an SEO perspective this can be very dangerous. I'll show you what I mean.
So let's say you've got blog.yoursite.com or you've got www.yoursite.com/blog. Now engines may indeed consider content that's on this separate subdomain to be the same as the content that's on here, and so all of the links, all of the user and usage data signals, all of the ranking signals as an entirety that point here may benefit this site as well as benefiting this subdomain. The keyword there is "may."
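If a site does start on a subdomain and is later folded into a subfolder, the usual mechanism is a one-to-one 301 redirect from each subdomain URL to its subfolder equivalent. A minimal sketch, assuming an Express front end and the hypothetical hostnames from the quote:

```typescript
import express from 'express';

const app = express();

// Minimal sketch: permanently redirect blog.yoursite.com/* to the matching
// path under www.yoursite.com/blog/. Hostnames are hypothetical.
app.use((req, res, next) => {
  if (req.hostname === 'blog.yoursite.com') {
    res.redirect(301, `https://www.yoursite.com/blog${req.originalUrl}`);
    return;
  }
  next();
});

app.listen(3000);
```

In practice this usually lives at the web server or CDN layer rather than in application code, but the mapping is the same either way.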
Cheers!