Which One Would You Suggest in Terms of Internationalization?
-
Hi Friends,
This is my website: http://goo.gl/fYndv. As of now we have only one domain, with content in both English and Arabic; the Arabic is translated from the English, so we use hreflang alternate tags to signal this to Google. We mostly receive traffic from Saudi Arabia because we are based there. Now we are planning to target major countries like India, Australia and so on. We understand that sub-folders are generally preferable to sub-domains, e.g. example.com/in/ over in.example.com.
But we are not going to change any content; only the currency changes in those geographic sub-folders or sub-domains. Since the content stays the same, would it still be fine to go with a sub-folder like example.com/in/? Is there any chance of a Google penalty?
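For context, the alternate-tag setup described above, extended to the proposed /in/ sub-folder, would typically look something like this on every page (example.com stands in for the real domain here, and the paths are illustrative):

```html
<!-- In the <head> of each language/region version of the page -->
<link rel="alternate" hreflang="en" href="https://example.com/page/" />
<link rel="alternate" hreflang="ar" href="https://example.com/ar/page/" />
<link rel="alternate" hreflang="en-in" href="https://example.com/in/page/" />
```

Each regional version carries the same full set of tags, so the annotations are reciprocal.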
-
I'm a believer in subfolder > subdomain
-
Prabhu, based on the information you have provided, I would say going with subdomains is going to hurt you, because you would be displaying the same content on two different URLs where the only change is the currency.
Even if you have been hit by Panda, I would still go with the idea Oleg suggested.
Hope this helps!
-
Hi Oleg,
Thanks for your valuable reply. My website was affected by Panda 4.2. Should I go with a sub-folder or a sub-domain in this case?
-
In theory, no. An ecommerce store can have en-us and en-gb as hreflang values, which would have the same content but a different currency. That is totally acceptable.
Sounds like you are in the same scenario.
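A minimal sketch of the en-us / en-gb scenario described above (the store URL is illustrative): the two pages can share the same English copy and differ only in currency, as long as each page annotates both versions:

```html
<!-- Same product page, same English copy; only the price/currency differs -->
<link rel="alternate" hreflang="en-us" href="https://store.example.com/product/" />
<link rel="alternate" hreflang="en-gb" href="https://store.example.com/gb/product/" />
```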
Related Questions
-
Suggestions on Link Auditing a 70,000 URL list?
I have a website with nearly 70,000 incoming links, since it's a somewhat large site that has been online for 19 years. The rate I was quoted for a link audit from a reputable SEO professional was $2 per link, and clearly I don't have $140,000 to spend on a link audit 🙂 !! I was thinking of asking you guys for a tutorial that is the gold standard for link-auditing checklists, and doing it myself. But then I thought maybe it's easier to shorten the list by knocking out all the "obviously good" links first. My only concern is that I be 100% certain they are good links. Is there an "easiest approach" to take for shortening this list, so I can give it to a professional to handle the rest?
Intermediate & Advanced SEO | HLTalk
-
How does one submit data to the Knowledge Graph?
I'm working with a very reputable open-source civic data compiler who'd like to give their data to the Knowledge Graph for it to be used. Does anybody know where I should start with this? Or do you think it's possible to e-mail Google and ask to be included in the Knowledge Graph? The company that owns this compiler will likely have connections to them. Thanks!
Intermediate & Advanced SEO | Edward_Sturm
-
Linking to one of my own sites, from my site
Hi experts, I own a site for casting jobs (Site 1) and a site for selling paintings (Site 2). For a long time I've had a link at the bottom of Site 1 linking to Site 2 (basically: "Partner link: <link to Site 2>"). Site 1 is the only important site for me, since it's where I'm making my monthly revenue. I added the link about 5 years ago to try to boost Site 2. My questions are:
1. Is it somehow bad for SEO for Site 1, since the two sites have nothing to do with each other and are basically just both owned by me?
2. Would it make sense to link from Site 2 to Site 1 instead?
Intermediate & Advanced SEO | KasperGJ
-
Targeting two search terms with same intent - one or more pages for SEO benefits?
I'd like some professional opinions on this topic. I'm looking after the SEO for my friend's site, and there are two main search terms we are looking to boost in search engines. The company sells billboard advertising space to businesses in the UK. Here are the two search terms we're looking to target:

- Billboard Advertising: 880 searches per month
- Outdoor Advertising: 720 searches per month

It would usually make sense to make a separate page to target the keyword "billboard advertising" on its own fully optimised landing page, with more information on the topic and a targeted URL (www.website.com/billboard-advertising/), and for the homepage to target "outdoor advertising", as it's an outdoor advertising agency. But there's a problem: as both search terms are highly related and have the same intent, I'm worried that if we create a separate page to target "billboard advertising", it will conflict with the homepage targeting "outdoor advertising". Also, the main competitors currently ranked in positions 1-3 are ranking with their home pages, not optimised landing pages targeting the exact search term "billboard advertising". Any advice on this?
Intermediate & Advanced SEO | Jseddon92
-
HTTPS for the entire domain vs. one URL
Long time no Moz! I've been away with some server-related issues, installing an AD at the company I work for, but I'm back. Our SSL cert just expired and I'm trying to determine the pros and cons of making the entire site SSL vs. just one URL. Our previous setup was just a single domain. I know Google has hinted at an SSL preference, and I know it's a little early to know for certain how much that's going to help, but I just wanted to know what everybody thought? It expired yesterday, so I have to do something. And we lost our previous credentials, so I can't just renew the old one. Thanks!
Intermediate & Advanced SEO | HashtagHustler
-
Many changes carried out - Caused a rankings dip one week on?
Hi all! Last week various changes were carried out on a website which gets large traffic daily. The changes included:

- Restructure of product meta titles (~500)
- Restructure of product meta descriptions (~500)
- Canonical link added on the majority of pages (~600)

The meta details have only been changed mildly. The title has gone from

Name | Product Code | Category | Company Name

to

Name | Category | Collection | Company Name

If the title is shorter than 62 characters, the price gets added in:

Name | Category | Collection | Price | Company Name

Descriptions have been cut short: if a description is over 156 characters, the word at the limit is cut off and replaced with "...".

Canonical links: the reasoning for adding a canonical link to each product was to point Google in the right direction. Upper- and lowercase versions of each URL are reachable, with .aspx / .html at the end of each page, so a canonical link was added in the meta directing to the lowercase version of the URL.

Rankings have been reduced across various keywords; however, a few have had a positive impact. Would you say this is a dip before a mild rise, or is something I've done harming the rankings that needs to be reversed? I'd appreciate any advice given. Many thanks!
Intermediate & Advanced SEO | Whittie
-
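For reference, the canonical setup described in that question, pointing the mixed-case and mixed-extension variants at one preferred lowercase URL, would look like this (the URL is illustrative, not from the original post):

```html
<!-- Placed in the <head> of every reachable variant of the page,
     e.g. /Products/Widget.ASPX and /products/widget.html -->
<link rel="canonical" href="https://www.example.com/products/widget.html" />
```

The same tag goes on every variant, including the preferred URL itself, so all signals consolidate onto one version.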
To index or de-index internal search results pages?
Hi there. My client uses a CMS/e-commerce platform that is automatically set up to index every single internal search results page on search engines. This was supposedly built as an "SEO-friendly" feature, in the sense that it creates hundreds of new indexed pages to send to search engines that reflect various terminology used by existing visitors of the site. In many cases these pages have proven to outperform our optimized static pages, but there are multiple issues with them:

- The CMS does not allow us to add any static content to these pages, including titles, headers, metas, or copy on the page.
- The query typed in by the site visitor always becomes part of the title tag / meta description on Google. If the visitor's internal search query contains any less-than-ideal terminology that we wouldn't want other users to see, their phrasing is out there for the whole world to see, causing lots and lots of ugly terminology floating around on Google that we can't affect.

I am scared to do a blanket de-indexation of all /search/ results pages, because we would lose the majority of our rankings and traffic in the short term while trying to improve the ranks of our optimized static pages. The ideal is to really move up our static pages in Google's index and, when their performance is strong enough, to de-index all of the internal search results pages. But for some reason Google keeps choosing the internal search results page as the "better" page to rank for our targeted keywords. Can anyone advise? Has anyone been in a similar situation? Thanks!
Intermediate & Advanced SEO | FPD_NYC
-
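If the poster does eventually de-index the internal search results, the standard mechanism is a robots meta tag on the search-results template (the /search/ path comes from the question; the rest is a generic sketch):

```html
<!-- On every internal search results page, e.g. /search/?q=... -->
<!-- noindex removes the page from Google's index; follow still lets
     crawlers pass through its links to the static pages. -->
<meta name="robots" content="noindex, follow">
```

One caveat: don't also block /search/ in robots.txt at the same time, because Googlebot then can't crawl the pages and will never see the noindex directive.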
International SEO - cannibalisation and duplicate content
Hello all, I look after (in house) three domains for one niche travel business, across three TLDs: .com, .com.au and .co.uk, plus a fourth domain on a .co.nz TLD which was recently removed from Google's index.

Symptoms: For the past 12 months we have been experiencing cannibalisation in the SERPs (namely .com.au being rendered in .com results) and Panda-related ranking devaluations between our .com site and .com.au site. Around 12 months ago the .com TLD was hit hard (80% drop in target KWs) by Panda (probably) and we began to action the changes below. Around 6 weeks ago our .com TLD saw big overnight increases in rankings (to date a 70% averaged increase). However, by almost the same percentage the .com TLD gained, we suffered significant drops in our .com.au rankings. Basically Google seemed to switch its attention from the .com TLD to the .com.au TLD. Note: each TLD is over 6 years old, we've never proactively gone after links (Penguin), and we have always aimed for quality in an often spammy industry.

Have done:

- Added hreflang markup to all pages on all domains
- Each TLD uses local vernacular, e.g. the .com site is American
- Each TLD has pricing in the regional currency
- Each TLD has details of the respective local offices, the copy references the location, and we have significant press coverage in each country, like The Guardian for our .co.uk site and the Sydney Morning Herald for our Australian site
- Targeted each site to its respective market in WMT
- Each TLD's core pages (within 3 clicks of the primary nav) are 100% unique
- We're continuing to re-write and publish unique content to each TLD on a weekly basis
- As the .co.nz site drove so little traffic, instead of re-writing we added noindex, and that TLD has almost completely disappeared (16% of pages remain) from the SERPs
- XML sitemaps
- Google+ profile for each TLD

Have not done:

- Hosted each TLD on a local server
- De-duplicated the roughly 600 pages per TLD (about 50% of all content) that are duplicated across all TLDs; these are way down the IA but still duplicated
- Sourced images/video from local servers
- Added address and contact details using schema markup

Any help, advice or just validation on this subject would be appreciated! Kian
Intermediate & Advanced SEO | team_tic
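The cross-TLD hreflang markup mentioned under "Have done", for three same-language regional sites plus a default, would typically look like this on every page (the domains here are illustrative stand-ins, since the real ones aren't given in the question):

```html
<!-- Repeated in the <head> of each regional version of the page -->
<link rel="alternate" hreflang="en-us" href="https://example.com/tours/" />
<link rel="alternate" hreflang="en-au" href="https://example.com.au/tours/" />
<link rel="alternate" hreflang="en-gb" href="https://example.co.uk/tours/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/tours/" />
```

The annotations must be reciprocal: all three regional pages carry the same full set, or Google ignores them. This complements, but does not replace, making the core pages unique per TLD.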