Alexa Rank and Linking from Article sites.
-
We are creating unique content and submitting our articles to article sites. I have some questions about the best way to go about this.
1. We are being very careful to create unique content for each submission - so we are not submitting the same article to multiple sites. Each submission is unique, so 1 article per 1 article directory.
2. When I did my research on these article sites at Alexa.com, I noticed that a lot of them rank very well globally, but that for many of them India is their #1 country in Alexa. They are still ranked very highly for other countries; for example, a site may have a 9,000 Alexa rank in India and an 18,000 rank in the U.S., which is still very high.
3. We are trying to reach U.S. customers mostly, so I am wondering if we are still getting value by linking to these sites who have global reach (even though they are ranked best for India).
I would think this is still very beneficial, but I didn't want to attract the wrong kind of traffic by getting links from sites that get their traffic primarily from India, even though they also get tons of traffic from the U.S. I am assuming this is OK, because an 18,000 or 19,000 Alexa rank in the U.S. is still excellent and we would benefit from it, but I wanted to be sure.
Feedback?
-
Why wouldn't you submit to multiple high-ranking content directory sites? If a few pick you up, great...
Are you worried about a duplicate content penalty? Are you using canonical tags to avoid it?
-
Hi James, that's what most people are telling me about Alexa. But I still haven't had a single answer to the India question, which was my main concern. Thank you.
-
Thanks for the input. I agree it's a good method. I think the reason there has been debate is that people usually submit the same article to dozens or hundreds of sites. Someone alerted me to the fact that it's better to write one unique article per site and not submit it elsewhere. That seems to be working so far, because it's good original content.
-
Alexa is not the best way to evaluate a website; it only looks at people who use the Alexa toolbar, and it is evident that many Indian webmasters are using it.
I would look at OSE data for the domain, the PageRank, how many pages it has, and, most importantly, check out other articles on the site to see the kind of pages your article will be going on...
-
Got ya. I misunderstood the original post. In that case, there's nothing wrong with what you're doing, and it's a fairly popular method for building links (although there's a bit of an ongoing debate about it).
Best of luck!
-
I'm just looking to increase search rankings in this instance. I'm doing so by providing quality and unique content to article sites with a link back to our site.
-
Yes, it is compatible with both Firefox and Chrome. You can find it here: http://www.seomoz.org/seo-toolbar
I'm not sure we're distinguishing between traffic from the article sites and traffic from Google search. Are you looking to bring visitors directly from the article sites or are you looking to increase your search rankings?
-
Thanks for the info, Julie. I'll look into MozRank, but the question remains about sites that have strong traffic from countries like India. Is it bad to get links from these sites even if they also rank in the U.S.? My experience so far is that these sites have helped: they have a strong presence globally, and even if the majority of their traffic is from India, they also seem to have a strong share in other countries. Any thoughts on that?
-
Jeffrey,
Thanks for the suggestion. Where do I get the MozRank extension, and does it work with Firefox?
What is wrong with article sites if they have PR4-7 and the content submitted is unique? We are seeing tangible results so far. We also do daily blogs and content on our own site.
-
I would not use Alexa as your basis for evaluating site strength; in my experience, Alexa numbers are not only wildly inaccurate, they're not even useful qualitatively. For example, one site I own gets about 17k visits per day and has an Alexa rank of 33,000 in the US. Another site I work on gets about 100 visits per day and has an Alexa rank of 32,000.
The two sites are miles apart, but Alexa not only doesn't see that, it actually misjudges which is more popular. This is true again and again with Alexa rankings. I imagine the problem is the incredibly small sample set of toolbar users, combined with the fact that there's probably some niche bias among those users.
MozRank and MozTrust are both far better metrics for the SEO benefit of a link (as is just searching for various keywords and seeing if the directory actually ranks, which I'll bet it doesn't, being an article directory). I haven't yet seen a good third-party source for the actual traffic of a site.
-
Have you considered using MozRank instead of Alexa? It might be a better metric, plus it's easy to see if you use the MozBar extension for your browser. I definitely recommend it instead if you're attempting to obtain links for SEO value on any site (not limited to article sites).
I'm not sure I would recommend article marketing for traffic like you're going after. Creating great content on your own site or guest posting on related industry blogs will almost certainly be a better strategy than submitting to general article sites.
Related Questions
-
Important pages are being 302 redirected, then 301 redirected to support language versions. Is this affecting negatively the linking juice distribution of our domain?
Hi mozzers, prior to my arrival, in order to better serve the international locations with multiple language versions of the same content, the company decided to restructure its URLs around locales. We went from https://example.com/subfolder to:
https://example.com/us/en-us/new-subfolder (US)
https://example.com/ca/en-us/new-subfolder (CAN)
https://example.com/ca/fr-ca/new-subfolder (CAN)
https://example.com/de/en-us/new-subfolder (GER)
https://example.com/de/de-de/new-subfolder (GER)
This had implications for redirecting old URLs to new ones. All important URLs such as https://example.com/subfolder were 302 redirected to https://example.com/us/en-us/subfolder and then 301 redirected to the final URL. According to the devs: if you change the translation of the page or the locale, then a 302 needs to happen so you see the same version of the page in German or French, and then a 301 redirect happens from the legacy URL to the new version. If the 302 redirect was skipped, you would only be able to see one version/language of that page.
For instance:
http://example.com/subfolder/state/city --> 301 redirect to [LEGACY URL]
https://example.com/subfolder/state/city --> 302 redirect to
https://example.com/en-us/subfolder/state/city --> 301 redirect to
https://example.com/us/en-us/new-subfolder/city-state [NEW URL]
I am wondering if these 302s are hurting our link juice distribution, or whether that is completely fine since they all end up as a 301 redirect? Thanks.
International SEO | Ty1986 -
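Whether the intermediate 302 matters is easier to reason about if chains are flagged mechanically. Below is a minimal sketch (the function name is made up, and the sample chain is built from the hypothetical URLs in the question) that summarizes a recorded redirect chain and flags any temporary hops:

```python
def audit_redirect_chain(hops):
    """Summarize a redirect chain recorded from a crawl.

    hops: list of (status_code, url) pairs in the order they were followed.
    A 302/307 hop is temporary; a 301/308 hop is permanent, which is the
    kind usually recommended for consolidating link equity.
    """
    statuses = [status for status, _ in hops]
    return {
        "hops": len(hops),
        "has_temporary": any(s in (302, 307) for s in statuses),
        "all_permanent": bool(hops) and all(s in (301, 308) for s in statuses),
        "final_url": hops[-1][1] if hops else None,
    }

# The chain described in the question: a temporary hop, then a permanent one.
chain = [
    (302, "https://example.com/en-us/subfolder/state/city"),
    (301, "https://example.com/us/en-us/new-subfolder/city-state"),
]
summary = audit_redirect_chain(chain)
print(summary["has_temporary"])  # True: the 302 hop gets flagged
```

Running something like this over a crawl export makes it easy to list every URL whose chain mixes temporary and permanent hops, which is exactly the situation the question describes.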
Is it compulsory to use hreflang attribute for Multilingual site? What if I do not use such tag?
Hello everybody, my main site is abcd.co.uk, and the other sites are subdomains like se.abcd.co.uk, fr.abcd.co.uk, es.abcd.co.uk, etc. If I do not use hreflang on a multilingual site, will Google treat the subdomains as duplicate sites? The content of the sites is in different languages. Thanks!
International SEO | wright3350 -
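For a subdomain setup like the one in the question, the hreflang annotations can be generated mechanically. A minimal sketch (the hostnames come from the question; the page path, helper name, and language codes are assumptions, with sv being the ISO 639-1 code for the Swedish se.abcd.co.uk subdomain):

```python
def hreflang_tags(page_path, locale_hosts, default_host=None):
    """Build the <link rel="alternate" hreflang="..."> tags that every
    language version of a page should carry in its <head>."""
    tags = [
        f'<link rel="alternate" hreflang="{lang}" href="https://{host}{page_path}" />'
        for lang, host in sorted(locale_hosts.items())
    ]
    if default_host:
        # x-default marks the version to show users whose locale matches nothing.
        tags.append(
            f'<link rel="alternate" hreflang="x-default" href="https://{default_host}{page_path}" />'
        )
    return tags

# Hostnames from the question; ISO 639-1 language codes are an assumption.
hosts = {
    "en": "abcd.co.uk",
    "sv": "se.abcd.co.uk",
    "fr": "fr.abcd.co.uk",
    "es": "es.abcd.co.uk",
}
for tag in hreflang_tags("/products/", hosts, default_host="abcd.co.uk"):
    print(tag)
```

Note that each language version needs the full set of tags, including a self-referencing one, which is why generating them from one mapping beats hand-editing each subdomain.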
Wants to Rank Well in Multiple Countries
We have been using a Moz subscription for the last few months and are quite happy so far. We wanted to explore more of it and understand how to approach optimization so that we rank well in the US, while at the same time not losing what we have achieved until now. I have read some blog posts and found that changing the international targeting in GWT can also hurt. My vision is to rank well in multiple countries but at the same time not lose rankings in India. Any help here would be appreciated, so whatever we do, we do it right.
International SEO | fourseven0 -
International Sites and Duplicate Content
Hello, I am working on a project where I have some doubts regarding the structure of international sites and multiple languages. The website is in the fashion industry, and I think this is a common problem for the industry. The website is translated into 5 languages and sells in 21 countries. As you can imagine this creates a huge number of URLs, so many that with Screaming Frog I can't even complete the crawl. For example, the UK site is visible in all of these versions:
http://www.MyDomain.com/en/GB/
http://www.MyDomain.com/it/GB/
http://www.MyDomain.com/fr/GB/
http://www.MyDomain.com/de/GB/
http://www.MyDomain.com/es/GB/
Obviously for SEO only the first version is important. As another example, the French site is available in 5 languages:
http://www.MyDomain.com/fr/FR/
http://www.MyDomain.com/en/FR/
http://www.MyDomain.com/it/FR/
http://www.MyDomain.com/de/FR/
http://www.MyDomain.com/es/FR/
And so on. This is creating 3 issues mainly: endless crawling, with crawlers not focusing on the most important pages; duplication of content; and wrong geo URLs ranking in Google. I have already implemented hreflang but didn't notice any improvements. Therefore my question is: should I exclude the non-appropriate targeting with robots.txt and noindex? For example, for the UK leave crawlable just the English version, i.e. http://www.MyDomain.com/en/GB/, and for France just the French version, http://www.MyDomain.com/fr/FR/, and so on. What I would like to achieve is to have the crawlers more focused on the important SEO pages, avoid content duplication, and prevent the wrong URLs from ranking on local Google. Please comment.
International SEO | guidoampollini0 -
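The scale of the duplication in a setup like this can be seen by enumerating the language by country matrix directly. A minimal sketch (the domain comes from the question's example URLs; the country-to-language mapping is an assumption for illustration):

```python
from itertools import product

languages = ["en", "it", "fr", "de", "es"]
countries = ["GB", "FR", "IT", "DE", "ES"]  # a subset of the 21 markets

# Every language renders under every country folder, so one piece of
# content exists at len(languages) * len(countries) URLs.
all_urls = [
    f"http://www.MyDomain.com/{lang}/{country}/"
    for lang, country in product(languages, countries)
]

# One way to decide what stays crawlable: treat the local language of
# each country as the primary version (an assumed mapping, for illustration).
primary_language = {"GB": "en", "FR": "fr", "IT": "it", "DE": "de", "ES": "es"}
primary_urls = {
    country: f"http://www.MyDomain.com/{lang}/{country}/"
    for country, lang in primary_language.items()
}

print(len(all_urls))       # 25 URL variants for just these five countries
print(primary_urls["FR"])  # http://www.MyDomain.com/fr/FR/
```

With the full 5 languages across 21 countries the same logic yields 105 variants per page, which is why the crawl never finishes and why picking one primary version per country matters.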
Multilingual Site with 2 Separate domains and hand-translated
I have 2 separate domains: .com and .jp. I am having a professional translator translate the English material from the .com site. However, the .jp site will have the same pictures and videos as the .com, which means the alt tags and video titles are in English. I have some dynamic pages where I use Google Translate, and those pages I set to "noindex, follow" to avoid duplicate issues; they are not very important pages for me anyway. Question: since I am doing a proper translation, with no machines involved, can I leave the pages as is, or should I include any of these: 1) ISO language codes 2) www.example/com/" /> Even though hand translated, the translation will probably be 85% similar to what I would get using Google Translate. Will that potentially be seen as duplicate content, or not at all since I have not used the Google Translate tool? I wonder from which angle Google analyses this. Thank you.
International SEO | khi5 -
How to replace my .co.uk site with my .com site in the US Google results
My customer and I are based in the UK. My customer's site, www.blindbolt.co.uk has been around for years. Last year we launched their American site, www.blindboltusa.com. Searching on google.com (tested both via proxy and using the gl=us querystring trick), a search for blind bolt on the US Google returns our www.blindbolt.co.uk site. We would like it to show our www.blindboltusa.com website in US searches. Webmaster tools has the Geographic Target set correctly for each site. Does anyone have any ideas or suggestions please? Thanks.
International SEO | | OffSightIT0 -
International Hub site: .uk vs domain vs subdomain
Financial company with 2 sites:
1- Mybrand.com for the US market.
2- global.mybrand.com, the hub for international, with a selection of 10 languages: a drop-down allows choosing between mybrand.jp, mybrand.fr, etc.
Now we have the opportunity to redesign the site from zero, and I am exploring getting rid of the subdomain for the global site. What would be your preference to use as the international hub?
a) mybrand.co.uk: I would have to use lawyers to get the URL from a squatter.
b) mybrandGlobal.com: the URL is easy to get and can be geo-targeted using Google Webmaster Tools. Cons: it might not rank as well as .co.uk in the UK, which is our biggest market.
c) global.mybrand.com: pros: keep using it because it is aged and has some authority, and Google might now see subdomains as part of the TLD, making it a valid way to separate international from US. Cons: SEO best practices advise avoiding subdomains because they might not pass full link value across domains. There is no really different content on the subdomain; it is just the hub for international.
Thanks in advance for the help.
International SEO | FXDD -
Multilingual site - separate domain or all under the same umbrella
This has been asked before with no clear winner. I am trying to sum up the pros and cons of doing a multilingual site sharing the same domain for all languages versus breaking it into dedicated domains. As an example, assume we are talking about a French property portal with an English version as well, and assume most of the current incoming links and traffic are from France.
A) www.french-name.fr/fr/pageX for the French version; www.english-name.com/en/pageX for the English version
B) www.french-name.fr/fr/ for the French version (as is); www.french-name.fr/en for the English version
The client currently follows approach A but is thinking of moving towards B. We see the following pros and cons for B:
- take advantage of the french-name.fr domain strength and incoming links
- scalable: can add more languages without registering a domain and building search engine position for each one individually
- potential issues with duplicate content, as we are not able to geo-target differently in Google Webmaster Tools
- potential dilution of each page's strength, as we will now have many more pages under the same domain (double the pages, basically) - is this a valid concern?
- usability/marketing concerns, as the name of the site is not in English (but then people looking for a house in France would at least not find it completely alien)
What are your thoughts on this? Thanks in advance.
International SEO | seo-cat0