Alexa Rank and Linking from Article sites.
-
We are creating unique content and submitting our articles to article sites. I have some questions about the best way to go about this.
1. We are being careful to create unique content for each submission, so we are not submitting the same article to multiple sites: one article per article directory.
2. When I researched these article sites at Alexa.com, I noticed that a lot of them rank very well globally, but many are ranked #1 in Alexa for India. They still have very strong rankings in other countries; for example, a site may have a 9,000 Alexa rank in India and an 18,000 rank in the U.S., which is still very high.
3. We are mostly trying to reach U.S. customers, so I am wondering whether we still get value from links on these sites with global reach (even though they rank best for India).
I would think this is still beneficial, but I don't want to attract the wrong kind of traffic by getting links from sites whose visitors come primarily from India, even though they also get plenty of traffic from the U.S. I am assuming this is OK, because an 18,000 or 19,000 Alexa rank in the U.S. is still excellent and we should benefit from it. But I wanted to be sure.
Feedback?
-
Why wouldn't you submit to multiple high-ranking content directories? If a few pick you up, great.
Are you worried about a duplicate-content penalty? Are you using canonical tags to avoid it?
-
Hi James, that's what most people are telling me about Alexa. But I still haven't had a single answer to the India question, which was my main concern. Thank you.
-
Thanks for the input. I agree it's a good method. I think part of the debate is that people usually submit the same article to dozens or even hundreds of sites. Someone pointed out that it's better to write one unique article per site and not submit it elsewhere. That seems to be working so far, because it's good original content.
-
Alexa is not the best way to evaluate a website; it only counts people who use the Alexa toolbar, and evidently many Indian webmasters use it.
I would look at OSE data for the domain, PageRank, and how many pages it has, and most importantly check out the other articles on the site to see the kind of pages your article would appear alongside.
-
Got ya. I misunderstood from the original post. In that case, there's nothing wrong with what you're doing, and it's a fairly popular method for building links (although there's a bit of an ongoing debate about using this method).
Best of luck!
-
I'm just looking to increase search rankings in this instance. I'm doing so by providing quality and unique content to article sites with a link back to our site.
-
Yes, it is compatible with both Firefox and Chrome. You can find it here: http://www.seomoz.org/seo-toolbar
I'm not sure we're distinguishing between traffic from the article sites and traffic from Google search. Are you looking to bring visitors directly from the article sites or are you looking to increase your search rankings?
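One way to make that distinction concrete is to bucket your referrer data by source. This is a minimal, hypothetical sketch: the referrer URLs and the list of article-site hostnames are made up for illustration, and in practice you would pull referrers from your analytics export or server access logs.

```python
from urllib.parse import urlparse

# Hypothetical referrer log entries; real ones would come from your
# analytics export or server access logs.
referrers = [
    "https://www.google.com/search?q=widgets",
    "https://ezinearticles.com/?Some-Article&id=123",  # example article site
    "https://www.google.com/search?q=blue+widgets",
    "https://goarticles.example/some-post",            # hypothetical directory
]

SEARCH_ENGINES = {"www.google.com", "www.bing.com", "search.yahoo.com"}
ARTICLE_SITES = {"ezinearticles.com", "goarticles.example"}  # assumed list

def classify(referrer):
    """Bucket a referrer as search, article-site, or other traffic."""
    host = urlparse(referrer).netloc
    if host in SEARCH_ENGINES:
        return "search"
    if host in ARTICLE_SITES:
        return "article_site"
    return "other"

counts = {}
for r in referrers:
    bucket = classify(r)
    counts[bucket] = counts.get(bucket, 0) + 1

print(counts)  # {'search': 2, 'article_site': 2}
```

If the "article_site" bucket is tiny and the "search" bucket grows after a campaign, the links are helping rankings rather than sending direct visitors, which matters for deciding where to invest.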
-
Thanks for the info, Julie. I'll look into MozRank, but my question remains about sites with strong traffic from countries like India: is it bad to get links from these sites even if they also rank well in the U.S.? In my experience they have helped so far; they have a strong global presence, and even if the majority of their traffic is from India, they also seem to have a strong share in other countries. Any thoughts on that?
-
Jeffrey,
Thanks for the suggestion. Where do I get the MozRank extension, and does it work with Firefox?
What is wrong with article sites if they have PR 4-7 and the content submitted is unique? We are seeing tangible results so far. We also publish daily blog posts and content on our own site.
-
I would not use Alexa as your basis for evaluating site strength -- in my experience, Alexa numbers are not only wildly inaccurate, they're not even useful qualitatively. For example, one site I own gets about 17k visits per day and has an Alexa rank of 33,000 in the US. Another site I work on gets about 100 visits per day and has an Alexa rank of 32,000.
The two sites are miles apart, but Alexa not only fails to see that, it actually misjudges which is more popular. This happens again and again with Alexa rankings. I suspect the problem is the incredibly small sample of toolbar users, combined with some niche bias among those users.
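The small-sample-plus-niche-bias effect is easy to see with a toy model. All the numbers below are made up for illustration: panel visibility is just real traffic multiplied by the (assumed) share of each site's audience that has the toolbar installed.

```python
# Toy model of why a small, biased toolbar panel can invert popularity.
# All figures are hypothetical.
real_daily_visits = {"big_site": 17_000, "small_site": 100}

# Niche bias: suppose the small site's audience (e.g. webmasters) installs
# the measurement toolbar far more often than the big site's audience does.
toolbar_install_rate = {"big_site": 0.001, "small_site": 0.30}

# What the panel actually observes for each site.
panel_visits = {
    site: real_daily_visits[site] * toolbar_install_rate[site]
    for site in real_daily_visits
}
# big_site: 17.0 observed visits, small_site: 30.0 observed visits

ranking = sorted(panel_visits, key=panel_visits.get, reverse=True)
print(ranking)  # ['small_site', 'big_site'] -- the panel gets it backwards
```

A 170x difference in real traffic disappears entirely once the panel's install-rate bias is large enough, which is consistent with the two sites above ranking almost identically.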
MozRank and MozTrust are both far better metrics for the SEO benefit of a link (as is simply searching for various keywords and seeing whether the directory actually ranks -- which I'll bet it doesn't, being an article directory). I haven't yet seen a good third-party source for a site's actual traffic.
-
Have you considered using MozRank instead of Alexa? It's a better metric, and it's easy to check with the MozBar extension for your browser. I definitely recommend it if you're attempting to obtain links for SEO value on any site (not just article sites).
I'm not sure I would recommend article marketing for traffic like you're going after. Creating great content on your own site or guest posting on related industry blogs will almost certainly be a better strategy than submitting to general article sites.