I tried the directory list from SEOmoz, but almost all of them charge for inclusion. Is this a black hat situation?
-
I need backlinks for my site, and several sources say directories are a good place to get them. But they charge for inclusion. Should I pay? Is this a black hat situation where I'm paying for links?
-
Hi Naghirniac,
Many directories charge, but this doesn't mean they are black hat. The key concept is editorial inclusion. A directory that accepts anyone is not a directory you want to be associated with. This includes directories filled with porn, gambling and payday loan sites.
On the other hand, the harder it is to get into a directory, the more value it usually passes. This is true even when the directory charges money for "review" services.
Be careful - directory listings are meant to enhance your backlink profile, not act as a foundation.
Here's a helpful article:
http://www.seomoz.org/blog/seo-link-directory-best-practices
Best of luck!
-
Good... I'm afraid of being on the dark side... the gray side I can accept... thanks!
-
Hi Naghirniac,
When you pay for a directory listing, you're paying for the review, not the actual link. That being said, unless the directory is high quality, don't go for it; the money can be better spent elsewhere.
Good luck!
-
In an ideal world you wouldn't have to seek out directories; however, we aren't quite there yet, so they will give you a small boost in the meantime (while you build up other links).
I noticed they charged when I looked at them. I think SEOmoz's point of view is: "if you are going to use directories, these are the best." Obviously the people who run them know that, and that's why they can charge.
It all depends on what you can afford. If $150 for a small but likely boost seems worthwhile to you, then go for it. It also depends on the business you are working for and the situation (are links the thing that's letting you down?).
It's more grey than black; there are much worse link-building tactics you could employ.
Related Questions
-
Suggestions on Link Auditing a 70,000 URL list?
I have a website with nearly 70,000 incoming links, since it's a somewhat large site that has been online for 19 years. The rate I was quoted for a link audit from a reputable SEO professional was $2 per link, and clearly I don't have $140,000 to spend on a link audit 🙂 !! I was thinking of asking you guys for a tutorial that is the gold standard for link auditing checklists, and doing it myself. But then I thought maybe it's easier to shorten the list by knocking out all the "obviously good" links first. My only concern is that I be 100% certain they are good links. Is there an "easiest approach" to take for shortening this list, so I can give it to a professional to handle the rest?
Intermediate & Advanced SEO | HLTalk
-
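The "knock out the obviously good links first" idea above can start with a simple pass that deduplicates the list by referring domain, since links from the same domain usually get a single verdict in an audit. A minimal sketch (the URLs are hypothetical placeholders, not from the question):

```python
from urllib.parse import urlparse
from collections import defaultdict

def group_by_domain(urls):
    """Group backlink URLs by hostname so each domain is audited once."""
    groups = defaultdict(list)
    for url in urls:
        host = urlparse(url).hostname or ""
        # Strip a leading "www." so www/non-www variants count as one domain
        if host.startswith("www."):
            host = host[4:]
        groups[host].append(url)
    return dict(groups)

links = [
    "http://www.example.com/page-a",
    "http://example.com/page-b",
    "http://spammy-directory.net/listing/123",
]
by_domain = group_by_domain(links)
print(len(by_domain))  # prints 2: three URLs collapse to two distinct domains
```

Auditing one representative URL per domain can cut a 70,000-URL list down to something a professional can quote far more reasonably.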
Is this white hat or black hat?
Can we use domain masking/URL masking? How does Google see this? Original domain: http://mstylecrazy.com; masked domain: Bestupforyou.com. Does this also create duplicate content, and does it invite a Google penalty?
Intermediate & Advanced SEO | Michael.Leonard
-
Robots.txt: how to exclude sub-directories correctly?
Hello here, I am trying to figure out the correct way to tell search engines to crawl this: http://www.mysite.com/directory/ but not this: http://www.mysite.com/directory/sub-directory/ or this: http://www.mysite.com/directory/sub-directory2/sub-directory/... Given that I have thousands of sub-directories with almost infinite combinations, I can't write the definitions in a manageable way:
disallow: /directory/sub-directory/
disallow: /directory/sub-directory2/
disallow: /directory/sub-directory/sub-directory/
disallow: /directory/sub-directory2/subdirectory/
etc... I would end up with thousands of definitions to disallow all the possible sub-directory combinations. So, is the following a correct, better, and shorter way to define what I want above:
allow: /directory/$
disallow: /directory/*
Would the above work? Any thoughts are very welcome! Thank you in advance. Best, Fab.
Intermediate & Advanced SEO | fablau
-
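For crawlers that support Google's robots.txt extensions (the `*` wildcard and the `$` end-of-URL anchor; neither is part of the original robots.txt standard), the short pattern proposed in the question would look like this. It is a sketch and worth verifying with a robots.txt testing tool before deploying:

```
User-agent: *
# Allow the directory index itself ($ anchors the match at the end of the URL)
Allow: /directory/$
# Block everything beneath it
Disallow: /directory/*
```

Google resolves conflicts by the most specific (longest) matching rule, and on a tie the less restrictive Allow wins, so /directory/ itself stays crawlable while /directory/sub-directory/ matches only the Disallow.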
Almost no organic traffic
Hi, We have an online store that has been up and running since January 1st. Since then we really haven't seen any improvement in our organic traffic at all. About 10% of our traffic comes from organic search, and more than 20% of that organic traffic actually comes from branded keywords. We haven't paid a lot of attention to SEO so far. I mean, we followed the best practices, but we focused more on customer/user experience than on SEO. We improved our product pages, reduced the checkout process to one step, and used bigger icons and buttons. According to our customers, our website is pretty easy to navigate and shop; we haven't received any major complaints so far. Except for a couple of products, all of our content is original; we didn't use any manufacturer product content or copy from another website. However, it looks like all these efforts don't mean much to Google unless we have solid backlinks. Currently I am considering making category pages NOINDEX and implementing microdata from schema.org. However, is it a good idea to make category pages NOINDEX for an e-commerce website? I would like to hear your comments/recommendations on what else we can do to generate organic traffic.
Intermediate & Advanced SEO | serkie
-
What's the best way to revive a directory that was 301'd, now that I want to remove the redirect?
Last year I 301'd one of the directories on my site, pointing everything to a different directory. Long story short, I am going to sell this product line again and would like to simply remove the 301 to that original directory, but I am reading that 301s are also cached in most browsers for a long time. Has anyone successfully done this, and if so, what did you have to do? Thanks, Mike
Intermediate & Advanced SEO | SandyEggo
-
Are there discrepancies between GWT and SEOMoz?
In our keyword rank tracking report, we've dominated a keyword in Google and have held the slot for years; all evidence points in this direction. In Google Webmaster Tools, however, this particular keyword averages a rank of 6.5. Is anyone else experiencing these kinds of discrepancies? What is your take on it?
Intermediate & Advanced SEO | NaHoku
-
Redirecting www.example.com to www.example.com/directory/
Hi All, There's been some internal debate going back and forth about redirecting the homepage of a site to a directory. There are a few different POVs circulating, one of which is that it's no different from redirecting to an /index page. Basically, the homepage is ranking for the keyword that we want the directory to rank for, but I can't seem to justify placing this type of redirect. The content on the two pages is different, but both the homepage and the directory make sense to rank for the term. Has anyone ever done anything like this before? Can anyone see any reason to do something like this? I believe this move would dilute the link value we currently have going to the homepage and potentially cause us to lose our #2 slot with the homepage in favor of a lower spot with the directory. I'd love to hear any thoughts on this and learn whether anyone has experimented with this tactic. Thanks in advance!
Intermediate & Advanced SEO | JamieCottle28
-
Removing URLs in bulk when directory exclusion isn't an option?
I had a bunch of URLs on my site that followed this form: http://www.example.com/abcdefg?q=&site_id=0000000048zfkf&l= There were several million pages, each associated with a different site_id. They weren't very useful, so we've removed them entirely and they now return a 404. The problem is, they're still stuck in Google's index. I'd like to remove them manually, but how? There's no proper directory (i.e. /abcdefg/) to remove, since there's no trailing /, and removing them one by one isn't an option. Is there any other way to approach the problem or specify URLs in bulk? Any insights are much appreciated. Kurus
Intermediate & Advanced SEO | kurus
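Since all the dead URLs in the question above share the /abcdefg? prefix, one option worth considering is a single prefix rule in robots.txt (Google matches rules against the path plus query string), which can then back a prefix-based removal request in Search Console. A sketch only, using the placeholder path from the question:

```
User-agent: *
# Matches any URL whose path + query string begins with /abcdefg?
Disallow: /abcdefg?
```

Note the trade-off: blocking crawling also stops Google from re-fetching the 404s, so this mainly helps when paired with a removal request; otherwise, letting the 404s be recrawled achieves the same result, just more slowly.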