Directory and Classified Submissions
-
Are directory submissions and classified submissions still a good way to create backlinks?
Or are they obsolete methods that should be discontinued?
-
Thanks for the awesome comments, Cyrus.
So what you suggest is going slow and developing solid, long-term, genuine links by commenting on blog posts in the same niche, making good-quality PR submissions, good-quality article submissions, and relevant forum posts?
Would you like to add something to the above list?
-
Yes, Google is smart enough.
Two years ago Google stripped toolbar PageRank from most of the profiles on SEOmoz. The links here still pass some value, but build too many of these links and you're much more likely to incur a penalty today.
This entire school of link building discussed on this page is dangerous - and filled with snake oil salesmen who don't care at all about harming your rankings. I've never heard of Linklecious, but it looks like they've been de-indexed by Google, so it's safe to say they were penalized.
Instead of risking burning your site to the ground with low quality links, invest your time in long-term payouts by producing good quality content and earning the links others can't earn.
-
Hi KS,
The short answer is that directory submissions have been abused over the years, and we've seen a marked decrease in their effectiveness, even within just the past 12 months.
The old rule for directory links was to build them in a 2:1 ratio. That meant for every two directory links, be sure to build at least 1 regular, high quality link. Today, the ratio is more like 1:1, or even reversed to 1:2.
If a directory is easy to get into, it's probably not worth your time. Too many of these links can lead to a drop in rankings. Done judiciously, they can give a small boost to your rankings, help round out your link profile, and help target specific anchor text phrases (again, when done in moderation).
Here's an article we published a few months back you might find helpful: http://www.seomoz.org/blog/seo-link-directory-best-practices
As for classified submissions, I'd be wary as I've never seen any evidence that they help SEO, and like low value directory links, too many "easy" links can harm your rankings.
Hope this helps. Best of luck with your SEO!
-
Thanks Herald.
Please reply to my query regarding profile links and Web 2.0 links further down this page.
-
Yes, you should definitely continue with directory and classified submissions. They will help you a lot. If you have any questions, feel free to get in touch with me.
Thanks
-
What about profile links and Web 2.0 links? They say it's good to create 100+ profile links every month with appropriate keywords (while doing SEO for a website). Some say profile links are better than forum links or blog comments.
If I use services to create them, most of the links turn out to be on not-so-good websites. But they say that, at the end of the day, it's about backlinks.
My question is: isn't Google smart enough to detect such practices?
In other words: do profile links really help?
-
Hmmm useful tips. Thanks.
How about submitting those pages (URLs) to Linklecious or Pingomatic so they are crawled by Google?
Would that help?
-
If the links are followed (not nofollow), then you could also check:
- whether the directory already has some rankings and has been around for a while
- how many links they usually put on one page before paginating to the next one
- whether the listing pages for recently added entries (usually the last ones) are already in Google's index - I normally do this by checking their PageRank: if it's at least white (I use the Quirk SearchStatus plugin for Firefox), Google is already aware of them, which indicates that it crawls this directory fairly frequently for new content/pages
- what position the directory holds in a Google search for a specific keyword - the category you want to submit your link to
Obviously you would only do all of this if you had a lot of time to go through each directory separately, but it might be worthwhile if you're planning to get something more out of these links than generic visits from the directory's users (a rough script for automating part of this is sketched below).
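For instance, one of the quicker checks - how many links a directory crams onto a single listing page - could be semi-automated. Below is a minimal Python sketch, assuming the requests and beautifulsoup4 packages are installed; the category URL and the "100 links" threshold are placeholder assumptions you would adapt per directory:

```python
import requests
from bs4 import BeautifulSoup

# Hypothetical category page of a directory you are evaluating.
CATEGORY_URL = "http://www.example-directory.com/business/marketing"

response = requests.get(CATEGORY_URL, timeout=10)
response.raise_for_status()
soup = BeautifulSoup(response.text, "html.parser")

# Count external links on the page - a crude proxy for "listings per page".
external_links = [
    a["href"]
    for a in soup.find_all("a", href=True)
    if a["href"].startswith("http") and "example-directory.com" not in a["href"]
]

print(f"{len(external_links)} external links on this listing page")
if len(external_links) > 100:  # threshold is an arbitrary assumption
    print("Very crowded page - each individual listing likely passes little value.")
```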
-
Yup, we always make sure they are dofollow classified or directory sites.
-
Most of these submissions will give you a link with the rel attribute set to nofollow, and those don't really give you any SEO benefit. So it's quite important to first check the other listings and see whether this attribute and value are assigned to their anchors; if so, the only benefit you will get is a visit if someone actually clicks on the link in the listing and lands on your site.
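To make that check concrete, here is a minimal Python sketch in the same spirit as the one above (again assuming requests and beautifulsoup4, with a placeholder listing URL) that fetches an existing listing page and reports which anchors carry rel="nofollow":

```python
import requests
from bs4 import BeautifulSoup

# Hypothetical listing page on a directory or classified site you are checking.
LISTING_URL = "http://www.example-directory.com/category/some-category"

response = requests.get(LISTING_URL, timeout=10)
soup = BeautifulSoup(response.text, "html.parser")

for anchor in soup.find_all("a", href=True):
    rel = anchor.get("rel") or []  # BeautifulSoup returns rel as a list of tokens
    label = "nofollow" if "nofollow" in rel else "follow"
    print(f"{label:9} {anchor['href']}")
```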
-
Thanks for the help. So basically we should continue with them right?
-
Hi KS_,
As you know, directory submission is widely used by businesses to promote their websites and products.
Directory and classified submissions are mainly used to get listed in the major, useful directories.
Both types of submission are targeted according to the area or audience you want to reach; classified submissions are similar to the yellow pages. The most important thing to remember when submitting a listing is that it should be in the same niche as the site being listed. This leads to indexed pages, higher rankings in search engines, and greater visibility via search.
Both types of submission help increase product sales and website presence, especially in search engines, and they help you gain non-reciprocal backlinks from high-PageRank websites and directories.
I hope this resolves your query.
Related Questions
-
Using a Reverse Proxy and 301 redirect to appear Sub Domain as Sub Directory - what are the SEO Risks?
We're in the process of moving our WordPress blog URLs from a subdomain to a sub-directory. We aren't moving the blog physically; we're using a reverse proxy and 301 redirection instead. The blog subdomain URL is https://blog.example.com/ and the destination sub-directory URL is https://www.example.com/blog/. Our main website is an e-commerce marketplace, which is a YMYL site, running on a Windows server. For technical reasons, we can't physically move the WordPress blog to the main website.
Our technical setup:
- Set up a reverse proxy at https://www.example.com/blog/ pointing to https://blog.example.com/
- Use a 301 redirection from https://blog.example.com/ to https://www.example.com/blog/, with an exception: if the traffic is coming from the main WWW domain (via the proxy), it won't redirect, so we avoid an infinite loop
- Change all absolute URLs on the blog to relative URLs
- Change the sitemap URL from https://blog.example.com/sitemap.xml to https://www.example.com/blog/sitemap.xml and update all URLs mentioned within the sitemap
SEO risk evaluation: we have individual GA tracking IDs and individual Google Search Console properties for the main website and the blog, and we will keep them separate rather than merge them. With that in mind, I am evaluating these SEO risk factors:
1. Right now, traffic from the main website to the blog (or vice versa) is counted as referral traffic and new Google Analytics cookies are set. What is going to happen when it is all on the same domain?
2. Which settings should I change in the blog's Google Search Console? (A) Do I need to request a "Change of Address" in the blog's Search Console property? (B) Should I re-submit the sitemap? Do I need to re-submit the blog sitemap from the https://www.example.com/ Search Console property?
3. The main website is an e-commerce marketplace (a YMYL site), and the blog is all content. Does that impact SEO?
4. Will this dilute link equity or impact the main website's ranking, given the following key SEO metrics? (A) The main website's average session duration is about 10 minutes and its bounce rate is around 30%. (B) The blog's average session duration is 33 seconds and its bounce rate is over 92%.
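Purely for illustration, here is a minimal sketch of the proxy-plus-redirect pattern described above, written as nginx configuration (the question mentions a Windows server, where the same idea would more likely be implemented with IIS URL Rewrite/ARR); the loop-breaking header is an assumption:

```nginx
# Main site: serve /blog/ by proxying to the blog subdomain.
server {
    listen 443 ssl;
    server_name www.example.com;

    location /blog/ {
        proxy_pass https://blog.example.com/;      # /blog/post -> blog.example.com/post
        proxy_set_header Host blog.example.com;
        proxy_set_header X-From-Main-Proxy "1";    # mark proxied requests to break the redirect loop
    }
}

# Blog subdomain: 301 everything to the sub-directory,
# except requests that arrived via the proxy above.
server {
    listen 443 ssl;
    server_name blog.example.com;

    if ($http_x_from_main_proxy = "") {
        return 301 https://www.example.com/blog$request_uri;
    }

    # ... normal WordPress handling for proxied requests ...
}
```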
Intermediate & Advanced SEO | joshibhargav_200
-
Directory with Duplicate content? what to do?
Moz keeps finding loads of pages with duplicate content on my website. The problem is that they are directory pages for different locations. E.g., if we were a clothes shop we would be listing our locations:
www.sitename.com/locations/london
www.sitename.com/locations/rome
www.sitename.com/locations/germany
The content on these pages is all the same, except for an embedded Google map that shows the location of the place. The problem is that Google thinks all these pages are duplicate content. Should I set a canonical link on every single page saying that www.sitename.com/locations/london is the main page? I don't know if I can use canonical links, because the page content isn't identical due to the embedded map. Help would be appreciated. Thanks.
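For reference, the canonical approach mentioned here is just a link element in the head of each near-duplicate location page pointing at whichever URL is chosen as the preferred one - for example (reusing the hypothetical URLs from the question):

```html
<!-- In the <head> of www.sitename.com/locations/rome (and the other location pages),
     if the London page were chosen as the preferred version: -->
<link rel="canonical" href="http://www.sitename.com/locations/london" />
```

Whether consolidating every location into one canonical is the right call is a separate judgement; the tag itself is only a hint to search engines.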
Intermediate & Advanced SEO | nchlondon0
-
Subdomains vs directories on existing website with good search traffic
Hello everyone, I operate a website called Icy Veins (www.icy-veins.com), which gives gaming advice for World of Warcraft and Hearthstone, two titles from Blizzard Entertainment. Up until recently, we had articles for both games on the main subdomain (www.icy-veins.com) without a directory structure: the articles for World of Warcraft ended in -wow and those for Hearthstone ended in -hearthstone, and that was it. We are planning to cover more games from Blizzard Entertainment soon, so we hired an SEO consultant to figure out whether we should use directories (www.icy-veins.com/wow/, www.icy-veins.com/hearthstone/, etc.) or subdomains (www.icy-veins.com, wow.icy-veins.com, hearthstone.icy-veins.com). For a number of reasons, the consultant was adamant that subdomains were the way to go. So I implemented subdomains, with 301 redirects from all the old URLs to the new ones, and over the two weeks since, the amount of search traffic we get has been slowly decreasing as the new URLs were getting indexed. Now we are getting about 20%-25% less search traffic. For example, the week before the subdomains went live we received 900,000 visits from search engines (11-17 May); this week we only received 700,000 visits. All our new URLs are indexed, but they rank slightly lower than the old URLs used to, so I was wondering whether this is to be expected and will improve in time, or whether I should just go for directories instead. Thank you in advance.
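As a purely hypothetical sketch of the 301 mapping described (assuming nginx, and assuming the new subdomain URLs simply drop the -wow / -hearthstone suffixes, which the question does not actually specify):

```nginx
server {
    server_name www.icy-veins.com;

    # Old flat URLs such as /some-guide-wow or /some-deck-hearthstone
    # are permanently redirected to the game-specific subdomains.
    rewrite ^/(.+)-wow$         https://wow.icy-veins.com/$1         permanent;
    rewrite ^/(.+)-hearthstone$ https://hearthstone.icy-veins.com/$1 permanent;
}
```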
Intermediate & Advanced SEO | damienthivolle0
-
Robots.txt, does it need preceding directory structure?
Do you need the entire preceding path in robots.txt for it to match? E.g., I know that if I add Disallow: /fish to robots.txt it will block:
/fish
/fish.html
/fish/salmon.html
/fishheads
/fishheads/yummy.html
/fish.php?id=anything
But would it also block these (examples taken from the Robots.txt Specifications)?
en/fish
en/fish.html
en/fish/salmon.html
en/fishheads
en/fishheads/yummy.html
en/fish.php?id=anything
I'm hoping it actually won't match; that way, writing this particular robots.txt will be much easier, as basically I want to block many URLs that contain BTS-, such as:
http://www.example.com/BTS-something
http://www.example.com/BTS-somethingelse
http://www.example.com/BTS-thingybob
But I have other pages that I do not want blocked, in subfolders that also have BTS- in them, such as:
http://www.example.com/somesubfolder/BTS-thingy
http://www.example.com/anothersubfolder/BTS-otherthingy
Thanks for listening.
Intermediate & Advanced SEO | Milian
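A note on the path-matching point in the question above: robots.txt Disallow rules are matched from the start of the URL path, so a rule like the illustrative sketch below would cover the root-level BTS- URLs from the question but not the ones inside subfolders.

```
# robots.txt (illustrative)
User-agent: *
Disallow: /BTS-

# Blocked (path starts with /BTS-):
#   /BTS-something
#   /BTS-somethingelse
#   /BTS-thingybob
# Not blocked (path starts with the subfolder, not /BTS-):
#   /somesubfolder/BTS-thingy
#   /anothersubfolder/BTS-otherthingy
```
-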
How is Google crawling and indexing this directory listing?
We have three directory listing pages that are being indexed by Google:
http://www.ccisolutions.com/StoreFront/jsp/
http://www.ccisolutions.com/StoreFront/jsp/html/
http://www.ccisolutions.com/StoreFront/jsp/pdf/
How and why is Googlebot crawling and indexing these pages? Nothing else links to them (although /jsp/html/ and /jsp/pdf/ both link back to /jsp/). They aren't disallowed in our robots.txt file, and I understand that this could be why. If we add them to our robots.txt file and disallow them, will this prevent Googlebot from crawling and indexing those directory listing pages without prohibiting it from crawling and indexing the content that resides there, which is used to populate pages on our site?
Having these pages indexed in Google is causing a myriad of issues, not the least of which is duplicate content. For example, the file CCI-SALES-STAFF.HTML (which appears on the directory listing referenced above, http://www.ccisolutions.com/StoreFront/jsp/html/) clicks through to this web page: http://www.ccisolutions.com/StoreFront/jsp/html/CCI-SALES-STAFF.HTML
This page is indexed in Google and we don't want it to be. But so is the actual page where we intended the content contained in that file to display: http://www.ccisolutions.com/StoreFront/category/meet-our-sales-staff
As you can see, this results in duplicate content problems. Is there a way to disallow Googlebot from crawling that directory listing page and, provided that we have this URL in our sitemap (http://www.ccisolutions.com/StoreFront/category/meet-our-sales-staff), solve the duplicate content issue as a result? For example:
Disallow: /StoreFront/jsp/
Disallow: /StoreFront/jsp/html/
Disallow: /StoreFront/jsp/pdf/
Can we do this without risking blocking Googlebot from content we do want crawled and indexed? Many thanks in advance for any and all help on this one!
Intermediate & Advanced SEO | danatanseo
-
XML Sitemap for classifieds
I have seen some trends of sites that do not even use an XML sitemap or robots.txt, e.g. see this site. How do you tell whether a sitemap is not used? Also, for classified websites, should ad pages be included in the sitemap, given that after a certain duration those ads will be deleted and Google might not be able to crawl them? What do you suggest about an XML sitemap for a classified website?
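For reference, a minimal XML sitemap looks like the sketch below (the URL and dates are hypothetical); for short-lived classified ads, the sitemap would typically be regenerated regularly so that expired ad URLs simply drop out of it.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per live ad page; expired ads are omitted
       the next time the sitemap is regenerated. -->
  <url>
    <loc>https://www.example-classifieds.com/ads/used-bicycle-12345</loc>
    <lastmod>2012-06-01</lastmod>
    <changefreq>daily</changefreq>
  </url>
</urlset>
```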
Intermediate & Advanced SEO | MozAddict0
-
Optimize a Classifieds Site
Hi, I have a classifieds website and would like to optimize it. The issues/questions I have:
A classifieds site has, say, 500 cities. Is it better to create separate subdomains for each city (http://city_name.site.com) or subdirectories (http://site.com/city_name)?
Now, in each city there will be, say, 50 categories, and these 50 categories are common across all the cities. Hence the layout and content will be the same, the only differences being the latest ads from each city, the name of the city, and the URLs pointing to each category in the relevant city. The site architecture of a classifieds site is highly prone to producing pages that look like duplicates without really being duplicate content. What is the best way to deal with this situation?
I was hit by Panda in April 2011, with traffic going down 50%. However, traffic has stayed at around the same level since then. How do I best handle the duplicate content penalty in the case of a site like a classifieds site? Cheers!
Intermediate & Advanced SEO | ketan90