Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Directory and Classified Submissions
-
Are directory submissions and classified submissions still a good way to create backlinks?
Or are they obsolete methods that should be discontinued?
-
Thanks for the awesome comments Cyrus.
So what you suggest is going slow and developing solid, long-term, genuine links by commenting on blog posts in the same niche, good quality PR submissions, good quality article submissions, and making relevant forum posts?
Would you like to add something to the above list?
-
Yes, Google is smart enough.
Two years ago Google stripped toolbar PageRank from most of the profiles on SEOmoz. The links here still pass some value, but build too many of these links and you're much more likely to incur a penalty today.
This entire school of link building discussed on this page is dangerous - and filled with snake oil salesmen who care nothing about harming your rankings. I've never heard of Linklecious, but it looks like they've been de-indexed by Google, so it's safe to say they were penalized.
Instead of risking burning your site to the ground with low quality links, invest your time in long-term payouts by producing good quality content and earning the links others can't earn.
-
Hi KS,
In a short answer, directory submissions have been abused over the years and we've seen a marked decrease in their effectiveness even as recently as the past 12 months.
The old rule for directory links was to build them in a 2:1 ratio. That meant for every two directory links, be sure to build at least 1 regular, high quality link. Today, the ratio is more like 1:1, or even reversed to 1:2.
If a directory is easy to get into, it's probably not worth your time. Too many of these links can lead to a drop in rankings. Done judiciously, they can give a small boost to your rankings, help round out your link profile, and help target specific anchor text phrases (again, when done in moderation).
Here's an article we published a few months back you might find helpful: http://www.seomoz.org/blog/seo-link-directory-best-practices
As for classified submissions, I'd be wary as I've never seen any evidence that they help SEO, and like low value directory links, too many "easy" links can harm your rankings.
Hope this helps. Best of luck with your SEO!
-
Thanks Herald.
Please reply to my query regarding Profile Links and Web2.0 Links later down on this page.
-
Yes, you should definitely continue with directory and classified submissions. They will help you a lot. If you have any queries, feel free to contact me.
Thanks
-
What about profile links and Web 2.0 links? They say it's good to create 100+ profile links every month with appropriate keywords (while SEOing a website). Some say profile links are better than forum links or blog comments.
If I use services to create them, most of the links turn out to be on not-so-good websites. But they say at the end of the day it's about backlinks.
My question is: isn't Google smart enough to detect such practices?
In other words: do profile links really help?
-
Hmmm useful tips. Thanks.
How about submitting those pages (URLs) to Linklecious or Pingomatic so they are crawled by Google?
Would that help?
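For context on what "pinging" actually involves: services like Pingomatic accept a tiny XML-RPC request announcing that a URL has updated. Here's a minimal sketch of what that request body looks like, built with Python's standard library; the site name, URL, and endpoint mentioned in the comments are made-up examples, not real details from any service.

```python
# Sketch: what a "ping" to an update service looks like under the hood.
# Such services conventionally accept XML-RPC "weblogUpdates.ping" calls.
# The site name and URL below are illustrative assumptions.
import xmlrpc.client

def build_ping_payload(site_name: str, site_url: str) -> str:
    """Build the XML-RPC request body for a weblogUpdates.ping call."""
    # dumps() takes a tuple of parameters and a method name and
    # returns the XML request body as a string.
    return xmlrpc.client.dumps((site_name, site_url),
                               methodname="weblogUpdates.ping")

payload = build_ping_payload("Example Site", "http://www.example.com/")
# An HTTP POST of this body to the service's XML-RPC endpoint would
# perform the ping; here we only construct the payload.
print(payload)
```

Whether that ping gets a page crawled faster is exactly the question at issue, of course; the payload above just shows the mechanism is trivial.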
-
If they are followed links, then you could also check:
-
whether the directory actually has some ranking already and has been around for a while
-
how many links they usually put on one page before paging to the next one
-
check whether the listing pages for recently added entries (usually the last ones) are already in Google's index. I normally do this by checking their PageRank: if it's at least white (I use the Quirk SearchStatus plugin for Firefox), that means Google is already aware of them, which indicates that it crawls this directory pretty frequently for new content and pages
-
try performing a search on Google for a specific keyword - the category you want to submit your link to - and see what their position is for it
Obviously you would only do all this if you had a lot of time to go through each directory separately, but it might be worthwhile if you're planning to get links in a different way than the generic links from visitors.
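The second check in the list above (how many links a directory crams onto each page) is easy to script if you're vetting many directories. Here's a rough sketch using only the standard library; the HTML snippet stands in for a fetched category page, and the markup and domains in it are illustrative assumptions, not taken from any real directory.

```python
# Rough sketch of the "links per page" check: parse a directory
# category page and count the outbound anchors on it.
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Collects hrefs of anchors that point off-site."""
    def __init__(self):
        super().__init__()
        self.external_links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href", "")
            if href.startswith("http"):  # crude "outbound" heuristic
                self.external_links.append(href)

# Stand-in for a page you would fetch (e.g. with urllib.request).
sample_page = """
<ul class="listings">
  <li><a href="http://site-one.example/">Site One</a></li>
  <li><a href="http://site-two.example/">Site Two</a></li>
  <li><a href="/category/page-2">Next page</a></li>
</ul>
"""

counter = LinkCounter()
counter.feed(sample_page)
print(len(counter.external_links))  # 2 outbound links on this "page"
```

A directory listing hundreds of outbound links per page dilutes whatever value each listing might pass, which is why the per-page count is worth a look.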
-
-
Yup, we always make sure they are dofollow classified or directory sites.
-
Most of the submissions will give you a link with the rel attribute set to nofollow - and these don't really give you any SEO benefit. So it is quite important to first check the other listings and see whether they have this attribute and value assigned to the anchor. If so, then the only benefit you will get is the visit if someone actually clicks on the link in the listing and gets to your site.
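The nofollow check described above is also scriptable if you want to vet a directory's existing listings in bulk. A minimal sketch with the standard library; the two anchors in the snippet are made-up stand-ins for links you'd find on a real listing page.

```python
# Minimal sketch of the nofollow check: inspect each anchor's rel
# attribute on a listing page and bucket the links accordingly.
from html.parser import HTMLParser

class NofollowChecker(HTMLParser):
    """Splits anchors into followed vs. nofollowed hrefs."""
    def __init__(self):
        super().__init__()
        self.followed = []
        self.nofollowed = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attr_map = dict(attrs)
        href = attr_map.get("href", "")
        rel = attr_map.get("rel", "") or ""
        # rel can hold multiple space-separated tokens, e.g. "nofollow ugc".
        if "nofollow" in rel.split():
            self.nofollowed.append(href)
        else:
            self.followed.append(href)

# Stand-in for HTML fetched from a directory's listing page.
listing_html = (
    '<a href="http://example-a.test/" rel="nofollow">A</a>'
    '<a href="http://example-b.test/">B</a>'
)
checker = NofollowChecker()
checker.feed(listing_html)
print(checker.nofollowed)  # links that pass no link equity
print(checker.followed)    # followed links
```

If every existing listing lands in the nofollowed bucket, the directory is only worth submitting to for referral clicks, not for SEO.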
-
Thanks for the help. So basically we should continue with them, right?
-
Hi KS_,
As you know, directory submission is widely used by businesses to promote their websites and products.
Directory and classified submissions are mainly used to get listings in all the major, useful directories.
Both types of submission are targeted by area or audience. Classified submissions are similar to yellow pages. The most important thing to remember when creating a listing is that the directory should be in the same niche as your site. This leads to indexed pages and higher rankings in search engines, and it helps you gain visibility via search.
Both types of submission can help increase product sales and website presence, especially in search engines, and can help you gain non-reciprocal backlinks from high PageRank websites and directories.
I hope that answers your query.