Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
How many directory submissions per day should we do?
-
Hello Moz Members,
I have read many forums and articles where people discuss how many directory submissions to do per day. Please clarify my questions, which are mentioned below.
-
Is there a per-day limit for directory submissions? If so, how many?
-
Can getting many links from directory submissions hurt my site?
Regards & Thanks,
Chhatarpal Singh
-
_The eternal dilemma of an SEO professional. Since you are there to build links, you have to think about building links, and this is exactly where the problem creeps in. Take a different approach here instead. Think like a general user. Would you love to see your website listed in that directory? Do you believe that the directory in question would be able to drive some traffic to your website? If the answer is yes, go ahead, mate. Get your website listed there. Google or no Google, your website is going to benefit in the end._
-
When I said "good directories" I meant http://www.seomoz.org/directories/ - do you think these directories will trigger a Penguin signal? As for my second piece of advice, do it at the right pace. A few factors determine how much to do in a given period. Obviously, I agree that the directory submissions you can find in commercial SEO tools will trigger Penguin signals, and one should avoid them. Are we good?
-
Do you have an explicit answer to those questions that will avoid a Penguin problem?
-
Guys, why be negative? Today is December 25th.
Mr. Singh didn't say what kind of "directory submission" he meant. There are very good directories that we should use - anybody disagree?
Regarding the other part, the "day limit" for directory submissions, the pace is based on two factors:
- How many links do you have now, and of what quality?
- Can you keep the same pace over time, month in, month out?
-
I more or less agree with EGOL. Directory submissions are a thing of the past and are likely to get you in trouble nowadays. Getting backlinks is becoming harder and harder every day. You need to diversify more and make sure that all the links pointing to you look as natural as possible. It's not a bad thing to do some linking yourself to get that initial push, but the best possible linking strategy is a naturally occurring one. Make sure you use all relevant social avenues open to you... Facebook pages, G+, LinkedIn, Pinterest, Instagram, StumbleUpon, and so on, as long as it makes sense for your site to be there and you keep up with posting. Hopefully those will generate natural links back to your site as people learn who you are and grow to like your site.
-
Thank you, sir, for the valuable suggestion. So what link building strategies should I apply to rank my keywords on Google's first page?
-
I think that they can be harmful to your site - especially if you use keyword anchor text.
If these are the only types of links that you have, I think that your site will be hit by Penguin.
-
I didn't get you, sir.
-
If you do one or two per day... it will be enough to get you in trouble by the end of next year.
Related Questions
-
Is there a limit to how many URLs you can put in a robots.txt file?
We have a site with far too many URLs caused by our crawlable faceted navigation, and we are trying to purge 90% of our URLs from the indexes. We put noindex tags on the URL combinations that we do not want indexed anymore, but it is taking Google far too long to find the noindex tags. Meanwhile, we are getting hit with excessive URL warnings and have been hit by Panda. Would it help speed the process of purging URLs if we added the URLs to the robots.txt file? Could this cause any issues for us? Could it have the opposite effect and block the crawler from finding the URLs, but not purge them from the index? The list could be in excess of 100MM URLs.
Technical SEO | kcb81780
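On the title question: Google documents a size limit for robots.txt (500 KiB at the time of writing), so listing URLs individually at this scale is not realistic - wildcard patterns over the faceted-navigation parameters are the practical route. Be aware, though, that robots.txt blocks crawling, so Google can no longer see the noindex tags on blocked pages, and already-indexed URLs can linger in the index. A minimal sketch, assuming hypothetical facet parameter names:

User-agent: *
# Block hypothetical faceted-navigation parameters instead of listing URLs one by one
Disallow: /*?color=
Disallow: /*?size=
Disallow: /*&sort=
-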
Is it good practice to still pay for Best of the Web Directory (BOTW) and other similar ones you have to pay for?
I know that paid-for links are penalized by Google, but in the past these directories were okay. What about now? Thank you.
Technical SEO | RoxBrock0
-
How Long To Recover Rankings After Multi-Day Site Outage?
Hi, A site we look after for a client was down for almost 3 days at the start of this month (11th - 14th of May, to be exact). This was caused by my client's failure to verify their domain name in accordance with the new ICANN procedures. The details are unimportant, but it took a long while for them to get their domain name registration contact details validated, hence the outage. Very soon after this downtime we noticed that the site had slipped back in the Google rankings for most of the target keywords, sometimes quite considerably. I guess this is Google penalizing this client for their failure to keep their site live. (And they really can't have too many complaints about this, in my opinion.) The good news is that the rankings show signs of improving again slightly. However, they have not recovered all the way to where they were before the outage, two weeks ago. My question is this... do you expect that the site will naturally regain the previous excellent rankings without us doing anything? If so, how long do you estimate this could take? On the other hand, if Google typically penalizes this kind of error 'permanently', is there anything we can do to help signal to Google that the site deserves to get back up to where it used to be? I am keen to get your thoughts, and especially to hear from anyone who has faced a similar problem in the past. Thanks
Technical SEO | smaavie0
-
How Does Google's "index" find the location of pages in the "page directory" to return?
This is my understanding of how Google's search works, and I am unsure about one thing in particular: Google continuously crawls websites and stores each page it finds (let's call it the "page directory"). Google's "page directory" is a cache, so it isn't the "live" version of the page. Google has separate storage called "the index", which contains all the keywords searched. These keywords in "the index" point to the pages in the "page directory" that contain the same keywords. When someone searches a keyword, that keyword is looked up in the "index", which returns all relevant pages in the "page directory". These returned pages are given ranks based on the algorithm. The one part I'm unsure of is how Google's "index" knows the location of relevant pages in the "page directory". The keyword entries in the "index" point to the "page directory" somehow. I'm thinking each page has a URL in the "page directory", and the entries in the "index" contain these URLs. Since Google's "page directory" is a cache, would the URLs be the same as on the live website (and would the keywords in the "index" point to these URLs)? For example, if a webpage is found at www.website.com/page1, would the "page directory" store this page under that URL in Google's cache? The reason I want to discuss this is to understand the effects of changing a page's URL by understanding the search process better.
Technical SEO | reidsteven750
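What the question describes is essentially an inverted index: a map from each keyword to the set of documents that contain it, with ranking applied to the documents the lookup returns. A minimal Python sketch of the idea, with made-up pages and URLs as the document keys purely for illustration - real search engines key on internal document IDs rather than raw URLs, which is part of why changing a URL forces a re-crawl and re-index:

# Cached page content, keyed by URL (the "page directory" in the question's terms)
page_directory = {
    "www.website.com/page1": "cheap flights to paris",
    "www.website.com/page2": "paris hotel deals",
}

# Build the inverted index: each word maps to the URLs of the pages containing it
index = {}
for url, text in page_directory.items():
    for word in text.split():
        index.setdefault(word, set()).add(url)

# A search looks the keyword up in the index, then fetches the matching
# cached pages from the page directory for ranking
print(sorted(index["paris"]))  # ['www.website.com/page1', 'www.website.com/page2']
-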
WordPress categories causing too many links/duplicate content?
I've just added categories to my WordPress site, and some of the posts show in several of the categories. Will this cause duplicate content problems, given that I want the category pages to be indexed? Also, as I add more categories, I'm creating more links on the page. They can't be seen by the user, as I have a plugin that creates drop-down categories. When I go to 'view source', though, all the links are there, so Google will see lots of links. How can I fix the too-many-links problem? And should I worry about the duplicate content issue?
Technical SEO | SamCUK1
-
Too Many On-Page Links - caused by a drop down menu
Many of the e-commerce sites we build for customers have drop-down menus to help users find products easily without having to click - example: http://www.customandcommercial.com/. But this then causes the report to trigger the 'too many on-page links' warning. We do have a sitemap and a Google sitemap. So should I put code in place so that the drop-down menu links are not followed, or leave things as they are?
Technical SEO | spiralsites0
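For illustration only, a hedged sketch of the markup-level option the question raises - the URL and label are hypothetical. Worth noting that many SEOs advise against nofollowing internal navigation links, since nofollow discards link equity rather than redistributing it:

<!-- Hypothetical drop-down menu item; rel="nofollow" asks crawlers not to follow the link -->
<a href="/products/widgets" rel="nofollow">Widgets</a>
-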
How to remove 4XX client errors, 'too many links on a single page' warnings, and canonical notices
Firstly, I am getting around 12 errors in the 4XX client error category. The description says that this is either a bad or a broken link. How can I repair this? Secondly, I am getting lots of warnings about too many links on a single page. I want to know how to tackle this. Finally, I don't understand the basics of canonical notices. I have around 12 notices of this kind which I want to remove too. Please help me out in this regard. Thank you beforehand. Amit Ganguly http://aamthoughts.blogspot.com - Sustainable Sphere
Technical SEO | amit.ganguly0
-
How to safely reduce the number of 301 redirects / should we be adding so many?
Hi All, We lost a lot of good rankings over the weekend with no obvious cause. Our top keyword went from p3 to p12, for example. Site speed is pretty bad (slower than 92% of sites!) but it has always been pretty bad. I'm on to the dev team to try and crunch this (beyond image optimisation), but I know that something I can affect is the number of 301 redirects we have in place. We have hundreds of 301s because we've been, perhaps incorrectly, adding one every time we find a new crawl error in GWT, even when it isn't caused by a broken link on our site, or when it comes from an external site where we can't track down the webmaster to fix the link. Is this bad practice, and should we just ignore 404s caused by external broken URLs? If we wanted to reduce these numbers, should we think about removing the ones that are only in place due to external broken URLs? Any other tips for safely reducing the number of 301s? Thanks, all! Chris
Technical SEO | BaseKit0
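For context, a minimal sketch of how one such redirect might look in an Apache .htaccess file - the path and domain are hypothetical. Rules like this are checked on every request, which is one reason hundreds of accumulated 301s are worth auditing rather than growing indefinitely:

# Permanently redirect a retired URL to its replacement (Apache mod_alias)
Redirect 301 /old-page.html https://www.example.com/new-page/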