Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
How many directory submissions should we do per day?
-
Hello Moz Members,
I have read many forums and articles where people discuss how many directory submissions to do per day. Please clarify the questions mentioned below.
-
Is there a per-day limit for directory submissions? If so, what is it?
-
Can getting many links from directory submissions hurt my site?
Regards & Thanks,
Chhatarpal Singh
-
-
_The eternal dilemma of the SEO professional. Since you are there to build links, you have to think about building links, and this is exactly where the problem creeps in. Take a different approach here. Think like a general user. Would you love to see your website listed in that directory? Do you believe the directory in question could drive some traffic to your website? If the answer is yes, go ahead, mate. Get your website listed there. Google or no Google, your website is going to benefit in the end._
-
When I said "good directories" I meant http://www.seomoz.org/directories/ Do you think these directories will cause the Penguin signal to be triggered? As for my second piece of advice, do it at the right pace. A few factors determine how much to do in a given period. Obviously, I agree that the directory submissions you can find in commercial SEO tools will trigger Penguin signals, and one should avoid them. Are we good?
-
Do you have an explicit answer to those questions that will avoid a Penguin problem?
-
Guys, why be negative? Today is December 25th.
Mr. Singh didn't say what kind of "directory submission" he meant. There are very good directories that we should use; does anybody disagree?
Regarding the other part about a "daily limit" for directory submissions, the pace is based on two factors:
- How many links do you have now, and what is their quality?
- Can you keep the same pace over time, month in, month out?
-
I more or less agree with EGOL. Directory submissions are a thing of the past and are likely to get you in trouble nowadays. Getting backlinks is becoming harder every day. You need to diversify more and make sure that all the links pointing to you look as natural as possible. It's not a bad thing to build some links yourself to get that initial push, but the best possible linking strategy is a naturally occurring one. Make sure you use all the relevant social avenues open to you (Facebook pages, G+, LinkedIn, Pinterest, Instagram, StumbleUpon, and so on), as long as it makes sense for your site to be there and you keep up with posting. Hopefully those will generate natural links back to your site as people learn who you are and grow to like your site.
-
Thank you, sir, for the valuable suggestion. So what link-building strategies should I apply to get my keywords ranked on Google's first page?
-
I think that they can be harmful to your site, especially if you use keyword anchor text.
If these are the only types of links that you have, I think that your site will be hit by Penguin.
-
I didn't get you, sir.
-
If you do one or two per day... it will be enough to get you in trouble by the end of next year.