Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
How many directory submissions should we do per day?
-
Hello Moz Members,
I have read many forums and articles where people discuss how many directory submissions to do per day. Please clarify the questions mentioned below.
-
Is there a per-day limit for directory submissions? If so, how many?
-
Can getting more links from directory submissions hurt my site?
Regards & Thanks,
Chhatarpal Singh
-
-
_The eternal dilemma of an SEO professional. Since you are there to build links, you have to think about building links, and this is exactly where the problem creeps in. Instead, take a different approach here. Think like a general user. Would you love to see your website listed in that directory? Do you believe that the directory in question would be able to drive some traffic to your website? If the answer is yes, go ahead, mate. Get your website listed there. Google or no Google, your website is going to benefit in the end._
-
When I said "good directories" I meant http://www.seomoz.org/directories/ - do you think these directories will cause a Penguin signal to be triggered? As for my second piece of advice, do it at the right pace. A few factors determine how much to do in a given period. Obviously, I agree that the directory submissions you can find in commercial SEO tools will trigger Penguin signals, and one should avoid them. Are we good?
-
Do you have an explicit answer to those questions that will avoid a Penguin problem?
-
Guys, why be negative? Today is December 25th!
Mr. Singh didn't say what kind of "directory submission" he meant. There are very good directories that we should use - does anybody disagree?
Regarding the other part about a "daily limit" for directory submissions, the pace depends on two factors:
- How many links do you have now, and what is their quality?
- Can you keep the same pace over time, month in, month out?
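To make that pacing idea concrete, here is a minimal sketch in Python. The function, its baseline, and the 5% rate are all hypothetical numbers chosen for illustration - this is not a Moz or Google formula, just one way to express "grow in proportion to what you already have, at a pace you can sustain":

```python
# Illustrative only: a hypothetical pacing heuristic, not an official formula.

def monthly_submission_cap(existing_links: int, sustainable_rate: float = 0.05) -> int:
    """Cap new directory submissions at a small fraction of the
    existing link profile, so growth stays steady month over month."""
    # A brand-new site gets a small fixed allowance; an established
    # profile grows proportionally to its current size.
    baseline = 5
    return max(baseline, int(existing_links * sustainable_rate))

print(monthly_submission_cap(40))    # small profile, so the baseline applies -> 5
print(monthly_submission_cap(1000))  # 5% of 1000 -> 50
```

The point of the heuristic is the second factor above: whatever pace you pick, it should be one you can repeat month in, month out without a sudden spike.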
-
I more or less agree with EGOL. Directory submissions are a thing of the past and are likely to get you in trouble nowadays. Getting backlinks is becoming harder and harder every day. You need to diversify more and make sure that all the links pointing to you look as natural as possible. It's not a bad thing to do some linking yourself to get that initial push, but the best possible linking strategy is a naturally occurring one. Make sure you use all relevant social avenues open to you - Facebook pages, G+, LinkedIn, Pinterest, Instagram, StumbleUpon, and so on - as long as it makes sense for your site to be there and you keep up with posting. Hopefully those will generate natural links back to your site as people learn who you are and grow to like your site.
-
Thank you, sir, for the valuable suggestion. So what link-building strategies should I apply to rank my keywords on Google's first page?
-
I think that they can be harmful to your site - especially if you use keyword anchor text.
If these are the only types of links that you have, I think that your site will be hit by Penguin.
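To illustrate why keyword anchor text stands out, here is a small sketch that measures each anchor text's share of a backlink profile. The sample data and the idea of a "dominant" anchor are assumptions for illustration - Google publishes no threshold - but a profile where one exact-match keyword accounts for most anchors is the pattern Penguin was built to catch:

```python
from collections import Counter

def anchor_distribution(anchors: list[str]) -> dict[str, float]:
    """Return each anchor text's share of the total backlink profile."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    return {anchor: count / total for anchor, count in counts.items()}

# Hypothetical backlink profile: mostly exact-match keyword anchors.
anchors = ["cheap widgets", "cheap widgets", "cheap widgets",
           "example.com", "click here", "cheap widgets"]
shares = anchor_distribution(anchors)

# One exact-match keyword carries 4 of 6 anchors (~0.67 of the profile),
# which looks engineered rather than naturally earned.
print(shares["cheap widgets"])
```

A natural profile tends to be dominated by branded and URL anchors, with exact-match keywords as a small minority.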
-
I didn't get you, sir.
-
If you do one or two per day... it will be enough to get you in trouble by the end of next year.