How many directory submissions per day should we do?
-
Hello Moz Members,
I have read many forums and articles where people discuss "how many directory submissions to do per day." Please clarify my questions, which are mentioned below.
-
Is there a per-day limit for directory submissions? If so, how many?
-
Can getting more links from directory submissions hurt my site?
Thanks & Regards,
Chhatarpal Singh
-
-
_The eternal dilemma of an SEO professional. Since you are there to build links, you have to think about building links, and this is exactly where the problem creeps in. Take a different approach here instead. Think like a general user. Would you love to see your website listed in that directory? Do you believe that the directory in question would be able to drive some traffic to your website? If the answer is yes, go ahead, mate. Get your website listed there. Google or no Google, your website is going to benefit in the end._
-
When I said "good directories," I meant http://www.seomoz.org/directories/. Do you think those directories will trigger a Penguin signal? As for my second piece of advice, do it at the right pace. A few factors determine how much to do per unit of time. I will obviously agree that the directory submissions you can find in commercial SEO tools will trigger Penguin signals, and one should avoid them. Are we good?
-
Do you have an explicit answer to those questions that will avoid a Penguin problem?
-
Guys, why be negative? Today is December 25th.
Mr. Singh didn't say what kind of "directory submission" he meant. There are very good directories that we should use; does anybody disagree?
Regarding the other part about a "daily limit" for directory submissions, the pace is based on two factors:
- How many links do you have now, and of what quality?
- Can you keep the same pace over time, month in, month out?
-
I more or less agree with EGOL. Directory submissions are a thing of the past and are likely to get you in trouble nowadays. Getting backlinks is becoming harder and harder every day. You need to diversify more and make sure that all those links to you look as natural as possible. It's not a bad thing to do some linking yourself to get that initial push, but the best possible linking strategy is a naturally occurring one. Make sure you use all relevant social avenues open to you... Facebook pages, G+, LinkedIn, Pinterest, Instagram, StumbleUpon, and so on, as long as it makes sense for your site to be there and you keep up with posting. Hopefully those will generate natural links back to your site as people learn who you are and grow to like your site.
-
Thank you, sir, for the valuable suggestion. So what link building strategies should I apply to rank my keywords on the 1st page of Google?
-
I think that they can be harmful to your site - especially if you use keyword anchor text.
If these are the only types of links that you have, I think that your site will be hit by Penguin.
-
I didn't get you, sir.
-
If you do one or two per day... it will be enough to get you in trouble by the end of next year.
Related Questions
-
Why does my site have so many crawl errors relating to the WordPress login / captcha page?
Going through the crawl of my site, there were around 100 medium-priority issues, such as "title element too short" and "duplicate page title," and around 80 high-priority issues relating to duplicate page content. However, every page listed with these issues was the site's WordPress login / captcha page. Does anyone know how to resolve this?
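One common fix, sketched below under the assumption that the flagged URL is the standard WordPress login page (`/wp-login.php`), is to exclude it from crawling in robots.txt, and/or to exclude the URL in the crawl tool's own settings, so it stops being counted against your duplicate-content totals:

```
User-agent: *
Disallow: /wp-login.php
```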
Technical SEO | ZenyaS
Will a Robots.txt 'disallow' of a directory, keep Google from seeing 301 redirects for pages/files within the directory?
Hi, I have a client that had thousands of dynamic PHP pages indexed by Google that shouldn't have been. He has since blocked these PHP pages via a robots.txt disallow. Unfortunately, many of those PHP pages were linked to by high-quality sites multiple times (instead of the static URLs) before he put up the 'disallow'. If we create 301 redirects for some of these PHP URLs that still show high-value backlinks and send them to the correct static URLs, will Google even see these 301 redirects and pass link value to the proper static URLs? Or will the robots.txt keep Google away, so we lose all these high-quality backlinks? I guess the same question applies if we use the canonical tag instead of the 301: will the robots.txt keep Google from seeing the canonical tags on the PHP pages? Thanks very much, V
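The mechanics here can be sketched with Python's stdlib `urllib.robotparser`: a crawler that honors robots.txt checks `can_fetch()` before requesting a URL, and if that returns False the URL is never fetched, so any 301 (or canonical tag) served on it goes unseen. The paths and URLs below are hypothetical, assuming the blocked PHP pages live under a single directory:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents for the client's site.
robots_lines = [
    "User-agent: *",
    "Disallow: /dynamic/",
]

rp = RobotFileParser()
rp.parse(robots_lines)

# A compliant crawler calls can_fetch() BEFORE requesting a URL. If it
# returns False, the URL is never fetched, so a 301 redirect or canonical
# tag on that URL is never discovered and can pass no link value.
blocked_url = "https://example.com/dynamic/page.php?id=42"
static_url = "https://example.com/static-page/"

print(rp.can_fetch("Googlebot", blocked_url))  # False: the 301 is never seen
print(rp.can_fetch("Googlebot", static_url))   # True: crawled normally
```

The practical consequence: to let the 301s pass value, the disallow would have to be lifted (at least temporarily) so the redirects can be recrawled.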
Technical SEO | Voodak
Too many links?
Hello! I've just started with SEOmoz and am getting an error about too many links on a few of my blog posts. It's on pages with high numbers of comments, and the links are coming from each commenter's profile (hopefully that makes sense; they're not just random stuffed links). Is there a way to keep this from causing a problem? Thanks!
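The warning is usually triggered when a page's anchor count crosses a threshold (around 100 links in the guideline of the time). A quick way to see how a comment-heavy post stacks up is to count the anchors in its HTML with the stdlib parser; the sample HTML below is a made-up stand-in for a post with 120 commenter-profile links:

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Counts <a ... href=...> tags in an HTML document."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

# Hypothetical blog post whose comment section adds 120 profile links.
html = "<html><body>" + "<a href='/profile/u'>user</a>" * 120 + "</body></html>"

counter = LinkCounter()
counter.feed(html)
print(counter.count)  # 120 -- above the ~100-link guideline the crawler warns on
```

If the count comes mostly from commenter profiles, options include paginating comments or removing the profile links, rather than cutting editorial links.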
Technical SEO | PaulineMagnusson
Why has my keyword dropped and risen so much in only 7 days?
Sorry if this question is really obvious (or unanswerable), but my client asked me for an explanation, and as a relative SEO 'noob' I couldn't really answer them. The client's site http://www.solentlep.org.uk/ dropped 29 places for the keyword LEP last week (all the other keywords had small rises and falls) and then this week has jumped back up the same 29 places. I made no changes to the site in that 7-day period (as I hadn't worked out what to change), and although everything is fine now, I'd still like to let the client know what caused the blip. Or is it one of those things that 'nobody knows'?
Technical SEO | bendyman
A huge drop in rankings since last 10 days, and not recovered yet.
Hi Mozzers, I have a serious topic to discuss and want help from the experts here. Our website has PR 6, and we have been consistently staying at the top for very competitive terms in the niche. Since last Friday (24th February, 2012) we have been facing massive fluctuation in the rankings for most of the keywords we are focusing on. After this fall, we checked the following details but didn't find any serious/critical issue that might be contributing towards these fluctuations: We analyzed Google Webmaster Tools; there's no update/warning from Google regarding any negative activity, and other things seem to be normal. We checked our website through a site search (site: www.domain.com) and found that we haven't lost any indexed pages; things appear as they used to. So we are sure that we haven't been banned or penalized. We also cross-verified our link building and other promotional activities, and we didn't find anything suspicious that could lead to such a big fluctuation. The drop is really big: some keywords went to the 5th or 6th page from a top-3 position; some keywords are not in the top 200 or 300 spots that were usually staying put between 5th and 10th position. We have analyzed a lot but haven't come to know the reason why we are facing this fluctuation. Our website is 4 years old, and this kind of fluctuation has happened for the first time. Has anyone faced this kind of issue before? I'm looking forward to your support in identifying this trouble. Thanks
Technical SEO | ValSmith
How to safely reduce the number of 301 redirects / should we be adding so many?
Hi All, We lost a lot of good rankings over the weekend with no obvious cause. Our top keyword went from p3 to p12, for example. Site speed is pretty bad (slower than 92% of sites!), but it has always been pretty bad. I'm on to the dev team to try and crunch this (beyond image optimisation), but I know that something I can affect is the number of 301 redirects we have in place. We have hundreds of 301s because we've been, perhaps incorrectly, adding one every time we find a new crawl error in GWT, even when it isn't because of a broken link on our site, or when it's from an external site where we can't track down the webmaster to fix the link. Is this bad practice, and should we just ignore 404s caused by external broken URLs? If we wanted to reduce these numbers, should we think about removing the ones that are only in place due to external broken URLs? Any other tips for safely reducing the number of 301s? Thanks, all! Chris
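Beyond removing rules for dead external links, one safe first step when pruning a large 301 map is to flatten chains so every old URL points directly at its final destination; chained redirects cost extra hops on every request. A minimal sketch, assuming the rules live in a simple source-to-target dict (the URLs below are made up):

```python
def flatten_redirects(redirects):
    """Resolve each source URL to its final target, collapsing chains
    like A -> B -> C into A -> C. Raises on redirect loops."""
    flattened = {}
    for src in redirects:
        seen = {src}
        target = redirects[src]
        while target in redirects:  # follow the chain to its end
            if target in seen:
                raise ValueError(f"redirect loop at {target}")
            seen.add(target)
            target = redirects[target]
        flattened[src] = target
    return flattened

# Hypothetical redirect map accumulated from GWT crawl-error fixes.
rules = {
    "/old-page": "/moved-page",
    "/moved-page": "/final-page",
    "/typo-url": "/final-page",
}
print(flatten_redirects(rules))
# {'/old-page': '/final-page', '/moved-page': '/final-page', '/typo-url': '/final-page'}
```

The flattened map can then be cross-checked against server logs: rules whose source URLs receive no traffic and have no known backlinks are the safest candidates to retire.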
Technical SEO | BaseKit
Site:www.tld.com order: is it a measure of Google's per-page importance?
Hello, does the order of pages in a site:www.tld.com search show how important each page is to Google? What if the homepage is not the first result?
Technical SEO | adamzski
Link juice distributed to too many pages. Will noindex,follow fix this?
We have an e-commerce store with around 4000 product pages. Although our domain authority is not very high (we launched our site in February and now have around 30 RDs), we did rank on lots of long-tail terms and generated around 8000 organic visits per month. Two weeks ago we added another 2000 products to our existing catalogue of 2000 products, and since then our organic traffic has dropped significantly (more than 50%). My guess is that link juice has been distributed across too many pages, causing rankings to drop overall. I'm thinking about noindexing 50% of the product pages (the ones not receiving any organic traffic). However, I am not sure if this will lead to more link juice for the remaining 50% of the product pages. So my question is: if I noindex,follow page A, will 100% of the link juice go to page B INSTEAD of page A, or will just a part of the link juice flow to page B (after flowing through page A first)? Hope my question is clear 🙂 P.s. We have a Dutch store, so the traffic drop is not a Panda issue 🙂
Technical SEO | DeptAgency
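For reference, the tag being discussed is a one-line meta robots directive placed in the `<head>` of each thin product page; the example below is illustrative only, not a recommendation for this specific store:

```html
<!-- Drops the page from the index while still allowing crawlers
     to follow (and pass value through) the links on the page. -->
<meta name="robots" content="noindex,follow">
```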