Anyone Used ScrapeBox or SEONukeX Before?
-
I have been looking at trying out ScrapeBox or SENukeX for a while, but I don't want to waste my money. Has anyone tried them with success? I am not necessarily looking for an automated submission platform. I am simply looking for a platform to tell me which sites are relevant to mine, dofollow, etc. That is what I would be using them for.
-
ScrapeBox is an excellent tool for blog and forum discovery. SENukeX doesn't really help in that department and is only a decent tool if you find creative ways to use it, like building your own blog networks.
-
Yes, thank you for that information. I don't want to use ScrapeBox to auto-generate links; I only need something to help me discover sites to get links from.
-
Hey Keri,
Thanks for the link; it was a good read. Funny thing: once I figured out how SENuke spins articles, I started to notice them. Several times I have caught myself reading an article and thinking that I would really enjoy reading the original version. Frankly, I can't stand spun articles, and I hope both people and search engines learn to tell an original article from a spun one. Anyone doing it should be penalized.
That said, if I read a spun article and don't notice (and I probably have), that's good enough for me.
I would also expect the search engines to be more aggressive about spun articles than they are about paid links. Your competitor is much less likely to spin articles on your behalf than to build crappy links for you.
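For anyone curious what "spinning" mechanically means: most spinners expand a spintax template, replacing each {option|option} group with one random choice. SENuke's actual algorithm is not public, so this is a toy sketch of the general technique, not its implementation:

```python
import random
import re

# Matches one innermost {a|b|c} spintax group.
GROUP = re.compile(r"\{([^{}]*)\}")

def spin(text: str, rng: random.Random) -> str:
    """Replace each {a|b|c} group with one randomly chosen option."""
    while True:
        m = GROUP.search(text)
        if m is None:
            return text
        choice = rng.choice(m.group(1).split("|"))
        text = text[:m.start()] + choice + text[m.end():]

variant = spin("{Hello|Hi} there, this is a {spun|rewritten} article.",
               random.Random())
```

Run it a few times and you get slightly different variants of the same sentence, which is exactly the "I'd rather read the original" effect described above.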
David
-
Check out this thread from earlier this month, where someone was evaluating SENuke and decided against it. You can read about his experience and other people's opinions as well. Overall, the opinion was not positive.
-
I loaded it on my computer and it looked hard to use, at best. After educating myself more about what SEO really is, I decided against actually using it. IMO it may have been good at one time, but I think the search engines are getting wise to this kind of thing. To me, it looks like a really good way to get sandboxed.
Related Questions
-
Does using a hash menu system drive SEO power to my sub-pages?
My website (a large professional one) uses an interesting menu system. When a user hovers over text (which is not clickable), a larger sub-menu appears on the screen; when they hover over something else, this sub-menu changes or disappears. This menu is driven by a hash (#), which makes me wonder: is this giving my sub-pages an SEO kick? Or is there another way we should be doing this in order to get that SEO kick?
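As an aside, the fragment never reaches the server, which is why hash-only navigation rarely gives sub-pages independent SEO value: to a crawler, every `#whatever` variant is the same URL. A minimal standard-library sketch (the URL is hypothetical):

```python
from urllib.parse import urldefrag

# Everything after '#' is a client-side construct and is never sent
# in the HTTP request, so a crawler fetching these "two" URLs
# receives the exact same document.
base, frag = urldefrag("https://example.com/products#widgets")
# base: "https://example.com/products"    frag: "widgets"
```

If the sub-menu targets should rank on their own, they generally need real, crawlable URLs (plain `<a href>` links to distinct paths) rather than fragments.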
Intermediate & Advanced SEO | adamorn0
-
My Domain Authority dropped 9 points... Does anyone have suggestions to fix this significant drop?
My domain authority dropped by 9 points and I haven't done anything differently since the last scan. What is going on?
Intermediate & Advanced SEO | infotrust20
-
Can I use the old website content on the new website, after deleting it from the server?
My website nowwhatstudio.com was hit by Google's pure spam manual action. I created a new website (nowwhatmoments.com) with the same content as the old penalized website (nowwhatstudio.com). Google has removed my old website's content from its index. Can I use the same content for the new website? If I delete the old website from the server, can I then use its content for the new website? Or should I edit the old content to make it 80% original for the new website?
Intermediate & Advanced SEO | bondhoward0
-
Using the disavow tool for 404s
Hey Community, I've got a question about the disavow tool for you. My site is getting thousands of 404 errors from old blog/coupon/you-name-it sites linking to our old URL structure (which used underscores and ended in .jsp). It seems the webmasters of these sites aren't answering back or haven't updated their sites in ages, so the links return 404 errors. If I disavow these domains and/or links, will it clear out these 404 errors in Google? I read the GWT help page on it, but it didn't seem to answer this question. Feel free to ask any questions that may help you understand the issue more. Thanks for your help,
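For context, the file the disavow tool accepts is just plain text, one entry per line, with `#` comments; the domains below are made up:

```text
# Disavow everything from an unresponsive coupon site
domain:old-coupon-site.example

# Or disavow a single linking URL
http://some-old-blog.example/links/our_old_page.jsp
```

Note that disavowing only asks Google to ignore those links when evaluating your site; it doesn't stop the crawler from discovering the old URLs, so clearing the 404 reports themselves generally takes 301 redirects (or a 410) on your side.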
-Reed
Intermediate & Advanced SEO | IceIcebaby
-
Using Meta Header vs Robots.txt
Hey Mozzers, I am working on a site that has search-friendly parameters for its faceted navigation; however, this makes it difficult to identify the parameters in a robots.txt file. I know that using the robots.txt file is highly recommended and powerful, but I am not sure how to do this when facets use common words such as sizes. For example, a filtered URL may look like www.website.com/category/brand/small.html. Brand and size are both facets. Brand is a great filter, and size is very relevant for shoppers, but many products include "small" in the URL, so it is tough to isolate that filter in the robots.txt. (I hope that makes sense.) I am able to identify problematic pages and edit the <head>, so I can add a meta robots noindex tag to any page that is causing these duplicate issues. My question is, is this a good idea? I want bots to crawl the facets, but indexing all of the facets causes duplicate issues. Thoughts?
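To make the trade-off concrete, a robots.txt rule for the sample URL might look like this; the pattern is an assumption based on the example path, and wildcard support varies by crawler (Google supports `*` and the `$` end anchor):

```text
# Blocks /category/brand/small.html-style facet pages, but not
# product URLs that merely contain "small" elsewhere in the path.
User-agent: *
Disallow: /*/small.html$
```

The per-page alternative described in the question, `<meta name="robots" content="noindex, follow">`, lets the facets be crawled and only keeps them out of the index, which arguably matches the stated goal ("I want bots to crawl the facets") better than blocking crawling entirely.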
Intermediate & Advanced SEO | evan890
-
Are URL shorteners building domain authority every time someone uses a link from their service?
My understanding of domain authority is that the more links pointing to any page / resource on a domain, the greater the overall domain authority (and weight passed from outbound links on the domain) is. Because URL shorteners create links on their own domain that redirect to an off-domain page but link "to" an on-domain URL, are they gaining domain authority each time someone publishes a shortened link from their service? Or does Google penalize these sites specifically, or links that redirect in general? Or am I missing something else?
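On the redirect question, what usually matters is the status code of the shortener's hop. Here is a sketch of the conventional rule of thumb only; actual engine behavior is not public and is more nuanced than a two-bucket split:

```python
# Rule-of-thumb classification of redirect status codes.
PERMANENT = {301, 308}  # signals conventionally consolidated onto the target
TEMPORARY = {302, 307}  # historically treated as staying with the source URL

def credits_target(status: int) -> bool:
    """True if link equity is conventionally credited to the destination."""
    return status in PERMANENT
```

Most major shorteners answer with a 301, so by this reading the destination page, not the shortener's domain, receives the credit; the shortener's own pages aren't accumulating the linked content's authority.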
Intermediate & Advanced SEO | Jay.Neely0
-
Can anyone show me a site that has followed the SEOmoz SEO rules?
Hi, I have been reading the SEO information on here, which is very interesting, and I would like to know if anyone can point to any sites that have followed the rules and advice. It is great to read the info and rules, but I feel it is also better to see a site that has followed them and to hear from people who have put them into practice and can explain what results they got. I am currently building the following website, http://www.womenlifestylemagazine.com, so it would be great to see a site that has followed all the rules, from someone who can explain whether they work or not.
Intermediate & Advanced SEO | ClaireH-1848860
-
Use of the Canonical Tag, Both Internally and Cross Domain
I've seen the cross-domain canonical not work at all in my test cases. An interesting point was brought to my attention today: for the canonical tag to work, the page you are referencing needs to have the exact same content, and that this is the whole point of the canonical tag: not to act as a 301, but to consolidate pages with the same content. I want to know if this is true. Does the page you reference with a canonical tag have to have exactly the same content? And what have been your experiences using a canonical tag that references a page on a different domain with the same subject matter, but not exactly duplicate content?
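For anyone landing here, the cross-domain version of the tag looks identical to the in-domain one and lives in the <head> of the duplicate page; the domains below are placeholders. Google has documented it as a hint rather than a directive, which is consistent with the mixed results described above:

```html
<!-- Placed on the duplicate page, https://site-a.example/widgets-guide -->
<link rel="canonical" href="https://site-b.example/widgets-guide" />
```

Because it is only a hint, near-duplicate (rather than identical) content gives the engine more room to ignore it.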
Intermediate & Advanced SEO | GearyLSF372