Anyone Used ScrapeBox or SENukeX Before?
-
I have been looking at trying out ScrapeBox or SENukeX for a while, but I don't want to waste my money. Has anyone tried them with good results? I am not necessarily looking for an automated submission platform. I simply want a tool that tells me which sites are relevant to mine, which are dofollow, etc. That is what I would be using them for.
-
ScrapeBox is an excellent tool for blog and forum discovery. SENukeX doesn't really help in that department; it's only a decent tool if you find creative ways to use it, like building your own blog networks.
-
Yes, thank you for that information. I wasn't looking to use ScrapeBox as a means to auto-generate links. I only need something to help me with the discovery process of finding sites to get links from.
-
Hey Keri,
Thanks for the link; it was a good read. Funny thing: once I figured out how SENuke spins articles, I started to notice them. Several times I have found myself reading an article and thinking that I would really enjoy reading the original version. Frankly, I can't stand spun articles, and I hope people and search engines alike can learn the difference between an original article and a spun one. Anyone doing it should be penalized.
Having said that, if I can read a spun article without noticing (and I probably have), that's good enough for me.
I would also expect the search engines to be a little more aggressive about spun articles than they are about paid links. Your competitor is much less likely to spin articles on your behalf than they are to build crappy links for you.
David
-
Check out this thread from earlier this month, where someone evaluated SENuke and decided against it. You can read his experience and the opinions of other people as well. Generally, the opinion was not positive.
-
I loaded it on my computer and it looked hard to use, at best. After educating myself more about what SEO really is, I decided against actually using it. IMO it may have been good at one time, but I think the search engines are getting wise to this kind of thing. It looks like a really good way to get sandboxed to me.
Related Questions
-
Using CSS to hide anchor text
Hi all, On my website I would like to use CSS so that the anchor text of a link is "website design service" (my company provides web design services) but the visible button text is "websites", for design reasons. So the anchor text for the link is "website design service", but what users actually see is "websites". Does this sound spammy to Google? Is it a risky move that might hurt my SEO? Looking for some advice here. Thank you very much. Best,
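For concreteness, here is a hypothetical sketch of the kind of markup being described (class names and URL are illustrative, not from the asker's site). The inner span carries the keyword-rich anchor text but is hidden, while CSS injects the short label users see; this is the pattern that search engine guidelines generally classify as hidden text:

```html
<style>
  /* Keyword-rich anchor text, hidden from users */
  .nav-link .seo-text { display: none; }
  /* Short label that users actually see */
  .nav-link::after { content: "websites"; }
</style>

<a class="nav-link" href="/web-design">
  <span class="seo-text">website design service</span>
</a>
```

Whether the crawler "counts" the hidden span or the injected label, the mismatch between rendered and source text is exactly what makes this risky.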
-
Using a folder blocked by robots.txt before uploading to an indexed folder - is that OK?
I have a folder "testing" within my domain which is blocked in my robots.txt. My web developers use the "testing" folder when we are creating new content, before it is uploaded to an indexed folder. So the content is uploaded to the "testing" folder first (which is blocked by robots.txt) and later uploaded to an indexed folder, while a copy is permanently kept in the "testing" folder. In fact, my entire website's content is located within "testing": the URL structure is the same as for the indexed pages, except each URL starts with "testing/". Question: even though the "testing" folder will not be indexed by search engines, is there a chance they notice that the content appears first in the "testing" folder, so that the indexed folder is not guaranteed to get credit for the content, since search engines see it in "testing" despite the robots.txt block? Would it be better to password-protect the "testing" folder? Thx
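A minimal sketch of the robots.txt rule being described (the path is taken from the question; everything else is boilerplate):

```
# robots.txt at the domain root
User-agent: *
Disallow: /testing/
```

Worth remembering: robots.txt blocks crawling, not indexing. A blocked URL can still end up in the index if other pages link to it, and the engines simply won't see its content. Password protection (e.g. HTTP auth), which the asker mentions, is the stronger option because it prevents any access at all.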
-
When is it recommended to use a self-referencing rel="canonical"?
In what type of situation is it best practice to use a self-referencing rel="canonical" tag? Are there particular things to be cautious of when using one? I see this practice mainly on larger websites, but I can't find any information that really explains when it's a good idea for SEO purposes. I appreciate all feedback. Thank you in advance.
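As a concrete illustration (URL is hypothetical), a self-referencing canonical is simply a page declaring its own clean URL as the canonical one. The common motivation is that the same page is often reachable under parameterized or mixed-case URLs, and the tag consolidates those variants:

```html
<!-- In the <head> of https://www.example.com/widgets/ -->
<!-- Also served when the page is requested as
     /widgets/?utm_source=newsletter or /Widgets/ -->
<link rel="canonical" href="https://www.example.com/widgets/" />
```

The main thing to be cautious of is generating the tag dynamically from the requested URL (parameters and all), which defeats the purpose.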
-
Should I use rel=canonical on similar product pages?
I'm thinking of using rel=canonical for similar products on my site. Say I'm selling pens and they are all very similar: a big pen in blue, a pack of 5 blue Bic pens, a pack of 10, 50, 100, etc. Should I rel=canonical them all to the best seller, as it's almost impossible to make the pages unique? (I realise these should really be attributes rather than separate products, but I'm sure you get my point.) It seems sensible to have one master canonical page for Bic pens with a great description, video content, good images, and linked articles, rather than loads of duplicate-looking pages. I'd love to hear thoughts from the Moz community.
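A sketch of the approach described above, with illustrative URLs: each near-duplicate variant page points its canonical at the master product page rather than at itself.

```html
<!-- In the <head> of /bic-pens-pack-of-10 and /bic-pens-pack-of-50 -->
<link rel="canonical" href="https://www.example.com/bic-pens/" />
```

One caveat: search engines treat rel=canonical as a hint, not a directive, so if the variant pages differ too much from the master (different prices, titles, content), the hint may be ignored and the variants indexed anyway.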
-
Using IP to deliver different sidebar content on homepage
We have a site with a generic top-level domain, and we'd like to use a small portion of the homepage to tailor content based on the IP of a visiting user. The content is for product dealerships around different regions/states of the US, not internationally. The idea is that someone from Seattle would see dealerships for this product near their location. The section on the homepage is relatively small and would churn out 5 links and images according to location. The rest of the homepage would be the same for everyone, including links to news, reviews, and fuller content. We have landing pages for regional/state content deeper in the site that don't use IP to deliver content and have unique URLs for the different regions/states. An example is a "Washington State Dealerships" landing page with links to all the dealerships there. We're wondering what kind of SEO impact there would be from having a section of the homepage deliver different content based on IP, and if there's anything we should do about it (or if we should be doing it at all!). Thank you.
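The sidebar logic described above can be sketched in a few lines. Everything here is illustrative: the prefix table, dealer data, and function names are hypothetical, and a real site would use a proper geolocation database (e.g. MaxMind GeoIP) rather than string prefixes. The key SEO-relevant detail is the fallback branch: crawlers and unknown IPs get a stable default, so the bulk of the page stays consistent.

```python
# Illustrative mapping of IP prefixes to US states (a real site
# would use a geolocation database, not hardcoded prefixes).
IP_PREFIX_TO_STATE = {
    "73.11.": "WA",
    "68.50.": "NY",
}

# Hypothetical dealer data, keyed by state.
DEALERS_BY_STATE = {
    "WA": ["Seattle Motors", "Tacoma Auto", "Spokane Dealers"],
    "NY": ["NYC Auto Group", "Buffalo Motors"],
}

NATIONAL_DEFAULT = ["Find a dealer near you"]


def sidebar_dealers(ip: str, limit: int = 5) -> list[str]:
    """Return up to `limit` dealer links for the visitor's region."""
    for prefix, state in IP_PREFIX_TO_STATE.items():
        if ip.startswith(prefix):
            return DEALERS_BY_STATE.get(state, NATIONAL_DEFAULT)[:limit]
    # Unknown IPs (including search-engine crawlers, which mostly
    # crawl from US data centers) get the default block, so the
    # crawled version of the homepage is stable and consistent.
    return NATIONAL_DEFAULT[:limit]
```

Since the crawler only ever sees one variant, linking from this block to the unique-URL state landing pages (which the asker already has) is what makes the regional content indexable.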
-
Question For Anyone
Hi all, would you be able to answer one small question? If you go to Australian Google (www.google.com.au) and search for "loans", at position #38 you will see the site paydayloansyouknow.com.au. It has only 3 pages, 0 links, PA 1, and DA 1. How is it possible to achieve such results? Here is a screenshot in case you don't see what I am asking about: http://www.freeimagehosting.net/oa75d . Will appreciate any answer.
-
Will using a service such as Akamai impact on rankings?
Howdy 🙂 My client has a .com site they are looking at hosting via Akamai. They have offices in various locations, e.g. UK, US, AU, RU, and some Asian countries. If they used Akamai, would the best approach be to set up separate sites per country: .co.uk, .com, .com.au, .ru, .sg, etc.? My understanding is that Googlebot is located in the US, so if it crawled any of those sites it would always get a US IP address. So is the answer perhaps to go with Akamai for the .com only, which should target the US market, and use different/separate C-class hosts for the others? Thanks! Woj
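One point worth noting for this setup: with country-code TLDs, the TLD itself already signals country targeting, and hreflang annotations let each site declare its regional alternates explicitly, regardless of which edge node (or IP) serves the response. A hedged sketch with illustrative domains:

```html
<!-- In the <head> of each regional site; every site lists all alternates. -->
<link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/" />
<link rel="alternate" hreflang="en-us" href="https://www.example.com/" />
<link rel="alternate" hreflang="en-au" href="https://www.example.com.au/" />
<link rel="alternate" hreflang="ru" href="https://www.example.ru/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```

With these signals in place, the IP address a CDN happens to serve from matters far less than server location alone would suggest.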
-
Do any of you regularly use expired domains?
I know there has been discussion on using expired domains in the past. This is not so much a question as to how to do it or whether it works, but rather I would love to see how many of you use this in your backlink strategy. I have a domain in a low to moderately competitive niche that ranks really well, mostly on the power of a couple of expired domains. I bought the domains, created a quick wordpress site and pointed some anchor texted links to the site. It took some time for the expired domains to regain their PR, but when they did, the benefit was great. I'm considering whether I want to do this with another domain of mine. On one hand, it's a relatively inexpensive way to get some good quality anchor texted links. But, on the other hand, something in it feels "immoral" or "sneaky" to me. What do you think?