The best tool
-
Hi friends!
I have a big question: which is the best tool for SEO? I'm already using a lot of tools, but I'd like to know more ways to get my website to the top of the results.
I hope you can help me!
Regards,
Carlos Zambrana
-
Many thanks, Amelia, for your help!
-
I use a lot of the ones you've mentioned here too, but in addition: SEMrush, which I don't think anyone has mentioned yet?
It's good for rankings, competitor analysis, AdWords analysis, and SEO audits: it finds broken links, missing alt text, duplicate meta tags, and so on. I find the reports it generates quite helpful and easy to use. I've used it for keyword analysis in the past too, but as my site is up and running I don't really do a lot of that anymore; I know what keywords bring in traffic. I'm pretty sure I only scratch the surface of it and that there is a lot more to be got from it, but I only have so many hours in a day!
I don't just use one tool, though. I use a few at once and combine the findings; I don't expect one tool to do it all.
Hope this helps,
Amelia
-
You are right! Thanks for your answer.
-
There's no such thing as the best tool for SEO. SEO is huge, so there are different tools for different purposes.
-
That tool is similar to Accuranker. If you have time, you should try it.
Regards
-
No, sorry.
-
I don't know Accuranker; I use CuteRank. Do you know it?
-
Accuranker is cheap, and it lets me instantly see how a page ranks for new content, and after content is updated. I find it invaluable: you can submit a page to Webmaster Tools via Fetch and then check rankings five minutes later to see the effect (noting that you may see an initial effect and then get more movement some days later).
CognitiveSEO is great if you need full backlink data, e.g. for link cleanups or if you get hit by negative SEO. Basically, it's cheaper than signing up for Ahrefs, Majestic, and Moz separately, although Moz is great for all its other tools.
-
And the other tools you mentioned, are they good?
-
I use Moz, Screaming Frog, SEMrush, Ahrefs, CuteRank, and Xenu (which I started using on Monday), and I can't remember the rest.
-
I only use two regularly at the moment:
- Moz
- Accuranker (a great rank tracker where you can update your rankings instantly, as many times as you want)
Occasionally I will also use CognitiveSEO, as it takes in Moz, Ahrefs, and Majestic link data as well as Webmaster Tools data from your account, and it's useful for cleaning up backlinks (I was hit by negative SEO last year).
I have tried many other tools, but these are the ones I keep going back to.
-
What tools do you use? Maybe I can recommend some others that way.
-
Thanks, Andy. I know all of them, but I'm going to look for new SEO tools. I hope I get lucky. Thanks for everything.
-
Thanks a lot! I already use other tools and know all the ones you mentioned, but I'd like to try some new ones. Thanks for your answer.
-
Unfortunately, there isn't one amazing tool. As everyone has said, it's all about using multiple tools and picking the best one for the specific task you're trying to complete.
It also depends on what budget you have for SEO software. I'm quite fortunate in that I have a lot of software approved by my MD, but I know some people who really struggle.
I use:
- Moz
- Rankwatch - for rank tracking
- Ahrefs
- Searchmetrics
- SEMrush
- Majestic
- Screaming Frog
Those are the ones I use frequently; there are a couple more that I can't remember off the top of my head which I use for certain tasks and projects.
I wish there was one best solution. All of the above are good, but none are great in isolation, and they need to be used together.
Thanks
Andy
-
SEO tools are like personal assistants:
- you will never find the "best" one
- they never do your job for you (they only assist you and speed you up)
- if you have the budget for more of them, you get more work prepared for you
That's my shortlist.
I personally use Screaming Frog + Moz + Majestic + Xovi.
I hope this helps.
Gr., Keszi
-
Thanks for your answer
-
Hello Carlos,
There isn't such a thing as the best SEO tool; it always depends on you. I suggest you test some tools (most of them offer a free trial month) and decide what works best for you.
Related Questions
-
Tool for user intent
Hello, is there a tool that can tell me what the user intent behind my keyword is and how I should present my page (the type of content users want to see, the questions they want answered)? Thank you.
-
Best method for blocking a subdomain with duplicated content
Hello Moz community, hoping somebody can assist. We have a subdomain, used by our CMS, which is being indexed by Google:
http://www.naturalworldsafaris.com/
https://admin.naturalworldsafaris.com/
The page is the same, so we can't add a noindex or nofollow. I have both set up as separate properties in Webmaster Tools. I understand the best method would be to update the robots.txt with a user-agent disallow for the subdomain, but the robots.txt is only accessible on the main domain: http://www.naturalworldsafaris.com/robots.txt. Will this work if we add the subdomain exclusion to that file? It means it won't be accessible at https://admin.naturalworldsafaris.com/robots.txt (where we can't create a file) and therefore won't be seen within that specific Webmaster Tools property. I've also asked the developer to add password protection to the subdomain, but this does not look possible. What approach would you recommend?
-
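For what it's worth, crawlers only read the robots.txt served by the host they are crawling, so a disallow added to the www robots.txt would not block the admin subdomain. One common workaround is to have the web server return a different robots.txt per hostname. A minimal sketch, assuming an Apache setup with the rule in the virtual-host config (the file name robots-admin.txt is hypothetical):

```apache
# Serve a blocking robots file whenever the request arrives on the admin host
RewriteEngine On
RewriteCond %{HTTP_HOST} ^admin\. [NC]
RewriteRule ^/robots\.txt$ /robots-admin.txt [L]

# where /robots-admin.txt contains:
#   User-agent: *
#   Disallow: /
```

Another option sometimes used in this situation is sending an `X-Robots-Tag: noindex` HTTP header only on the admin host, which removes the pages from the index without touching the shared HTML.
-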
Best practices on setting up a multi-country Magento store
We run Magento and we're in the process of redesigning our site. We want the site to have separate storefronts for different countries, but we won't have the site language translated initially. We're thinking we'll use the Magento multi-store feature and have stores like /fr, /de, /en-us, /en-au, etc. Is the best practice to use hreflang for the non-English stores which haven't yet been translated? For example, setting the French store as hreflang="en-fr", essentially saying the page is aimed at people in France, but is in English. The separate storefronts will have things like currency and tax localised to each country and will gradually be translated, especially the more generic strings like "Add to Cart", "Checkout", etc. Or should it be targeted at the French language and country, despite not all content being translated into French? Or is there a better way to do this?
-
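A minimal hreflang sketch for the setup described above (URLs are illustrative; hreflang takes a language code plus an optional region code, so "en-fr" means English-language content for users in France):

```html
<!-- In the <head> of every storefront version of the same page -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/en-us/widget" />
<link rel="alternate" hreflang="en-au" href="https://www.example.com/en-au/widget" />
<!-- French storefront, still in English: English language, France region -->
<link rel="alternate" hreflang="en-fr" href="https://www.example.com/fr/widget" />
<!-- Fallback for users matching none of the above -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/widget" />
```

Once the /fr store is actually translated, its annotation would switch to hreflang="fr-fr" (or just "fr"), since hreflang is meant to describe the language the content is really in.
-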
What is the best practice to eliminate my IP-address content from showing in SERPs?
Our eCommerce platform provider has our site load-balanced across a few data centers. Our site has two of our own exclusive IP addresses associated with it (one in each data center). The problem is that Google is showing our IP addresses in the SERPs with what I would assume is bad duplicate content (our own, at that). I brought this to the attention of our provider and they say they must keep the IP addresses open to allow their site-monitoring software to work. Their solution was to add robots.txt files for both IP addresses with site-wide/root disallows. As a side note, we just added canonical tags, so the pages indexed under the IP addresses ultimately point to the correct (non-IP) URL via the canonical. So here are my questions: is there a better way? If not, is there anything else we need to do to get Google to drop the several hundred thousand indexed pages at the IP-address level? Or do we sit back and wait now?
-
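For reference, the canonical setup described above looks like this (domain and path are placeholders; 203.0.113.10 is a documentation-range IP): every page served from an IP address points back to its domain-based URL, telling Google which version to index:

```html
<!-- Served at http://203.0.113.10/some-product, pointing to the real domain URL -->
<link rel="canonical" href="https://www.example.com/some-product" />
```

Worth noting: if the IP hosts are fully disallowed in robots.txt, crawlers can't fetch the pages to see this tag at all, so the root disallow and the canonical tags tend to work against each other.
-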
Best approach for a client with another site for the same company
I have a client who has an old website, and Company A handles the SEO campaign for that site. My client wanted us to create a new website with unique content for the same company, aiming to double his chances of ranking on the first page of the SERPs and eventually dominating it. So we created the new site for him and handled its SEO campaign. So far we are ranking decently on the search engines, but we feel we could do better. The site we are optimizing uses the same company name, tracking number, and a virtual address in the same city.
Do you think Google has a problem with this setup? We have listed the new site in the citation directories, but I'm worried that we are sending Google mixed signals: the company has two listings in each directory, one for the old site and another for the new site. Another thing: the Google+ Local page for the new site is created and verified but is not showing up in the local pack. What is the best way to approach this mess? We are looking to rank for both local and organic results.
-
Best method to update navigation structure
Hey guys, we're doing a total revamp of our site and will be completely changing our navigation structure. Similar pages will exist on the new site, but the URLs will be totally changed. Most incoming links just point to our root domain, so I'm not worried about those, but the rest of the site does concern me. I am setting up 1:1 301 redirects for the new navigation structure to handle getting incoming links where they need to go. What I'm wondering is: what is the best way to make sure the SERPs are updated quickly without trashing my domain quality, while ensuring my page and domain authority are maintained? The old links won't be anywhere on the new site. We're swapping the DNS record to the new site, so the only way for the old URLs to be hit will be incoming links from other sites.
I was thinking about creating a sitemap with the old URLs listed and leaving that active for a few weeks, then swapping it out for an updated one (currently we don't have one; we're kind of starting from the bottom with SEO). We could also use the old URLs on the new site for a few weeks to ensure they all get updated. It'd be a bit of work, but it may be worth it. I read this article and most of that seems to be covered, but I just wanted to get the opinions of those who may have done this before, as it's a pretty big deal for us: http://www.seomoz.org/blog/uncrawled-301s-a-quick-fix-for-when-relaunches-go-too-well
Am I getting into trouble if I do any of the above, or is this the way to go? PS: I should also add that we are not changing our domain. The site will remain on the same domain, just with a completely new navigation structure.
-
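The 1:1 redirect mapping described above is typically done at the web-server level. A minimal sketch, assuming an Apache setup (all paths are hypothetical examples):

```apache
# .htaccess on the relaunched site: permanent (301) one-to-one redirects
# from the old navigation URLs to their new equivalents
Redirect 301 /old-about-us  /company/about
Redirect 301 /old-services  /what-we-do
Redirect 301 /old-contact   /company/contact
```

Since the only traffic to the old URLs will come from external links, these redirects are usually kept in place long-term rather than for just a few weeks, so that link equity keeps flowing to the new pages.
-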
Canonical meta tag best practices
I've noticed that some website owners use canonical tags even when there may be no duplicate issues. For example, www.examplesite.com has a canonical tag:
<link rel="canonical" href="http://www.examplesite.com/" />
and www.examplesite.com/bluewidget has a canonical tag:
<link rel="canonical" href="http://www.examplesite.com/bluewidget/" />
Is it recommended or helpful to do this?
-
Best URL structure
I am making a new site for a company that services many cities. I was thinking of a URL structure like this: website.com/keyword1-keyword2-keyword3/cityname1-cityname2-cityname3-cityname4-cityname5. Will this be the best approach to optimize the site for the keyword plus five different cities, as long as I keep the total URL under the SEOmoz-recommended 115 characters? Or would it be better to build separate pages for each city, rewording the main services to try to avoid duplicate content?