Most efficient way to analyze keyword difficulty for thousands of keywords?
-
I've been trying to extract keyword difficulty for a list of 1,400 keywords and am wondering what the most efficient way to do this is. Any links to tutorials would be much appreciated.
-
One tool I occasionally find useful for larger lists is Market Samurai. It's a keyword analysis tool much loved by sectors of the market that don't frequent SEOmoz as much, but it can be useful.
The tool itself is actually a paid one, but you can run it in free mode and still get a few features. All of the keyword research functionality is included in the free version, and it's actually more useful than much of the paid functionality!
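For what it's worth, with a list that size it's usually easier to script the lookups than to paste keywords into a tool by hand. Here's a minimal sketch in Python, assuming a hypothetical `fetch_difficulty()` that you would wire up to whichever tool or API you end up using; the batch size and one-second delay are placeholder assumptions, not real limits from any specific service:

```python
import csv
import time

BATCH_SIZE = 100  # hypothetical per-request limit; check your tool's docs


def load_keywords(path):
    """Read one keyword per row from a CSV file."""
    with open(path, newline="") as f:
        return [row[0].strip() for row in csv.reader(f) if row]


def chunk(items, size):
    """Split a list into batches of at most `size` items."""
    return [items[i:i + size] for i in range(0, len(items), size)]


def fetch_difficulty(batch):
    """Placeholder: call your keyword tool's API here and return
    {keyword: difficulty} for the batch."""
    return {kw: None for kw in batch}


def analyze(path):
    """Run all keywords through the tool in batches."""
    results = {}
    for batch in chunk(load_keywords(path), BATCH_SIZE):
        results.update(fetch_difficulty(batch))
        time.sleep(1)  # stay under any rate limit
    return results
```

With 1,400 keywords and a batch size of 100, that's only 14 requests, which is far quicker than checking them one at a time.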
Related Questions
-
Is there a way to rel=canonical only part of a page?
Hi guys: I'm doing SEO for a boat accessories store. They sell many marine AC systems, for instance, and while the part number, number of BTUs, voltage, and accessories change between models, the description stays exactly the same across many of them. People often search Google by model number (they do this much more than entering a site and browsing by product model nowadays; it's easier), and I worry that if I add rel=canonical, the result for the specific model they're looking for won't come up, just the page everything is being pointed to.

Excuse my ignorance on this stuff; I'm good with link building and content creation, but the behind-the-scenes aspects, not so much. Can I rel=canonical only part of the page on the repeated models (the long description), so people can still search by model number and reach the model they're looking for? Or am I misunderstanding something about rel=canonical? (Interestingly, I rank very high for these pages with tons of repeated descriptions, number one in many places, but I wonder whether Google applies a sort of site-wide penalty for the repeated content. Then again, wouldn't ranking number one for these pages mean nothing's wrong?) Thanks
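For reference, rel=canonical is a page-level element placed in the `<head>`, so it applies to the whole page rather than to one section of it; there is no way to canonicalize only the description block. An illustrative tag (the URL is a placeholder, not from the actual store):

```html
<!-- Page-level: placed in the <head> of a duplicate model page,
     pointing at the version you want indexed. -->
<link rel="canonical" href="https://www.example.com/marine-ac/model-1234" />
```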
Intermediate & Advanced SEO | DavidCiti1 -
Is there an automated way to test what HREFLANG is ranking for in Google and Yandex?
Hi everyone, We implemented HREFLANG code for our international sites. We are wondering whether there is an automated way to test that HREFLANG is working, versus manually browsing each international search engine. Also, we implemented this a few days ago, and Google Webmaster Tools still hasn't picked up that we have it in place. I've heard it can take anywhere from 2-8 days; at what point should we see results? Our site is http://www.datacard.com. Is there an order that the site listings have to follow? For example, should x-default be the last item listed? Thanks, Laura
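As a partial answer to the automation part: before worrying about what the engines do with the annotations, you can at least script a check that they are present in each page's source. A rough stdlib-only Python sketch (you would fetch each URL yourself and pass the HTML in; this only extracts what is declared, it says nothing about rankings):

```python
from html.parser import HTMLParser


class HreflangParser(HTMLParser):
    """Collect (hreflang, href) pairs from <link rel="alternate"> tags."""

    def __init__(self):
        super().__init__()
        self.entries = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "alternate" and "hreflang" in a:
            self.entries.append((a["hreflang"], a.get("href")))


def check_hreflang(html):
    """Return the hreflang annotations found in a page's HTML."""
    p = HreflangParser()
    p.feed(html)
    return p.entries
```

Running this against each of your international URLs and comparing the returned sets is a quick way to spot pages whose annotations are missing or don't cross-reference each other.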
Intermediate & Advanced SEO | lauramrobinson320 -
Is there a way to make Google realize/detect scraper content?
Good morning, Theory states that duplicated content reduces certain keywords' positions in Google. It also says that a site that copies content will be penalized. Furthermore, we have spam report tools and the scraper report to inform against these bad practices. In my case, the website both sells content to other sites and writes and prepares its own content, which is not for sale. However, other sites copy the latter and publish it, and Google does not penalize their positions in results (not in organic results, nor in Google News), even though they are reported using the Google tools meant for that purpose. Could someone explain this to me? Is there a way to make Google realize/detect these bad practices? Thanks
Intermediate & Advanced SEO | seoseoseos0 -
Proper way to include Location & Zipcode Keywords
I have a client that is insisting that I add a list of approximately 50 cities and 80 zip codes that their business serves to the keywords meta tag. Based on what I have been reading, this will do absolutely nothing to improve their search ranking. What would be the proper way today to inform search engines of the geolocations a business serves?
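One commonly suggested alternative to the keywords meta tag (which, as you say, the major engines ignore for ranking) is structured data. A hedged sketch of schema.org LocalBusiness markup using the `areaServed` property; every value here is a placeholder, not the client's real data:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Plumbing Co.",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701"
  },
  "areaServed": ["Springfield", "Chatham", "Rochester"]
}
</script>
```

Pairing this with dedicated service-area pages and a complete Google Business Profile tends to be far more effective than any meta tag.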
Intermediate & Advanced SEO | mmurphy0 -
Safest way to launch a redesign
Hey MozFolk, I was wondering what the best and SAFEST way to handle this situation is. I am doing a redesign of our current website, but the new site will have different content. Should we just forward the entire root domain in the .htaccess file, or redirect each and every URL using a 301? I know these terms but have never actually done this myself, and I cannot risk losing the SEO weight of this website. Also, how do I handle a group of pages that they don't want to continue to use? Do I just leave those URLs be, or do I forward all of them to one new page (or the homepage) on the new site? Please help me look like a rockstar and save the ship from sinking itself!
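For reference, the two approaches mentioned above look roughly like this in .htaccess (Apache syntax; the domain and paths are placeholders, and you would pick one approach rather than combining them as written):

```apache
RewriteEngine On

# Option A: one page-by-page 301 per old URL. This preserves
# page-level relevance and is generally the safer choice when an
# old page has a clear equivalent on the new site.
Redirect 301 /old-page.html https://www.newsite.com/new-page/

# Option B: a catch-all 301 sending everything to the new homepage.
# Simpler, but it throws away page-level relevance.
RewriteRule ^ https://www.newsite.com/ [R=301,L]
```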
Intermediate & Advanced SEO | DerekM880 -
Best Way to Consolidate Domains?
Hello, My company has four websites in the same vertical, and we're planning to integrate them all into our main company site. So instead of www.siteone.com, www.sitetwo.com, www.sitethree.com, etc., it would be www.branddomain.com/site-one, www.branddomain.com/site-two, etc. I have a few questions: Should we redirect the old domains to the new directories, or leave the old domains up, stop updating them with new content, and then 301 the old content, links, etc. to the same content on the new site? Should we literally move all of the content to the new directories? Any tips are appreciated. It's probably pretty obvious that I don't have a ton of technical skills; my development team will be doing the heavy lifting. I just want to be sure we do this correctly from an SEO perspective! Thanks for the help; please let me know if I can clarify anything. E
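If you do redirect, the usual pattern is a path-preserving 301 from each old domain into its new directory, so every old URL maps to its direct equivalent. A rough Apache sketch, assuming the old sites remain on servers where you control the config (the domains here are the placeholders from the question):

```apache
# In siteone.com's .htaccess: map every old URL to the matching
# path under the new directory, e.g.
# http://www.siteone.com/page -> https://www.branddomain.com/site-one/page
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?siteone\.com$ [NC]
RewriteRule ^(.*)$ https://www.branddomain.com/site-one/$1 [R=301,L]
```

Repeat the same rule on each of the other old domains with its own directory.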
Intermediate & Advanced SEO | essdee0 -
Which is the best way to manage duplicate content in an international portal?
We have a portal which served only Spain, and we have started to internationalize it to Argentina, Mexico, and Colombia. Before, we had a .com domain with content only for Spain, and now that domain is going to be global. So: .com contains all the content, and you can filter by country; .es contains Spanish content; .com.ar contains Argentinian content. Everything is OK, but the problem is that there is content (online courses) that appears in every country. What we thought to do is: online-course URLs canonical to the .com domain; geo content URLs canonical to the .es or .com.ar domain (depending on the geo). Since filters on .com and .es can give similar results, we would either not use a canonical URL or follow the rule above (if there is a geo filter on .com, then canonical to the geo domain; if the filter is online courses, then canonical to the .com domain). What do you think about that? Thank you in advance.
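For what it's worth, a scheme like the one described above is usually expressed in each page's `<head>` with a canonical plus hreflang annotations, so the shared course canonicalizes to .com while the country versions cross-reference each other. An illustrative sketch with placeholder URLs (not the actual portal's):

```html
<!-- On the .com page for a course that exists in every country -->
<link rel="canonical" href="https://www.example.com/online-courses/seo" />
<link rel="alternate" hreflang="es-ES" href="https://www.example.es/cursos/seo" />
<link rel="alternate" hreflang="es-AR" href="https://www.example.com.ar/cursos/seo" />
```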
Intermediate & Advanced SEO | ofuente0 -
Hundreds of thousands of 404's on expired listings - an issue.
Hey guys, We have a conundrum with a large e-commerce site we operate. Classified listings older than 45 days are throwing up 404's: hundreds of thousands, maybe millions (note that Webmaster Tools caps its report at 100,000). Many of these listings receive links. Classified listings that are less than 45 days old show other possible products to buy, based on an algorithm. It is not possible for Google to crawl expired listing pages from within our site; they are indexed because they were crawled before they expired, which means that many of them still show in search results.

-> My thought at this stage, for usability reasons, is to replace the 404's with content (other product suggestions) and add a meta noindex, in order to help our crawl equity and get the pages we really want indexed prioritised.

-> Another consideration is to 301 from each expired listing to the category hierarchy, to pass possible link juice. But since many of these listings are findable in Google, we feel that is not a great user experience.

-> Or shall we just leave them as 404's? Google sort of says that's OK.

Very curious about your opinions and how you would handle this. Cheers, Croozie. P.S. I have read other Q&As regarding this, but given our large volumes and situation, I thought it was worth asking, as I'm not satisfied that the solutions offered would match our needs.
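For reference, the first option described above boils down to serving the expired URL with a 200 status and a robots meta tag like this in the `<head>`, so the suggestion content stays visible to users arriving from old links while the page itself drops out of the index:

```html
<!-- On an expired-listing page that now shows product suggestions -->
<meta name="robots" content="noindex, follow" />
```

The `follow` directive keeps the internal links to the suggested products crawlable even though the page itself is excluded.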
Intermediate & Advanced SEO | sichristie0