What's your proudest accomplishment in regards to SEO?
-
After many years in the industry, you come to realize a few things. One of the biggest pain points for us at Web Daytona was being able to give clients a quick keyword ranking cost estimate. After plenty of trial and error, and by relying on API data from one of the most reliable SEO software providers in our industry, we developed an SEO tool that lets us quickly and accurately estimate the cost of ranking for a given keyword (or set of keywords) using multiple variables.
Most agencies can relate to that story. It’s something my colleagues and I at Web Daytona have been through before.
Estimating the cost and time needed to rank for a keyword is a time-consuming process. That’s why it’s common practice to sell SEO packages of 5-10 keywords for about $1,000-2,000 per month.
The problem is that not all keywords are equally valuable, and most clients know this. We constantly get questions from clients asking: “How much would it cost to rank for this specific keyword?” It’s difficult to answer that with a pricing model that treats every keyword as equally expensive to rank for.
So is the answer to spend a lot more time on tedious, in-depth keyword research? If we did, we could give our clients more precise estimates. But a decent proposal can already take 2-5 hours to put together, and agency life isn’t exactly full of free time, so that wouldn’t be ideal.
That’s when we asked ourselves a question: what if we could automate the research needed to find the cost of ranking keywords? We looked around for a tool that did this, but we couldn’t find one.
So we decided to build it ourselves. It wasn’t going to be easy. But after running an SEO agency for over a decade, we knew we had the expertise to create a tool that wouldn’t just be fast and reliable; it would also be precise.
Fast forward to today and we’re proud to announce that The Keyword Cost Estimator is finally done. Now we’re releasing it to the public so other agencies and businesses can use it too.
You can see it for yourself here.
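For readers curious how such an estimator works in principle: below is a minimal sketch of the general idea. The input variables (difficulty, volume, CPC) mirror what SEO data APIs commonly expose, but the weights and the formula are illustrative assumptions, not Web Daytona's actual model.

```python
# Hypothetical sketch of a keyword cost estimator. The inputs mirror common
# SEO API fields; the weights and formula are illustrative assumptions only.
def estimate_monthly_cost(difficulty, monthly_volume, cpc,
                          base_cost=200.0, hourly_rate=75.0):
    """Estimate the monthly cost (USD) of ranking for one keyword.

    difficulty:     0-100 keyword difficulty score from an SEO API
    monthly_volume: average monthly searches
    cpc:            average cost-per-click in USD (a proxy for commercial value)
    """
    # Harder keywords need more content and link-building hours.
    effort_hours = 2 + (difficulty / 100) * 20
    # Popular, commercially valuable keywords attract more competition.
    competition_multiplier = 1 + (monthly_volume / 10_000) * (cpc / 10)
    return round(base_cost + effort_hours * hourly_rate * competition_multiplier, 2)

# A low-difficulty local keyword vs. a competitive national one:
print(estimate_monthly_cost(difficulty=25, monthly_volume=500, cpc=4.0))
print(estimate_monthly_cost(difficulty=80, monthly_volume=20_000, cpc=12.0))
```

The point of even a toy model like this is that the same inputs always produce the same quote, which is what turns a 2-5 hour proposal step into an instant lookup.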
-
Increasing our DR from 48 to 62 in the last year. The pandemic has really changed the way we win business, so I’ve spent a lot of time reading, researching, and link building, and it’s paid off.
Related Questions
-
Daytona Beach Web Design vs. Daytona Web Design: What's Best?
Three months ago we had our team create local pages for some of the services we render, i.e., web design. As we reviewed the pages, we found they had created two pages with similar content, one at /daytona-beach-web-design/ and the other at /daytona-web-design/. We knew we had to kill one of them to avoid duplicate content. Here is where the hard decision came in, hence the question. We thought about keeping the /daytona-beach-web-design/ URL, but for some reason Google had already crawled the shorter version, /daytona-web-design/. So we ended up deleting the longer URL and kept Daytona Web Design instead. Which one would you keep, and have you experienced similar issues?
Local Website Optimization | WebDaytona
-
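Whichever URL is kept in a situation like the one above, the retired page should 301-redirect to the survivor rather than simply be deleted, so already-crawled URLs and any inbound links consolidate instead of returning 404s. A minimal, framework-agnostic sketch of such a redirect map (the helper function is hypothetical):

```python
# Hypothetical redirect map: when one of two near-duplicate local pages is
# retired, 301-redirect its URL to the kept page so link equity consolidates.
REDIRECTS = {
    "/daytona-beach-web-design/": "/daytona-web-design/",
}

def resolve(path):
    """Return (status, path): 301 plus the target for retired URLs, else 200."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 200, path

print(resolve("/daytona-beach-web-design/"))  # the retired long-tail URL
print(resolve("/daytona-web-design/"))        # the kept page
```

In practice the same mapping would live in the web server or CMS configuration; the dictionary form just makes the one-to-one "retired → kept" relationship explicit.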
Need SEO Opinions
SEO Campaign and Best Practices. Hello, I have dabbled with Moz before, and I’m coming back with a project that is going to be a marathon. I have a very general question for some of you SEO professionals. I would like to lay out a scenario, and I would appreciate any feedback you can offer. Let’s say I have a client that is an office cleaning company serving only the local Chicago area; let’s call them ABC123 Commercial Cleaners. They have a website, e.g., abc123commercialcleaners.com, which explains all about their services across several pages. If we start a site on another URL, say officecleaningchicago.com, will this hurt our online reputation, or will it make us stronger? Our plan on paper is to target the different localities around Chicago, for example:
napervillecommercialcleaners.com
officecleaningchicago.com
evanstonofficecleaning.com
I just wanted to get opinions on whether this is a good or bad idea, and whether it will strengthen or weaken our brand. I hope I have explained a very general (and completely fictitious) scenario. I would appreciate any feedback you can offer.
Local Website Optimization | Scott-Jones
-
Is it deceptive to attempt to rank for a city you're located just outside of?
I live in Greenville, SC (which has a large "Greater Greenville" reach). I work for an agency with many clients located just outside the city in smaller towns, sometimes technically in counties other than Greenville. Often they provide services in the city of Greenville and aim to grow business there, so we'll use "Greenville, SC" throughout site copy, in titles, and in meta descriptions. Are there any negative implications to this? Any chance search engines think these clients are being deceptive? And is it possible these clients are hurting their rankings in their actual locations by trying to appear to be Greenville-based companies? Thank you for any thoughts!
Local Website Optimization | engeniusbrent
-
Impact of .us vs .com on SEO rankings?
Our website is hosted on www.discovered.us. I have two questions. 1: We have had regular feedback that a .us domain is a negative for SEO and for conversion (customers don't like it). We are thinking of changing the domain to www.dscvrd.com. Any insights on the impact on our rankings (if any) if we do this? 2: We are focusing our SEO on global/USA first, but conversions in the UK are better. We currently do not have a multi-language SEO setup. What would the impact be of implementing www.discovered.co.uk on SEO in the UK? Thanks! Gijsbert
Local Website Optimization | Discovered
-
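On the UK question above: if a .com and a .co.uk site end up serving equivalent content to different markets, hreflang annotations tell search engines which version targets which country. A minimal sketch that generates the tags (the domains come from the question; the helper function itself is a hypothetical illustration):

```python
# Hypothetical hreflang tag generator for the two-domain setup discussed above.
# Each regional page should list every alternate version plus an x-default
# fallback, and every version should carry the same set of tags.
ALTERNATES = {
    "en-us": "https://www.dscvrd.com/",
    "en-gb": "https://www.discovered.co.uk/",
}

def hreflang_tags(alternates, x_default):
    """Build the <link rel="alternate"> tags to place in every version's <head>."""
    tags = [f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
            for lang, url in sorted(alternates.items())]
    tags.append(f'<link rel="alternate" hreflang="x-default" href="{x_default}" />')
    return "\n".join(tags)

print(hreflang_tags(ALTERNATES, x_default="https://www.dscvrd.com/"))
```

Without annotations like these, two English-language sites with overlapping content can compete with (or be filtered against) each other in the same results.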
Ideas on creating location-based service pages for SEO value while not worrying about local SEO?
Hello and thanks for reading! We have a bit of a rare issue: we are a nationwide distributor, but we have a local side that handles all tri-state area requests. The sales that happen locally basically don't impact the online side, so we're trying not to focus on local SEO in our own area, but in a sense to handle local SEO for areas we aren't based in. We want to try location-based service pages, but not for every state: at most 5 states, and inside those pages we'd target 2 to 3 big cities each. Is this a waste of time to even think about, or is it something that can be done with a careful touch?
Local Website Optimization | Deacyde
-
Which is the best, ".xx" or ".com.xx" in general and for SEO?
Hi, I'm working for a digital marketing agency that has traffic from different countries. We are planning to make a different website for each country. What is the best SEO practice when choosing between a ".xx" or ".com.xx" domain for Spain, Mexico, Chile, Colombia, and Peru? I think the ccTLD is always better, for example ".es" over ".com.es".
Local Website Optimization | NachoRetta
-
Local SEO HELP for Franchise SAB Business
This all began when I was asked to develop experiment parameters for our content protocol and strategy. It should be simple, right? I've reviewed A/B testing tips for days now, from Moz and other sources. I'm totally amped and ready to begin testing in Google Analytics.

Say we have a restoration service franchise with over 40 franchises we perform SEO for. They are all over the US. Every franchise has its own local website, e.g., restorationcompanylosangeles.com. Every franchise purchases territories in which they want to rank; some service over 100 cities. Most franchises also have PPC campaigns. As part of our strategy, we use the location reach data from AdWords to focus on their high-reach locations first. We have 'power pages', which include 5 high-reach branch preferences (areas the owners prefer to target) and 5 non-branch-preference high-reach locations. We are working heavily on our national brand presence and working with PR and local news companies to build relationships for natural backlinks. We are developing a social media strategy for national brand outlets and local outlets. We use major aggregators to distribute local citations for our branch offices, and we make sure all NAP data is consistent across all citations. We are partners with Google, so we work with them on developing branches to create their Google listings (My Business & G+). We use local business schema markup on all pages. Our content protocol encompasses all the needed on-site optimization tactics: meta tags, titles, schema, keyword placement, semantic Q&A, internal linking strategies, etc. Our leads are calls and form submissions. We use several call tracking services to monitor calls, callers' locations, etc. We are testing CallRail to start monitoring the landing pages and keywords that generate our leads.

Parts that I want to change: Some of the local sites have over 100 pages targeted at 'water damage + city', aka what Moz would call "doorway pages." These pages have 600-1000 words all talking about the services we provide. Our writers (4 of them) manipulate them so that they aren't duplicate pages, but they only add about 100 words about the city location; this is the only unique variable. We pump out about 10 new local pages a month per site, so yes, over 300 local pages a month. Traffic to the local sites is very scarce. The content protocol/strategy is only tested based on ranking! We have a tool that monitors rankings on all domains, but it does not account for mobile, local, or user-preference-based searching like Google Now. My team is deeply attached to basing our metrics solely on ranking. The logic is that if no local city page exists for a targeted location, there is less likelihood of ranking for that location; if you are not seen, you will not get traffic or leads. Ranking for power locations is poor, while less competitive, low-reach locations rank OK. We are updating the content protocol by tweaking small things (multiple variants at a time), then checking rankings every day for about a week to determine whether an experiment was a success.

What I need: An internal duplicate content analyzer, to prove that writing over 400 pages a month about water damage + city IS duplicate content. Unique content for 'power pages': I know from dozens of chats here in the community and in Moz blogs that we can only truly create quality content for 5-10 pages, meaning we need to narrow down which locations are most important to us and beef them up. Blog content for non-'power' locations. A new experiment protocol based on metrics like traffic, impressions, bounce rate, landing page analysis, domain authority, etc. A deeper dig into call metrics and their sources.

Now I am at a roadblock, because I cannot develop valid content experiment parameters based on ranking. I know that A/B testing requires two pages that are the same except for one variable. We'd either noindex these or canonicalize them; neither is compatible with testing rankings for the same term.

Questions: Are all these local pages duplicate content? Is there such a thing as content experiments based solely on ranking? Any other suggestions for this scenario?
Local Website Optimization | MilestoneSEO_LA
-
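The "internal duplicate content analyzer" requested in the question above can be prototyped in a few lines. A minimal sketch, assuming each page is available as plain text: split pages into overlapping word shingles and compare their Jaccard similarity (the function names, sample pages, and flagging threshold here are illustrative, not any Moz tool):

```python
def shingles(text, k=5):
    """Split text into overlapping k-word shingles (lowercased)."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(max(1, len(words) - k + 1))}

def jaccard_similarity(text_a, text_b, k=5):
    """Jaccard similarity of two pages' shingle sets: 1.0 = identical, 0.0 = disjoint."""
    a, b = shingles(text_a, k), shingles(text_b, k)
    return len(a & b) / len(a | b) if a | b else 0.0

# Two hypothetical 'water damage + city' pages that differ only in the city
# name score far above a typical ~0.2-0.3 near-duplicate flagging threshold.
page_naperville = "We provide 24/7 water damage restoration in Naperville with certified technicians."
page_evanston = "We provide 24/7 water damage restoration in Evanston with certified technicians."
print(round(jaccard_similarity(page_naperville, page_evanston, k=3), 2))  # prints 0.5
```

Run pairwise across the 400 monthly city pages, a score report like this would make the "only ~100 unique words per page" problem measurable rather than a matter of opinion.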
URL structure for local SEO
Hi folks, a question: which URL structure is best for local rankings? For example, say I want to rank for the keyword "Plumber Londen" and I don't have "plumber" in my brand. What is the best URL structure: example.com/plumber/londen or example.com/plumber-londen?
Local Website Optimization | remkoallertz