RE: Keep Losing Keyword Ranking Position for Targeted Keyword Terms, Can't Figure It Out, Please Help!!!
-
Hey Mozzers,
I am pulling my hair out trying to figure out why one of my clients keeps losing SERP position for their targeted keyword terms. We're actively pursuing local citations, making sure their NAP is consistent across the board, and refining on-page content to make sure we're maximizing every opportunity.
The only thing I've found is a 4xx error that Moz's crawl diagnostics keep returning. However, when I check for problems in Google Webmaster Tools, it doesn't report any errors.
Is this 4xx error the culprit? Do any of you have suggestions to help me improve SERP position for my targeted keyword terms?
Anyway, any and all insight would help. I'm at my wits' end. Thanks for reading and for all of your help!
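One quick way to see whether the 4xx that Moz reports is real is to request the flagged URLs directly and compare the status codes you get back; differences in user agent or redirect handling often explain why one tool sees an error and another doesn't. A minimal sketch, assuming the flagged URLs are pasted into a list (the URLs and user agent below are placeholders):

```python
import requests

# URLs that Moz crawl diagnostics flagged with a 4xx (placeholders)
flagged_urls = [
    "https://www.example.com/some-page/",
    "https://www.example.com/another-page/",
]

for url in flagged_urls:
    try:
        # Fetch with a browser-like user agent; some servers answer crawler
        # user agents differently, which can explain tool-to-tool mismatches
        resp = requests.get(url, timeout=10, allow_redirects=True,
                            headers={"User-Agent": "Mozilla/5.0"})
        print(f"{url} -> {resp.status_code}")
    except requests.RequestException as exc:
        print(f"{url} -> request failed: {exc}")
```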
-
Hi J.P.!
I'm glad Matt is helping you; he knows his stuff. There's actually a Moz tool that could provide some insight, too—the Keyword Difficulty tool. It can give you an analysis of all the sites ranking 1-10 for any given keyword, including which factors are most likely to be helping them. You may be able to glean what's helping them pull ahead.
I'd recommend Cyrus's video on the tool.
Good luck!
-
Got your PM & replied. You have mail.
-
What site & keyphrase are you looking at? Happy to take a look for you, but without more info we can't check backlinks, can't look at your on-site work, and can't see what else may be affecting you. Technical issues are typically very site-specific, so they depend on the site itself.
If you can't publicly post the URL, I'm happy to get a PM, but I'd rather solve it here if possible so more people benefit. Either is fine though.
-
Related Questions
-
Business has multiple locations, but wants to rank for commutable cities and geographies
Hello, The business I am working for has multiple locations, but the service they provide is one that you would commute for. At present, they have 20 or so pages of yucky, geographically keyword-stuffed content (think "New York computer services"), and they are based out of a suburb maybe 40 miles away. For some ridiculous reason, some of these pages are ranking for exact-match search terms. We are in the process of revamping the whole site, taking approximately five sites and integrating them into one mega-site. I want to figure out the best strategy for ranking in the region each location is in and serves, without being spammy like the previous SEO, and I want to eliminate the spammy pages without losing the rankings and link juice. What is the most appropriate and above-board strategy? These are my thoughts. Should I:
1. Keep the pages, but tweak them enough to make the content quality? If I do, should they be geo pages? Should they be "locations served" pages, statistics about the area, etc.?
2. Group the pages by region (one page per region), location-oriented and tweaked to still include the terms they were ranking for (without the spammy look and stuffing), along with a map, etc.? Then I have to figure out how to redirect so as not to lose the value we have now for some of them.
The company deals with treatment for addiction, so in your recommendations and tips, remember that our audience will commute by car, and eventually (hopefully) by plane. 😉 Thank you so so much for any and all help you can provide! Sorry for such a long description!
Local Website Optimization | | lfrazer1231 -
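On the redirect piece of the question above, the usual approach is a one-to-one 301 map from each retired geo page to the region page that replaces it. A rough sketch of that layer, assuming a Flask site; the paths in the map are hypothetical, and on Apache or nginx the same mapping would live in the server config instead:

```python
from flask import Flask, redirect

app = Flask(__name__)

# Hypothetical mapping from old keyword-stuffed geo pages to new region pages
REDIRECT_MAP = {
    "/new-york-computer-services": "/regions/new-york-metro",
    "/newark-computer-services": "/regions/new-york-metro",
    "/long-island-computer-services": "/regions/new-york-metro",
}

@app.route("/<path:old_path>")
def legacy_geo_pages(old_path):
    """Catch-all for retired geo pages; specific page routes still match first."""
    target = REDIRECT_MAP.get("/" + old_path)
    if target is None:
        return "Not found", 404
    # A 301 tells search engines the move is permanent, so the old page's
    # equity is consolidated onto the new region page
    return redirect(target, code=301)
```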
Creating a subdomain for IP targeting based on city
We are currently located in OKC and are opening a new location in Dallas. After much research, I found the best way to do the website is to create a subdomain and redirect people based on their IP location, so our current SEO will help give substance to the new location. My question is: should I recreate the whole website under this subdomain, using Dallas instead of OKC throughout, or should I just recreate one or two pages? This is all very new to me and I need as much help as I can get, lol.
Local Website Optimization | | KylieM0 -
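For a sense of what an IP-based redirect to the new subdomain could look like, here is a minimal sketch assuming a Python/Flask front end; `city_for_ip()` and the dallas.example.com subdomain are placeholders, and in practice the lookup would come from a GeoIP database or a CDN geolocation header. One design note: many sites suggest the local version via a banner rather than forcing the redirect, so crawlers and out-of-area visitors can still reach both sites.

```python
from flask import Flask, redirect, request

app = Flask(__name__)

def city_for_ip(ip_address):
    """Hypothetical lookup; in practice this would query a GeoIP database or CDN geo header."""
    return None  # placeholder so nothing redirects until a real lookup is wired in

@app.before_request
def route_dallas_visitors():
    # Already on the Dallas subdomain: do nothing
    if request.host.startswith("dallas."):
        return None
    if city_for_ip(request.remote_addr) == "Dallas":
        # 302 keeps it a temporary, visitor-level hop rather than a permanent move
        return redirect("https://dallas.example.com" + request.path, code=302)
    return None
```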
How accurate are Google keyword estimates for local search volume?
We've all used the Google AdWords Keyword Tool, and if you're like me you use it to analyze data for a particular region. Does anyone know how accurate this data is? For example, I'd like to know how often people in Savannah, Georgia search for the word "forklift". I figure that Google can give me two kinds of data when I ask how many people in Savannah search for "forklift":
1. They might give me rough data for how many people in the region actually searched for the term "forklift" over the last 12 months, then divide by 12 to give me a monthly average.
2. Or they might use data for a much broader region and then adjust for Savannah's population size. In other words, they might say: in the US, people searched for "forklift" an average of 1,000,000 times a month. The US has a population of 300,000,000. Savannah has a population of about 250,000. 250,000 / 300,000,000 is about 0.00083, and 1,000,000 times that is about 833. So "forklift" is searched in Savannah an average of roughly 833 times a month.
Option 1 is obviously much more accurate. I suspect that option 2 is the model Google is actually using. Does anyone know with reasonable certainty which it is? Thanks,
Local Website Optimization | | aj613
Adam0 -
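For reference, the population-scaling model described in option 2 is easy to reproduce, which also makes the arithmetic easy to sanity-check. A small sketch using the numbers from the question:

```python
def scaled_local_volume(national_monthly_searches, local_population, national_population):
    """Estimate local search volume by scaling national volume by population share."""
    share = local_population / national_population
    return national_monthly_searches * share

# Numbers from the question: ~1,000,000 US searches/month for "forklift",
# US population ~300,000,000, Savannah population ~250,000
estimate = scaled_local_volume(1_000_000, 250_000, 300_000_000)
print(round(estimate))  # ~833 searches per month under this model
```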
Theory: Local Keywords are Hurting National Rankings?
I've read a good amount here and in other blog posts about strategies for national brands to rank locally as well, with local landing pages, citations, etc. I have noticed something strange that I'd like to hear if anyone else is running into, or if anyone has a definitive answer for. I'm looking at a custom business printing company whose products can be, and often are, shipped out of state, so it's a national brand. On each product page, the client is throwing in a few local keywords for the area around their office to help rank for local variations. When looking at competitors that have lower domain authority, a lower volume of linking root domains, less content on the page, and weaker standard signals, they are still ranking nationally better than the client. The only thing they're doing that could be better is bolding and throwing in the page keyword 5-10 times (which looks unnatural). But when you search for keyword + home city, the client ranks better. My hypothesis is that because the client is optimizing product pages for local keywords as well as national ones, it is actually hurting them on national searches because they're seen as a local-leaning business. Has anyone run into this before, or have a definitive answer?
Local Website Optimization | | Joe.Robison2 -
How can I optimize my pages for local areas if we are not in those areas?
Hi Mozzers! I watched a video in which Matt Cutts says that creating multiple web pages just for one keyword is an absolute no-go. So I was wondering: we serve clients in NZ, Australia, and the USA. If we target phrases like Psychic Readings California, Psychic Readings San Diego, etc. (USA); Psychic Readings Melbourne, Psychic Readings Sydney (AU); Psychic Readings Auckland, Psychic Readings Wellington (NZ), what is the best practice or right way to structure my pages to do this without going against Google's guidelines? Many thanks
Local Website Optimization | | edward-may1 -
Hosting Change & Its Impact on SERP Performance (with a Side of Domain Migration)
Hi everyone, I've read a lot on forums about hosting and its impact on SEO, but I've seen conflicting opinions. I wanted to see if anyone might have a definitive answer for this scenario: our parent company is based in the EU and wants to move our English domain to their site, either as part of the main .com or potentially as a new subdomain. One of those things is going to happen; it's just a question of which one. One issue I have is that their .com is hosted in France, with content targeting English speakers (mostly in the U.S.), so if we moved our content to their site we'd be going from our existing domain hosted in the U.S. (with the majority of visitors coming from the U.S.) to a site that's hosted in France. I've read that folders are still usually better than subdomains in terms of passing the strength of the domain on to pages. So... would it be better to have a subdomain hosted in the U.S., or folders under the main domain with that content hosted in France? Our existing domain and the domain we'll be moving to are about even in terms of domain authority and size. Happy to get any feedback you might have. Anyone come across any case studies on this particular topic that would be helpful? Thanks!
Local Website Optimization | | SafeNet_Interactive_Marketing0 -
Local SEO HELP for Franchise SAB Business
This all began when I was asked to develop experiment parameters for our content protocol & strategy. It should be simple, right? I've reviewed A/B testing tips for days now, from Moz and other sources. I'm totally amped and ready to begin testing in Google Analytics.
Say we have a restoration service franchise with over 40 franchises we perform SEO for. They are all over the US. Every franchise has their own local website (example: restorationcompanylosangeles.com). Every franchise purchases territories in which they want to rank; some service over 100 cities. Most franchises also have PPC campaigns. As part of our strategy we incorporate the location reach data from AdWords to focus on their high-reach locations first. We have 'power pages' which include 5 high-reach branch preferences (areas the owners prefer to target) and 5 non-branch-preference high-reach locations. We are working heavily on our national brand presence and working with PR and local news companies to build relationships for natural backlinks. We are developing a social media strategy for national brand outlets and local outlets. We use major aggregators to distribute local citations for our branch offices, and we make sure all NAP is consistent across all citations. We are partners with Google, so we work with them on new branches to create their Google listings (My Business & G+). We use local business schema markup on all pages. Our content protocol encompasses all the needed on-site optimization tactics: meta, titles, schema, keyword placement, semantic Q&A, internal linking strategies, etc. Our leads are calls and form submissions. We use several call tracking services to monitor calls, the caller's location, etc., and we are testing CallRail to start monitoring the landing pages and keywords that generate our leads.
Parts that I want to change:
1. Some of the local sites have over 100 pages targeted at 'water damage + city', aka what Moz would call "doorway pages." These pages have 600-1000 words all talking about the services we provide. Our writers (4 of them) manipulate them so that they aren't duplicate pages, but the only unique variable is about 100 words about the city. We pump out about 10 new local pages a month per site, so yes, over 300 local pages a month. Traffic to the local sites is very scarce.
2. Content protocol / strategy is tested based only on ranking! We have a tool that monitors ranking on all domains, but this does not account for mobile, local, or preference-based searching like Google Now. My team is deeply attached to basing our metrics solely on ranking. The logic is that if there is no local city page for a targeted location, there is less likelihood of ranking for that location, and if you are not seen you will not get traffic or leads. Ranking for power locations is poor, while less competitive low-reach locations rank OK. We update the content protocol by tweaking small things (multiple variants at a time), then check ranking every day for about a week to determine whether that experiment was a success or not.
What I need:
1. An internal duplicate content analyzer, to prove that writing over 400 pages a month about water damage + city IS duplicate content.
2. Unique content for 'power pages'. I know from dozens of chats here in the community and in Moz blogs that we can only truly create quality content for 5-10 pages, meaning we need to narrow down which locations are most important to us and beef them up.
3. Blog content for the non-'power' locations.
4. A new experiment protocol based on metrics like traffic, impressions, bounce rate, landing page analysis, domain authority, etc.
5. A deeper dig into call metrics and their sources.
Now I am at a roadblock because I cannot develop valid content experiment parameters based on ranking. I know that A/B testing requires two pages that are the same except for one variable; we'd either noindex these or canonicalize them, and neither works for testing ranking on the same term.
Questions: Are all these local pages duplicate content? Is there such a thing as content experiments based solely on ranking? Any other suggestions for this scenario?
Local Website Optimization | | MilestoneSEO_LA1 -
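On the 'internal duplicate content analyzer' item above, one common approach is word shingling plus Jaccard similarity: pages that share most of their word n-grams are near-duplicates even when a few sentences differ. A rough sketch, assuming each local page has been saved as a plain-text file in a pages/ folder; the 5-word shingle size and 0.8 threshold are illustrative, not standards:

```python
from itertools import combinations
from pathlib import Path

def shingles(text, size=5):
    """Return the set of overlapping word n-grams ("shingles") in a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + size]) for i in range(len(words) - size + 1)}

def jaccard(a, b):
    """Jaccard similarity of two shingle sets (1.0 = identical)."""
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

# Hypothetical: one plain-text dump per local page in ./pages/
pages = {p.name: shingles(p.read_text(encoding="utf-8"))
         for p in Path("pages").glob("*.txt")}

for (name_a, sh_a), (name_b, sh_b) in combinations(pages.items(), 2):
    score = jaccard(sh_a, sh_b)
    if score > 0.8:  # illustrative threshold for "near-duplicate"
        print(f"{name_a} vs {name_b}: {score:.2f}")
```

Pairs scoring near 1.0 are the strongest evidence that only the city name is changing from page to page.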
How to rank in local Google without a physical address and phone number?
Can we rank well in local Google without a physical address and phone number? If yes, how?
Local Website Optimization | | Dan_Brown10