Weird SEO Problem - No Longer Ranking in Some Areas
-
Hi Everyone, I’ve got a weird SEO issue that I hope you’ll be able to help with. I’ve broken it down into the key points below:
-
Impressions for our primary and secondary keywords dropped dramatically on 02.10.17.
-
Impressions have only dropped on non-geographical keywords. “UK” variants are still ranking well. Investigation shows we’re not ranking outside of London at all for primary and secondary keywords.
-
Primary and secondary keywords are still ranking well in London, the city where we’re based.
-
We’ve looked at our competitors, who do rank for the primary keyword both in and outside London. We noticed we have our “PostalAddress” in our schema markup; the competitors don’t have their address in their schema.
-
We updated our schema 2 weeks ago and now use the Yoast schema, which is the same markup our competitors use.
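For anyone following along, the markup being discussed is the schema.org LocalBusiness type with a nested PostalAddress. A minimal sketch of what that JSON-LD looks like, with a placeholder business name and address (the original poster's actual markup isn't shown in the thread):

```python
import json

# Hypothetical LocalBusiness JSON-LD with a nested PostalAddress, the
# structured-data element the question is about. All values are placeholders.
local_business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Company",
    "url": "https://www.example.com/",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "1 Example Street",
        "addressLocality": "London",
        "addressCountry": "GB",
    },
}

# Serialise for embedding in a <script type="application/ld+json"> tag.
json_ld = json.dumps(local_business, indent=2)
print(json_ld)
```

Removing the address, as described above, would mean dropping the "address" key while keeping the rest of the LocalBusiness markup intact.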
-
Approximately 1 week after removing the schema we started showing up for primary and secondary keywords again, but very low, fluctuating between page 15 and page 24. It’s been 2 weeks now with no improvement.
-
Ahrefs and Google Webmaster Tools both report that we rank in the top 5. That is true to a degree, but only in London.
Thank you in advance!
-
-
Good for you, Rswhtn, for trying to get your key points into a list. I'm going to agree with Andy here: this list is something you need to take to a company that does both local and organic SEO for a real audit. Trying to diagnose this without looking at your timeline, analytics, Google Search Console, competitive landscape, etc., is just going to produce random guesses as to why this has happened to your specific business. You could be dealing with filters or penalties; Google could be localizing organic results more heavily in other places, edging you out of organic rankings everywhere but your own city; or there could be a technical issue with your robots.txt. It could be so many things. The loss of traffic seems like reason enough to get a real audit going, so that no more time is lost while your revenue is impacted.
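On the robots.txt possibility mentioned above: this is one of the few items on the list you can sanity-check yourself in minutes. A quick sketch using Python's standard-library robots.txt parser, with placeholder rules and URLs (not the poster's actual site):

```python
from urllib.robotparser import RobotFileParser

# Placeholder robots.txt rules for illustration; substitute the real file's
# contents to check whether key ranking pages are accidentally blocked.
robots_txt = """\
User-agent: *
Disallow: /admin/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot should be allowed to crawl the pages you want ranking...
print(parser.can_fetch("Googlebot", "https://www.example.com/services/"))
# ...while only the intentionally disallowed paths are blocked.
print(parser.can_fetch("Googlebot", "https://www.example.com/admin/secret"))
```

If a page you expect to rank comes back blocked here, that alone could explain a sudden drop in impressions.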
-
I honestly would love to help with this, but there are just so many variables at play that it is almost impossible without an audit of some kind.
There is so much going on that anything thrown at you would be an out-and-out guess, which could send you on a wild goose chase.
However, it does sound similar to a recent client who was actually battling a penalty due to over-optimisation, a lack of unique content, similar pages, and a less-than-satisfactory user experience.
Even then, this only became apparent after looking at the site in depth.
-Andy
-
Hi, yes, this is similar to my previous question; however, I think the amount of detail I gave previously proved a little difficult to follow. I've simplified the question here and am intentionally trying to get a more generalised response based on these facts.
If anyone is able to take a look and let me know their thoughts it would be appreciated.
-
Hey There!
I see that you started a previous thread on this topic that received an answer: https://moz.com/community/q/mysterious-location-based-serp-issue
Is there something about your question that wasn't answered?