Google Places Problem
-
This may have been answered before, but I have two questions.
When I placed a business in Google Places, its "generic" organic ranking fell off the map. I now have just the one-line Google Places reference, and that is all I can find. How can I get around that and get my four-line description to show again?
Do I have to delete my Places account? Before the Google Places account was built, the company was moving up the SERP ranks; now he is on page 1 for Places, but the other SERP positions have disappeared. This is true for all the keywords we are targeting. If there is no Places reference, he shows on pages 3-5 (given the website is four weeks old, I think this is not bad).
For the same client: he services many of the surrounding communities. How do I get Google to recognize the various towns he services during a search? He places well for his "home" town but not at all for the other towns.
If it helps any, the website is www.myairstat.com.
Thanks for the help.
Scott
-
Hi Scott,
Colin is correct as to the SERP behavior you are seeing after claiming your client's Google Places/Google+ Local page. The client's previous organic rank will be subsumed into his new blended local rank. Once upon a time, it was common for dominant businesses to have more than one listing per page for a targeted keyword phrase. In fact, a truly dominant business might have one or two organic spots, a local spot, a video, directory listings for the company, and so on. Around the time of the Venice update, this changed, and it became next to impossible to find any business with more than one spot in the top 10.
So, this is something that's important to understand when working in Local: either you have an organic listing or a local one, but seldom both.
Exceptions to this: for some searches in categories or geographic regions where Google has very little data it trusts, a business will sometimes have more than one spot in the SERPs. For example, let's say your client is the only muffler repair shop serving 10 tiny towns in the country. If he's got a decent website, he might end up with more than one spot on page one, but that's typically the main case in which you'll see this exception these days.
Post-Venice, some Local SEOs did experiment with trying to earn double page-1 rankings. Frankly, I'm not sure how well these tactics are working in 2013, but you might want to give this a read:
http://www.nightlitemedia.com/2012/05/organic-and-google-places-ranking-on-page-1/
Regarding your other question, it's critical to understand that Google views your client as being relevant to his city of location, not the cities he serves. It's all about physical address. This has led to the practice of creating city landing pages with the purpose of gaining ORGANIC (not local) rankings for service-radius cities for go-to-client business models. I suggest you read this piece I published recently on the subject:
The Nitty Gritty of City Landing Pages
http://www.solaswebdesign.net/wordpress/?p=1403
Read that, and you'll be totally up to date on your client's options in this regard.
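As a supplement to city landing pages, some site owners also describe the service area in structured data on the website itself. Below is a minimal, hypothetical sketch using schema.org `LocalBusiness` markup (here the `HVACBusiness` subtype, since the client is an HVAC company) with the `areaServed` property; every business detail shown is a placeholder, not taken from this thread, and it assumes your CMS lets you add a script block to the page. Note the hedge: this markup only documents the service area for search engines. It will not, on its own, earn local-pack rankings outside the city of the physical address, for exactly the reasons discussed above.

```html
<!-- Hypothetical schema.org markup for a service-area business.
     All names, addresses, and towns below are placeholders. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "HVACBusiness",
  "name": "Example Heating & Air",
  "url": "http://www.example.com/",
  "telephone": "+1-555-555-0100",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "Hometown",
    "addressRegion": "GA",
    "postalCode": "30000"
  },
  "areaServed": [
    { "@type": "City", "name": "Hometown" },
    { "@type": "City", "name": "Neighborville" },
    { "@type": "City", "name": "Springfield" }
  ]
}
</script>
```

The `areaServed` list is where the surrounding towns go; each town would typically also get its own organic city landing page as described in the article above.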
Hope this helps!
-
Hi Scott - does the URL on your Places listing point to the same page that dropped out of the organic SERPs? This is pretty normal; Google just doesn't give both an organic and a local spot to the same page of a website. But the authority of that page is still helping you earn your Places rank.
For phrases that have any competition at all, it's nearly impossible to appear in the local pack with an address that doesn't correspond to the city of search. That's just the way Local works.