Google Places......
-
Hi guys,
Has anyone come across a problem with Google Places impacting organic search results?
I have a client that was ranking in the top 3 on Google for all local keywords (letting agent). They have set up Google Places and now appear with a nice Google Places listing, but their organic position has disappeared. Gone down to page 8 and nothing.
Any ideas?
Ta
-
We encountered a similar issue recently. We had two pages ranking on Google's front page (let's call them page A and page B). Page A was in the local bucket; page B was below the local bucket in the top 10. We updated our Google Places page to point to the URL of page B. Google moved page B into the local bucket and dropped page A altogether. However, page B did no better in the local bucket than page A.
We have since reversed this decision.
-
Hey Lawrence,
It may not be a problem at all, at least not by design. Google began merging organic information with Google Places a while back. If you notice that the title of the Place page on the results page is the same as that of your website page that ranked, this is what happened.
And when that happens, short of a few cases I have right now, it always takes the place of the organic listing, meaning you only get the one.
In some cases this is OK, as the Google Places page takes the top spot of the results page. I've also noticed an increased conversion rate, depending on the service.
-
Lawrence,
First I would ask: where are they ranking in Places - page one, number 3, etc.? (The change may not be negative, but we need more info.) Have traffic, calls, leads, etc. changed? On the Places listing, did you change anything regarding categories, address, phone, etc.? Was the organic listing targeting a different area?
More info really is needed to dissect this: URLs, target area, etc.
If you can supply those, you will get more assistance, as there are several really good local SEO pros on Moz.
Best
Edit: Lawrence, I wanted to find some substantiation before I said this, but there is a lot of weight attached to a verified listing. From Google Places for Business:
"With individual listings, once the listing is verified, your information will be trusted more than conflicting information from any other sources." I would really look at the site and see if anything conflicts with what you have in Places. I am assuming the Places listing accurately links to the site URL.
Related Questions
-
Do header tags have to be placed in top-to-bottom order?
Generally, tags are placed starting with h1, then h2, h3, and h4. Some of our pages start with an h3, and the h1 is placed after a couple of h2 and h3 tags. Is this bad placement that hurts SEO?
Algorithm Updates | vtmoz
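A quick way to see how a page's headings are actually ordered is to pull them out in document order and check which tag comes first. Here is a minimal sketch of that check (the URL is a placeholder, and the "h1 comes first" rule is just the convention described above, not anything Google documents):

```python
# Minimal sketch: list heading tags in document order and note when the first
# heading on a page is not an <h1>. The URL below is a placeholder.
from html.parser import HTMLParser
from urllib.request import urlopen

HEADING_TAGS = {"h1", "h2", "h3", "h4", "h5", "h6"}

class HeadingCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.order = []  # heading tags in the order they appear in the HTML

    def handle_starttag(self, tag, attrs):
        if tag in HEADING_TAGS:
            self.order.append(tag)

def check_heading_order(url):
    html = urlopen(url).read().decode("utf-8", errors="replace")
    collector = HeadingCollector()
    collector.feed(html)
    print(f"{url}: {' -> '.join(collector.order) or 'no headings found'}")
    if collector.order and collector.order[0] != "h1":
        print("  note: the first heading on this page is not an h1")

if __name__ == "__main__":
    check_heading_order("https://www.example.com/")  # placeholder URL
```
-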
404s in Google Search Console and JavaScript
The end of April, we made the switch from http to https and I was prepared for a surge in crawl errors while Google sorted out our site. However, I wasn't prepared for the surge in impossibly incorrect URLs and partial URLs that I've seen since then. I have learned that as Googlebot grows up, he's now attempting to read more JavaScript and will occasionally try to parse out and "read" a URL in a string of JavaScript code where no URL is actually present. So, I've "marked as fixed" hundreds of bits like /TRo39, category/cig, etc., etc. But they are also returning hundreds of otherwise correct URLs with a .html extension when our CMS system generates URLs with a .uts extension, like this: https://www.thompsoncigar.com/thumbnail/CIGARS/90-RATED-CIGARS/FULL-CIGARS/9012/c/9007/pc/8335.html
when it should be:
https://www.thompsoncigar.com/thumbnail/CIGARS/90-RATED-CIGARS/FULL-CIGARS/9012/c/9007/pc/8335.uts
Worst of all, when I look at them in GSC and check the "linked from" tab, it shows they are linked from themselves, so I can't backtrack and find a common source of the error. Is anyone else experiencing this? Got any suggestions on how to stop it from happening in the future? Last month it was 50 URLs, this month 150, so I can't keep creating redirects and hoping it goes away. Thanks for any and all suggestions!
Algorithm Updates | LizMicik
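For the .html/.uts pattern specifically, one option is to generate the redirects in bulk rather than by hand. A minimal sketch under a few assumptions (the input file name is made up, and the Apache-style `Redirect 301` output is just one possible format; adapt it to whatever your server uses):

```python
# Minimal sketch: read 404 URLs exported from Search Console (assumed to be one
# URL per line in "gsc_404s.txt") and emit Apache-style 301 redirects for URLs
# whose only problem is a .html extension where the CMS actually serves .uts.
from urllib.parse import urlparse

def build_redirects(input_path="gsc_404s.txt"):
    rules = []
    with open(input_path) as handle:
        for line in handle:
            url = line.strip()
            if not url.endswith(".html"):
                continue  # only handle the known .html -> .uts pattern
            path = urlparse(url).path
            target = path[: -len(".html")] + ".uts"
            rules.append(f"Redirect 301 {path} {target}")
    return rules

if __name__ == "__main__":
    for rule in build_redirects():
        print(rule)
```
-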
Google Adding / Manipulating Page Meta Titles?
We have a client who is experiencing some heavy Google modification of the title tags being displayed on the search engine. It is adding "- 0 Reviews" to an ecommerce site. Obviously a bad start. There were no instances of these keywords anywhere on any of these pages, header tag or otherwise (on only a handful of the affected pages there was a single commented-out image with an alt tag of "0 reviews", but it was commented out and has since been removed). We have attempted to rewrite the title multiple times, and it will modify the title but still include the non-relevant addition. Has anyone ever experienced anything like this?
Algorithm Updates | Spindle
-
Did Google just give away how Penguin works?
At SMX during the You&A with Matt Cutts, Danny asked why the algo update was called Penguin. Matt said: "We thought the codename actually might give too much info about how it works so the lead engineer got to choose." Last night Google released their 39 updates for the month of May. Among them was this: "Improvements to Penguin. [launch codename "twref2", project codename "Page Quality"] This month we rolled out a couple minor tweaks to improve signals and refresh the data used by the penguin algorithm." Whoa, codename twref2 for Penguin improvement? Is this giving us an insight about how it works? I would guess the ref2 means second refresh perhaps. But tw I am not sure about. What do you think? Is there a hidden insight here?
Algorithm Updates | DanDeceuster
-
Stop Google indexing CDN pages
Just when I thought I'd seen it all, Google hits me with another nasty surprise! I have a CDN to deliver images, JS and CSS to visitors around the world. I have no links to static HTML pages on the site, as far as I can tell, but someone else may have - perhaps a scraper site? Google has decided the static pages they were able to access through the CDN have more value than my real pages, and they seem to be slowly replacing my pages in the index with the static pages. Anyone got an idea on how to stop that? Obviously, I have no access to the static area, because it is in the CDN, so there is no way I know of that I can have a robots file there. It could be that I have to trash the CDN and change it to only allow the image directory, and maybe set up a separate CDN subdomain for content that only contains the JS and CSS? Have you seen this problem and beaten it? (Of course the next thing is Roger might look at Google results and start crawling them too, LOL) P.S. The reason I am not asking this question in the Google forums is that others have asked this question many times and nobody at Google has bothered to answer over the past 5 months, and nobody who did try gave an answer that was remotely useful. So I'm not really hopeful of anyone here having a solution either, but I expect this is my best bet because you guys are always willing to try.
Algorithm Updates | loopyal
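If the CDN allows custom response headers (that is an assumption about this particular setup, though many CDNs do), an `X-Robots-Tag: noindex` header on the HTML it serves is one common way to keep those duplicate pages out of the index without needing a robots.txt file on the CDN host. A minimal sketch, with placeholder URLs, that checks whether a few CDN URLs are actually returning that header:

```python
# Minimal sketch: check whether sample CDN URLs return an "X-Robots-Tag: noindex"
# header. The URLs below are placeholders, and whether the header can be set at
# all depends on the CDN's configuration options.
from urllib.request import Request, urlopen

SAMPLE_URLS = [
    "https://cdn.example.com/some-page.html",     # placeholder
    "https://cdn.example.com/another-page.html",  # placeholder
]

def has_noindex_header(url):
    request = Request(url, method="HEAD")
    with urlopen(request) as response:
        robots_header = response.headers.get("X-Robots-Tag", "")
    return "noindex" in robots_header.lower()

if __name__ == "__main__":
    for url in SAMPLE_URLS:
        verdict = "noindex" if has_noindex_header(url) else "no noindex header"
        print(f"{url}: {verdict}")
```
-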
Google decreased use of Meta Description Tag
Over the past month or so I have noticed that Google is not using the meta description for my pages but is instead pulling text from the actual page to show on the SERP. Is Google placing less emphasis on meta descriptions?
Algorithm Updates | PerriCline
-
How Google Determines Sitelinks
Does anyone have authoritative information on how Google determines which links to use as sitelinks? I thought I saw that Top Landing Pages was a metric Google used (in part).
Algorithm Updates | joshfialkoff-77863
-
Google said that low-quality pages on your site may affect rankings on other parts
One of my sites got hit pretty hard during the latest Google update. It lost about 30-40% of its US traffic, and the future does not look bright considering that Google plans a worldwide roll-out. The problem is, my site is a six-year-old, heavily linked, popular WordPress blog. I do not know why the algorithm believes that it is low quality. The only reason I came up with is the statement that low-quality pages on a site may affect other pages (I think it was in the Wired article). If that is so, would you recommend blocking and de-indexing WordPress tag, archive, and category pages from the Google index? Or would you suggest waiting a bit longer before doing something that drastic? Or do you have another idea of what I could do? I invite you to take a look at the site www.ghacks.net
Algorithm Updates | badabing