Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Google Places vs. a position-one ranking above the Places listings
-
Hi Guys,
Will creating a new Google Places listing for a business have any effect on their current position-one spot for their major geo-location keyword?
E.g. "restaurants perth": say they are ranking #1 above all the Places listings. If they set up a Places listing, would they lose that position and merge with all the other Places accounts?
Or would they keep that organic listing as well as the Places listing?
I have been advised it could be detrimental to set up the Places account. If this is the case, does anyone know any way around this issue? The business really needs a Places page for Google Maps etc.
I'd appreciate some guidance.
Thanks.
BC
-
I have a client where we put the specific local listing page URL (example.com/locations/phoenix/location1) in the Google Places URL field. It works out really well, as we get the home page ranking organically (depending on the query) and the specific Places result locally. Sometimes they are combined and other times they are not, but we are in the mix somewhere almost always.
-
Curious if any of you has experience pointing the Places listing to a URL other than the homepage?
I have read a few articles that reported various outcomes, some mentioning that it didn't affect their organic result, but that it was harder to rank the Places URL. Just curious about your findings!
-
Hi Bodie,
Yes, I think this is playing in the grey area. If the business owner actually wants to make his used and new car dealerships two companies with completely separate legal business names or DBAs, addresses with separate walk-in entrances, phone numbers and websites with completely unique content, then yes, you'd be talking about two different businesses, but that seems like an awful lot of real-world trouble to go to just to get a second Place page, eh? Chances are, a car dealership with both used and new cars is simply a single business with different specialties and should only be running a single website with a single Place/+ Local page.
What would happen if you went ahead with this plan anyway, without the company actually being two legally separate entities? Honestly, you might be able to get away with it for a while. Google is often not super sharp about upholding their policies, and iffy stuff can ride for a long time. But the risk is big. Should Google ever decide that they don't like what they are seeing, they could penalize or remove the listing from the index, and if there is any association at all between the two listings, they could penalize the whole profile. This isn't a risk I would take for my clients, and for a business model like the car dealership you're describing, I would not advise the hypothetical approach you are considering. Rather, I would recommend that the client build the strongest local profile he can for his business and then consider other forms of marketing, such as social media, video marketing, new content development, etc., to continue to build additional visibility.
Hope this helps!
-
Think more along the lines of a car dealership with a "new" and "used car" department. Would I be pushing it? My question to you is: how would the association be made between the pages and businesses if the new site was branded differently and had a new address and a unique, non-associated domain? The only way I can think of is if they were interlinked, but many non-associated sites are linked. Is this playing in a grey area? Thanks again
-
Hi Bodie,
My pleasure. Are you stating that you work at a large business that has more than one front entry door for clientele (like a hospital with an emergency room and a separate radiology department)? If so, then you are allowed to create more than one listing for the business under the following Google Places Quality Guideline:
Departments within businesses, universities, hospitals, and government buildings may be listed separately. These departments must be publicly distinct as entities or groups within their parent organization, and ideally will have separate phone numbers and/or customer entrances.
If this is an accurate description of your business model, then I would simply have a single website with unique landing pages for the different public offices and tie these pages to the distinct Place Pages/+ Local Page for the business. Anything that doesn't really fit the above would not be a good idea.
I would not recommend associating an identical business name with two different websites and Place Pages if it is really the same business. What Google wants is for you to make a totally realistic representation of your business on the web; not to try to appear like you are larger, more diverse, or different than you really are in real life. I know how important it is to do all you can to gain the broadest visibility, but I believe that all efforts must be founded on an authentic presentation of any business, and this appears to be Google's view, too. Hope this helps!
-
Thanks for your response. Would it be deemed black hat to set up a new site specifically for the Google Places listing if it had a strong geo-location in the URL and was attached to a different address?
I.e. the website is Hillarysrestaurant.com.au (Hillarys being the suburb). If I were to register Perthrestaurant.com.au and attach it to a different address (the restaurant takes up three blocks, numbers 6-10), I could run the real website as it always was at number 6 and set up the new site as a push site/squeeze page at number 10, using it just for Google local.
I really hope this makes sense. Thanks again for your help and SEO wisdom!
P.S. It's not a restaurant; I'm just using this as an example.
-
We have the same experience as Cody. Google Places is like ADDING another listing to the SERP. From what I understand, the Places results are supposed to rotate around, but your #1 or #2 spot should stay firm - unless you get knocked off by a competitor! We have several clients that are at #1 organically, in Google Places, and then at #4 or #5 - so it is possible to take up quite a bit of real estate on a SERP.
-
Hi BC,
Yes, you can typically expect the organic rank to be subsumed into the Places rank if you create a Google Places/+ Local page for the client. This is a very common outcome, and it remains uncommon, though not impossible, for businesses to have more than one result per SERP.
-
I work with around 50 companies, and that's typically what I see. My #1 listing will just get changed to a Places listing, but it will still be in the #1 position.
-
In my experience, I had a client with positioning like yours. We created the Places account and it just went into the local/maps results. The good news was that the SERP didn't contain any other organic listings at the top. If you have prominent and consistent rankings and are confident in your strategy, then you might not need to create a Places account. Just be aware that moving down one spot could really mean 8 or 9 spots in terms of SERP real estate: moving down to #2 organically could mean being below the entire local results. You will need to judge the risks and rewards. Hope that helps.
Related Questions
-
Should I Report an SEO Agency to Google?
Our competitor has employed the services of a spammy SEO agency that sends spammy links to our site. Though our rankings were affected, we have taken the necessary steps. Is it possible to send evidence to Google so that they can take down the site? I want to take this action so that other sites will not be affected by them again.
White Hat / Black Hat SEO | Halmblogmusic
-
How good/bad are exit-intent pop-ups? What is Google's perspective?
Hi all, We have launched exit-intent pop-ups on our website: a pop-up appears when the visitor is about to leave. It triggers when the mouse moves into the top section of the window, an apparent attempt by the visitor to close the window. We have seen a slight ranking drop since the pop-up launched. As the pop-up appears just before someone leaves the website, is this making Google see it as if the user left because of the pop-up, and penalizing us? What are your thoughts and suggestions on this? Thanks
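For context, the trigger described above is typically implemented by watching for the pointer leaving the page through the top of the viewport. A minimal sketch, assuming a hypothetical popup element id:

```typescript
// Sketch of a typical exit-intent trigger: fire once when the pointer
// leaves the page through the top of the viewport, which usually means
// the user is reaching for the tab bar, URL bar, or close button.
let popupShown = false;

document.addEventListener("mouseout", (event: MouseEvent) => {
  // relatedTarget is null when the pointer has left the document entirely.
  if (popupShown || event.relatedTarget !== null) return;
  if (event.clientY >= 10) return; // only react to exits near the top edge

  popupShown = true;
  const popup = document.getElementById("exit-intent-popup"); // hypothetical id
  if (popup) popup.style.display = "block";
});
```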
White Hat / Black Hat SEO | vtmoz
-
Do Ghost Traffic/Spam Referrals factor into rankings, or do they just affect the CTR and Bounce Rate in Analytics?
So, by now I'm sure everyone that pays attention to their Analytics/GWT (or Search Console, now) has seen spam referral traffic and ghost traffic showing up (ilovevitaly.com, simple-share-buttons.com, semalt.com, etc.). Here is my question: does this factor into rankings in any way? We all know that click-through rate and bounce rate (might) send signals to the algorithm and signal a low-quality site, which could affect rankings. I guess what I'm asking is: is the algorithm getting any of that data from Analytics? Since ghost referral traffic never actually visits my site, how could it affect the CTR or bounce rate that the algorithm is seeing? I'm hoping that it only affects my bounce rate/CTR in Analytics, and that I can just filter that stuff out with filters in Analytics and it won't ever affect my rankings. But since we don't know exactly where the algorithm pulls data on CTR and bounce rate, I'm worried that the large amount of spam/ghost traffic I see in Analytics could be harming my rankings. Sorry, long-winded way of saying: should I pay attention to this traffic? Should I care about it? Will it harm my site or my rankings at all? And finally, when is Google going to shut these open back doors in Analytics so that Vitaly and his ilk are shut down forever?
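For what it's worth, the Analytics-side cleanup usually described for this is a view filter that only includes hits carrying a valid hostname, since ghost hits are injected without ever loading a page. A sketch of the underlying idea, with a hypothetical domain:

```typescript
// Sketch of the "valid hostname" idea behind an Analytics include filter.
// Ghost hits are injected directly (they never load your pages), so they
// usually carry a bogus or empty hostname; keeping only hits whose
// hostname matches your real domain screens most of them out of reports.
const validHostname = /^(www\.)?example\.com$/; // hypothetical domain

interface Hit {
  hostname: string;
  source: string;
}

function isLikelyGhostHit(hit: Hit): boolean {
  return !validHostname.test(hit.hostname);
}

// The spam referrers named above would fail the hostname check:
console.log(isLikelyGhostHit({ hostname: "ilovevitaly.com", source: "ilovevitaly.com" })); // true
console.log(isLikelyGhostHit({ hostname: "www.example.com", source: "google" }));          // false
```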
White Hat / Black Hat SEO | seequs
-
One page with multiple sections - unique URL for each section
Hi All, This is my first time posting to the Moz community, so forgive me if I make any silly mistakes. A little background: I run a website for a company that makes custom parts out of specialty materials. One of my strategies is to create high-quality content about all areas of these specialty materials to attract potential customers - pretty straightforward stuff.

I have always struggled with how to structure my content. From a usability point of view, I like having just one page for each material, with different subsections covering different topical areas. Example: for a special metal material I would have one page with subsections about the mechanical properties, thermal properties, available types, common applications, etc. Basically how Wikipedia organizes its content. I do not have a large amount of content for each section, but as a whole it makes one nice cohesive page for each material. I do use H tags to mark the specific sections on the page, but I am wondering if it may be better to have one page dedicated to the material properties, one page dedicated to applications, and one page dedicated to available types. What are the community's thoughts on this? As a user of the website, I would rather have all of the information on a single, well-organized page for each material. But what do SEO best practices have to say about this?

My last thought would be to create a hybrid website (I don't know the proper term). Have a look at these examples from Time and Quartz. When you are viewing an article, the URL is unique to that page. However, when you scroll to the bottom of the article, you can keep on scrolling into the next article, with a new unique URL - all without clicking through to another page. I could see this technique being ideal for a good web experience while still allowing me to optimize my content for more specific topics/keywords. If I used this technique with the canonical tag, would I then get the best of both worlds? Let me know your thoughts! Thank you for the help!
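For reference, the continuous-scroll pattern the Time and Quartz examples use is typically built on the History API: each article keeps its own URL (and serves its own canonical tag when fetched directly), and the address bar is swapped as the next article scrolls into view. A rough sketch, with hypothetical markup (each article element carrying a data-url attribute):

```typescript
// Rough sketch of the continuous-scroll pattern: when the next article
// scrolls into view, swap the address bar to that article's own URL
// without a page load. Each URL, fetched directly, should return a
// full page carrying its own <link rel="canonical">.
const articles = document.querySelectorAll<HTMLElement>("article[data-url]");

const observer = new IntersectionObserver(
  (entries) => {
    for (const entry of entries) {
      if (!entry.isIntersecting) continue;
      const url = entry.target.getAttribute("data-url"); // e.g. "/materials/thermal-properties"
      if (url && url !== location.pathname) {
        history.replaceState(null, "", url);
      }
    }
  },
  { threshold: 0.5 } // treat an article as "current" once half of it is visible
);

articles.forEach((article) => observer.observe(article));
```

Whether a canonical tag then helps depends on what it points at: a self-referencing canonical on each article URL preserves the per-topic targeting, while pointing every section at one combined URL would consolidate them back into a single page.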
White Hat / Black Hat SEO | jaspercurry
-
How do you change the 6 links under your website in Google?
Hello everyone, I have no idea how to ask this question, so I'm going to give it a shot and hopefully someone can help me! My company is called Eteach, so when you type Eteach into Google, we come up in the top position (phew!), but there are six links that appear underneath it (I've added a picture to show what I mean). How do you change these links? I don't even know what to call them, so if there is a particular name for them then please let me know! They seem to be an organic result rather than PPC... but if I'm wrong then do correct me! Thanks!
White Hat / Black Hat SEO | Eteach_Marketing
-
Rel Noindex Nofollow tag vs meta noindex nofollow
Hi Mozzers, I have something I was pondering this morning and would love to hear your opinion on it. We had a bit of an issue on our client's website at the beginning of the year. I tried to find a way around it by using wildcards in my robots.txt, but because different search engines treat wildcards differently it didn't work out so well, and only some search engines understood what I was trying to do.

So here goes: a large number of URLs on the website carry a ?filter parameter pushed from the database. We use filters on the site so users can narrow down content and find what they are looking for more easily, which results in database-driven ?filter URLs (those ugly parameter URLs we all hate so much).

What we are looking to do is implement nofollow noindex on all the internal links pointing to the ?filter parameter URLs. However, my SEO sense is telling me that the noindex,nofollow should rather go in the meta robots of the individual ?filter parameter URLs, instead of on all the internal links pointing to them. Am I right in thinking this way? (The reason we want to put it on the internal links at the moment is that the development company states they don't have control over the metadata of these database-driven parameter URLs.) If I am not mistaken, noindex nofollow on the internal links could be seen as PageRank sculpting, whereas on-page meta robots noindex,nofollow is more of a command, like your robots.txt. Has anyone tested this before, or does anyone have more knowledge on this small detail of noindex/nofollow?

PS: canonical tags are also not doable at this point because we are still in the process of cleaning out all the parameter URLs, so roughly 70% of the URLs don't yet have an SEO-friendly URL to be canonicalized to. Would love to hear your thoughts on this. Thanks, Chris Captivate.
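For reference, the two placements being weighed here look roughly like this (the ?filter URL follows the poster's example; treat the markup as a sketch):

```html
<!-- Option A (what the developers can ship now): a nofollow hint on each
     internal link pointing at a filter URL. This only affects how the link
     itself is treated; the target URL can still be crawled and indexed if
     anything else links to it. -->
<a href="/products?filter=red" rel="nofollow">Red products</a>

<!-- Option B (the poster's instinct): a robots meta tag in the head of each
     ?filter page itself. This is a directive on the page, so crawlers that
     fetch the page will drop it from the index however it was linked. -->
<meta name="robots" content="noindex, nofollow">
```

One caveat worth noting: if a URL is blocked in robots.txt, crawlers can never fetch it to see a meta noindex, so the robots.txt and meta-tag approaches shouldn't be combined on the same URLs.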
White Hat / Black Hat SEO | DROIDSTERS
-
Negative SEO - Case Studies Prove Results. De-rank your competitors
Reading these two articles made me feel sick. People are actually offering a service to de-rank a website. I could have sworn I heard Matt Cutts say this was not possible, but the results are in. This really opens up a whole new can of worms for Google. http://trafficplanet.com/topic/2369-case-study-negative-seo-results/ http://trafficplanet.com/topic/2372-successful-negative-seo-case-study/ This is only going to get worse, as news like this will spread like wildfire. In one sense, it's good these people have done this to prove it to Google; it's just a pity they did it on real businesses that rely on traffic.
White Hat / Black Hat SEO | dean1986
-
Interesting case of IP-wide Google Penalty, what is the most likely cause?
Dear SEOmoz Community, our portfolio of around 15 internationalized websites received a significant, seemingly IP-wide, Google penalty starting November 2010 and has yet to recover from it. We have undertaken many measures to lift the penalty, including reconsideration requests without luck, and are now hoping the SEOmoz community can give us some further tips. We are very interested in the community's help and judgement on what else we can try to lift the penalty.

As quick background information:

- The sites in question offer sports results data and are translated into several languages. Each market (equals language) has its own TLD domain using the central keyword, e.g. <keyword_spanish>.es, <keyword_german>.de, <keyword_us>.com
- The content is highly targeted to each market, which means there are no duplicate content pages across the domains: all copy is translated, content reprioritized, etc. However, the core results content in the body of the pages obviously needs to stay about 80% the same.
- An SEO agency of ours used semi-automated link-building tools in mid-2010 to acquire link partnerships.
- There are some promotional one-way links to sports-betting and casino sites positioned on the pages.
- The external linking structure of the pages is very keyword- and main-page-focused, i.e. 90% of the external links point to the front page with one particular keyword.
- All sites have strong domain authority and have been running under the same owner for over 5 years.

As mentioned, we have experienced dramatic ranking losses across all our properties starting in November 2010. The penalties are indisputable, given that rankings for the main keywords dropped in local Google search engines from position 3 to position 350 after the sites had ranked in the top 10 for over 5 years. A screenshot of the ranking history for one particular domain was attached; the same behavior can be observed across domains.

Our questions are:

1. Is there something like an IP-specific Google penalty that can apply to web properties across an IP, or can we assume Google just picked all the pages registered in Google Webmaster Tools?
2. What is the most likely cause of our penalty given the background information? Since the drops started already in November 2010, we doubt that the Panda updates had any correlation with this issue.
3. What are the best ways to resolve our issues at this point? We have significant historical data available, such as tracking records. Our actions so far have been reducing external links, on-page links, and C-class internal links.
4. Are there any other factors/metrics we should look at to help troubleshoot the penalties?
5. After all this time without resolution, should we move on to new domains and forward all content as 301s to the new pages? Or are there other things we should try first?

Any help is greatly appreciated. SEOmoz rocks. /T
White Hat / Black Hat SEO | tomypro