How to optimize a Google Places listing?
-
I have a couple of friends who need more traffic from their Google Places listings. They are in very niche businesses: one is a home remodeling contractor and the other is a music teacher.
Does anyone know of good services to help optimize their listings? I feel I've helped them as much as I can by getting real reviews onto their Google Places pages, but someone with experience in Google Places optimization might pick up on something I'm overlooking.
-
Hi RicktheMarkt,
The main components of a strong Google+ Local page are:

- There are no violations of the Google Places Quality Guidelines
- You've chosen the best possible categories
- All relevant fields have been filled out with correct information
- You are earning reviews and, if the page is fully merged, adding posts
Beyond the existence and correctness of the Google+ Local page, there are many, many more factors that contribute to local rankings. These include, but are not limited to:

- The authority of the website
- The quality, quantity, and accuracy of citations on third-party sites (local business indexes, directories, etc.)
- Traditional SEO metrics like age of site and the quality and quantity of links
- Social media participation

In order to begin getting a handle on all of the components that contribute to local search rankings, I recommend that you and your friends study the Local Search Ranking Factors 2013 survey. There is no better place to get the big picture of all the contributing factors.
Now, all this being said, I believe that your friends may need to adjust their stated goals. You write:
"I have a couple friends who need more traffic from their Google Places listing."

It's very important to understand that there was recently a big change in how Google displays the Places/+Local pages. In the past, a single click from the main SERPs would take you directly to a business's +Local page. This is no longer the case. Now, if a company's listing shows up, there will most commonly be a 'review' link attached to it. In the past, this link would have taken a user to the +Local page for the business. These days, it only opens a popup showing the reviews. In other words, it is now much less likely that the average user will ever make it to a +Local page.
So, while the accuracy and completeness of a +Local page still play a vital role in rankings, the goal of getting traffic to the page may not be viable, given the fact that the pages have, in essence, been somewhat buried by Google. An odd turn of events, but there it is. It could change tomorrow.
In sum, your friends should formulate a more appropriate goal, such as earning higher rankings, achieving broader brand visibility, generating better conversions, etc.
I hope this is helpful!
-
You can go through the listing and check whether the description is optimized for the keyword terms they are going after. Make sure the categories are correct, and make sure the listing reaches a 100% completeness score. Carefully note the Name, Address, Phone, Email, and Website, and create listings in Yelp, Bing Local, Citysearch, Yahoo Local, Foursquare, and other relevant directories. For Google local rankings, having a consistent NAP (Name, Address, Phone) across the board will really help. And just as you have done with Google Places, encourage reviews on Yelp, Citysearch, Bing, Yahoo, etc.
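The NAP-consistency idea above can be sketched programmatically. This is a minimal, hypothetical example: the directory names and listing values below are made up for illustration, and a real audit would gather the data by hand or from each directory's own tools. It simply normalizes each field and flags any directory whose value differs from the most common one:

```python
from collections import Counter


def normalize(value: str) -> str:
    """Lowercase and strip punctuation/whitespace so purely cosmetic
    differences (case, dashes, spacing) don't count as mismatches."""
    return "".join(ch for ch in value.lower() if ch.isalnum())


def nap_mismatches(listings: dict) -> dict:
    """For each NAP field, return the set of directories whose value
    differs from the most common (treated here as canonical) value."""
    mismatches = {}
    for field in ("name", "address", "phone"):
        values = {site: normalize(data[field]) for site, data in listings.items()}
        canonical, _ = Counter(values.values()).most_common(1)[0]
        bad = {site for site, v in values.items() if v != canonical}
        if bad:
            mismatches[field] = bad
    return mismatches


# Hypothetical listing data for one business across three directories.
listings = {
    "google": {"name": "Smith Remodeling", "address": "12 Main St", "phone": "555-0100"},
    "yelp":   {"name": "Smith Remodeling", "address": "12 Main Street", "phone": "555-0100"},
    "bing":   {"name": "Smith Remodeling", "address": "12 Main St", "phone": "555-0199"},
}
```

Note that normalization only absorbs case and punctuation differences; abbreviation differences ("St" vs. "Street") still flag as mismatches, which is exactly the kind of inconsistency you want to find and standardize across directories.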
Hope this helps!