What are the biggest optimization factors for Google Places?
-
I know some of the basic factors for ranking better in Google Places, but I'm looking to see where the priority lies, and whether there are negative factors.
-
Here's probably the best information you're going to find on this:
http://www.seomoz.org/blog/google-places-seo-lessons-learned-from-rank-correlation-data
Related Questions
-
What is the best way to eliminate ghost traffic from Google Analytics?
Hey Mozzers, I just wanted to see how you all deal with eliminating ghost traffic sources from Google Analytics. I tried setting up a RegEx 'include' list before, but it seemed as though I was blocking potential traffic sources when I did (I'm probably missing something here). Anyway, I'm interested to read how you all have dealt with this issue in the past. Thanks for reading!
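One common cause of blocked legitimate traffic is an unanchored include pattern. A minimal sketch of the idea behind a hostname include filter, using regular expressions anchored to your own hostnames (`example.com` and `pay.example.com` are placeholders — substitute the hostnames that actually serve your tracking code):

```python
import re

# Anchored whitelist of hostnames that legitimately serve your tracking code.
# Ghost hits fake the hostname (or leave it "(not set)"), so they fail this test.
VALID_HOSTNAME = re.compile(r"^(www\.)?example\.com$|^pay\.example\.com$")

def is_ghost(hostname: str) -> bool:
    """Return True if the hit's hostname does not match any real property."""
    return not VALID_HOSTNAME.match(hostname)

hits = ["www.example.com", "free-share-buttons.com", "(not set)", "pay.example.com"]
ghosts = [h for h in hits if is_ghost(h)]
```

The key design point is the `^...$` anchoring: a pattern like `example\.com` without anchors would also match `example.com.spamdomain.net`, while an over-broad pattern is what tends to exclude real traffic.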
White Hat / Black Hat SEO | maxcarnage
-
Google Panda and Penguin "Recovery"
We're working with a client who had been hit by Google Panda (duplicate content, copyright infringement) and Google Penguin (poor backlinks). While it has taken a lot of time, effort and patience to eradicate these issues, it's still been more than 6 months without any improvement. Have you experienced longer recovery periods? I've seen sites perform every black hat technique under the sun and, nearly 2 years later, still no recovery! In addition, many companies I've spoken to advised their clients to start again from the very beginning with a new domain, site, etc.
White Hat / Black Hat SEO | GaryVictory
-
80% of traffic lost overnight — Google penalty?
Hi all.
I have a website called Hemjakt (http://www.hemjakt.se/), which is a search engine for real estate, currently only available on the Swedish market. The application crawls real estate websites and collects all estates in a single searchable application. The site was released a few months ago and has seen steady growth since release, increasing by 20% weekly up to ~900 visitors per day. Three days ago, overnight, I lost 80% of my traffic. Instead of 900 visitors per day I'm at ~100 visitors per day, and when I search for long, specific queries such as "Åsgatan 15, Villa 12 rum i Alsike, Knivsta" (address, house type, rooms, area, city), I'm now only found on the fifth page. I suspect that I have become the subject of a Google penalty. How do I get out of this mess?
Just like all search engines or applications, I do crawl other websites and scrape their content. My content is ~90% unique from the source material, and I do add user value by giving visitors the ability to compare houses, tons more data to compare pricing and history, extra functionality that the source sites do not offer, and so on. My analytics data show good user engagement. Here is one example of a source page and a page on my site:
Source: http://www.hemnet.se/bostad/villa-12rum-alsike-knivsta-kommun-asgatan-15-6200964
My site: http://www.hemjakt.se/bostad/55860-asgatan-15/
So:
- How do I actually confirm that this is the reason I lost my traffic? When I search for my branded query, I still get results, and I'm still indexed by Google.
- If I am penalized: I'm not attempting anything black hat, and I really believe the app gives a lot of value to users. What tweaks or changes to the application do you suggest, so that I can continue running the service in a way that Google is fine with?
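One way to sanity-check the "~90% unique" claim page by page is to measure word-shingle overlap between your page text and the source page text. This is only a rough sketch of the idea — it is not how Google detects duplicates, and the sample strings below are made up for illustration:

```python
def shingles(text: str, n: int = 3) -> set:
    """Break text into overlapping n-word shingles."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap(source: str, mine: str) -> float:
    """Fraction of my page's shingles that also appear in the source (0.0-1.0)."""
    s, m = shingles(source), shingles(mine)
    return len(s & m) / len(m) if m else 0.0

src = "villa 12 rum i alsike knivsta asgatan 15 stor tomt"
mine = "asgatan 15 villa 12 rum i alsike knivsta jamfor priser och historik"
score = overlap(src, mine)  # lower means more unique relative to the source
```

Running this over every listing page would at least flag the pages whose text is closest to the source, which are the most plausible Panda triggers.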
Creating pages as exact-match URLs — good, or an over-optimization indicator?
We all know that exact-match domains are not getting the same results in the SERPs with the algo changes Google has been pushing through. Does anyone have any experience, or know whether that also applies to having an exact-match URL page (not domain)? Example:
Keyword: cars that start with A
Which way is better when creating your pages on a non-exact-match domain site: www.sample.com/cars-that-start-with-a/, which has "cars that start with A" as the heading, or www.sample.com/starts-with-a/, which again has "cars that start with A" as the heading?
Keep in mind that you'll add more pages that start the exact same way, as you want to cover all the letters in the alphabet. So:
www.sample.com/cars-that-start-with-a/
www.sample.com/cars-that-start-with-b/
www.sample.com/cars-that-start-with-c/
or
www.sample.com/starts-with-a/
www.sample.com/starts-with-b/
www.sample.com/starts-with-c/
Hope someone here at the Moz community can help out. Thanks so much.
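Whichever pattern you choose, generating the slugs programmatically keeps all 26 pages consistent (note the mixed-case `/cars-that-start-with-C/` in the question — inconsistent casing can create duplicate-URL variants). A quick sketch using the hypothetical sample.com structure from the question:

```python
import string

def slug(letter: str, long_form: bool = True) -> str:
    """Build the page URL for one letter, in either candidate format."""
    path = f"cars-that-start-with-{letter}" if long_form else f"starts-with-{letter}"
    return f"www.sample.com/{path.lower()}/"  # lowercase to avoid case variants

long_urls = [slug(c) for c in string.ascii_lowercase]
short_urls = [slug(c, long_form=False) for c in string.ascii_lowercase]
```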
How do I know what links are bad enough for the Google disavow tool?
I am currently working for a client whose backlink profile is questionable. The issue I am having is: does Google feel the same way about it as I do? We have no current warnings, but we have had one in the past for "unnatural inbound links". We removed the links that we felt were being referred to and have not received any further warnings, nor have we noticed any significant drop in traffic or rankings at any point. My concern is that if I work towards getting the more ominous-looking links removed (directories, reciprocal links from irrelevant sites, etc.), either manually or with the disavow tool, how can I be sure that I am not removing links that are in fact helping our campaign? Are we likely to suffer from the next Penguin update if we choose to proceed without removing the aforementioned links? Or is Google only likely to target the seriously black-hat links (link farms, etc.)? Any thoughts or experiences would be greatly appreciated.
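If you do end up disavowing, the file format itself is simple: a plain-text file with one URL or `domain:` directive per line, and `#` for comments. A hedged example — all domains below are placeholders, not real recommendations:

```text
# Directory links we asked to have removed; no response from the webmaster.
domain:spammy-directory.example
domain:link-farm.example
# Disavow a single bad page rather than the whole site:
http://www.reciprocal-links.example/partners.html
```

The `domain:` form covers every link from that host, which is usually what you want for directories and link farms; individual URLs are for cases where the rest of the site's links are worth keeping.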
White Hat / Black Hat SEO | BallyhooLtd
-
Can Using Google Analytics Make You More Prone to Deindexation?
Hi, I'm aggressively building links for my clients using blog posts, and I've come across information suggesting that using Google Analytics (as well as GWT, etc.) may increase my chance of deindexation. Does anyone have any thoughts on this topic? I'm considering using Piwik as an alternative if this is the case. Thanks for your thoughts, Donna
White Hat / Black Hat SEO | WebMarketingHUB
-
Has anyone seen this kind of Google cache spam before?
Has anyone seen this kind of 'hack'? When looking at a site recently, I found the Google cache version (from 28 Oct) strewn with mentions of all sorts of dodgy-looking pharma products, but the site itself looked fine. The site itself is www.istc.org.uk. Looking in the source of the pages, you can see the home page contains:
Browsing as Googlebot showed me an empty page (though msnbot etc. returned a 'normal', non-pharma page). As a mildly amusing aside: when I tried to tell the ISTC about this, the person answering the phone clearly didn't believe me and couldn't get me off the line fast enough! Needless to say, they haven't fixed it a week after being told.
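A rough way to check for this kind of user-agent cloaking yourself is to fetch the same page with a Googlebot user-agent string and a normal browser string, then diff the words. This is only a sketch — sophisticated cloakers key off Googlebot's IP ranges rather than the user-agent header, so a clean diff here doesn't prove the site is unaffected:

```python
import urllib.request

GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")
BROWSER_UA = "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36"

def fetch(url: str, user_agent: str) -> str:
    """Fetch a page body while presenting the given user-agent string."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

def suspicious_words(body_as_bot: str, body_as_human: str) -> set:
    """Words served only to the bot -- injected pharma spam shows up here."""
    return set(body_as_bot.lower().split()) - set(body_as_human.lower().split())
```

Usage would be `suspicious_words(fetch(url, GOOGLEBOT_UA), fetch(url, BROWSER_UA))`; a non-empty result full of pharma terms is a strong sign of cloaked injection.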
White Hat / Black Hat SEO | JaspalX
-
Donations, ethics and Google
Hi. What are your thoughts on donating to get a link? The site is a forum related to developing and programming. It has a PA of 65 and a DA of 87, which makes this pretty tempting to me; for my keywords and rankings, I know this would mean a lot. Is this okay to do, borderline/greyish, or is this something I should refrain from doing? I haven't been 'donating' before. Any thoughts are welcome.
White Hat / Black Hat SEO | danlae