Google Places
-
My client offers training from many locations within the UK.
These locations/venues are not owned by my client; however, I see no problem with setting up a separate listing for each location in Google Places.
At the end of the day, if a user searches for "Training London", they are looking for somewhere in their local area where they can book a course. As my client has a "venue" there, I think there is a good argument that the listing would be valid.
What are your thoughts?
-
The fact they don't "own" the location doesn't matter. Many small businesses don't "own" their locations; they lease them. I'll bet the client in this case leases space to hold their training classes. It would be appropriate to have a Places listing for each location. In the addresses, they can create arbitrary suite numbers to indicate that they may not be the ONLY business in that "place."
-
Nice trick
-
This is something that interests me as well. One of my sites has a very similar setup to yours, and I have considered doing the same (submitting all of the venues to Google Places with the company name and head-office phone number).
I have refrained from doing this so far, though, and my reasoning is as follows. If the venue (in your case, a training location) is already registered, will Google mind? Can you have multiple businesses registered at one address?
The second reason I've not done it is that it feels a little spammy. The business doesn't necessarily own the venues (training locations), so why should it be listed for them?
I wonder how this works for serviced/shared offices?
-
They would use a head-office telephone number, the same for each listing.
I have seen other companies with multiple listings sharing the same telephone number, so I am presuming that Google allows this.
-
Does your client have a specific phone number for each of these places? If not, I'm not sure you can register a place for each of their "venues".