"Hot Desk" type office space to establish addresses in multiple locations
-
Hello Mozzers,
I'm noticing increasing numbers of clients' competitors getting physical addresses and phone numbers in multiple locations, no doubt partly for SEO purposes. These are little more than ghost presences (in hot desk style office space) and the phone numbers are simply diverted.
Do such physical addresses put them at an SEO advantage (over and above those who don't have hot-desk-style space and location phone numbers)? Or does Google weed out hot-desk-type office spaces where it can?
Your thoughts/experience would be very welcome!
Thanks in advance, Luke
-
You're very welcome, Luke!
-
Thanks Miriam - that's incredibly useful - much appreciated
-
Hey Don,
Thank you. Yes, this has become a pretty hot topic again since Google's revision of their guidelines.
-
Thanks for the reply Miriam. I was interested to see where this topic went.
Nice links!
-
Hey Luke!
Good topic - here's the scoop:
-
Yes, if the owner gets away with it, this tactic could help them rank in additional geographies, BUT ...
-
It puts them at risk of a Google takedown of the listings in question, and could potentially influence Google's view of the whole business. I'd suggest you read:
http://blumenthals.com/blog/2014/12/11/google-some-days-i-just-shake-my-head-today-is-one-of-them/
If you believe some of your competitors are falsely claiming to have staffed, physical offices, you may report them as spam.
Hope this helps!
-
-
The advantage is solely the local SEO benefit. Google would probably like to weed them out, but since satellite offices are actually used by businesses regularly, it can be difficult. However, I don't see it being much of a benefit unless you are a service-area business, like plumbing or general contracting.
Related Questions
-
What is the "Homepage" for an International Website With Multiple Languages?
BACKGROUND: We are developing a new multi-language website that is going to have:
1. Multiple directories for various languages: /en-us, /de, etc.
2. Hreflang tags
3. Universal footer links so the user can select their preferred language.
4. Automatic JS detection of location on the homepage only, so that when the user lands on /, it redirects them to the correct location. Currently, the auto JS detection only happens on /, and on no other pages of the website. The user can also always override the auto-detection on the homepage at any time, using the language-selector links at the bottom.
QUESTION: Should we try to place a 301 on / to point to /en-us? Someone recommended this to us, but my thinking is "NO" - we do NOT want to 301 /. Instead, I feel we should allow Google access to /, because that is also the most authoritative page on the website and where all incoming links are pointing. In most cases, users / journalists / publications IMHO are just going to link to /, not dilly-dally around with the language directory. My hunch is to keep / as is, but also work to help Google understand the relationship between all of the different language-specific directories. I know that Google officially doesn't advocate meta refresh redirects, but this only happens on the homepage, and we likewise allow the user to override it at any time (and again, universal footer links will point both search engines and users to all other locations). Thoughts? Thanks for any tips/feedback!
Intermediate & Advanced SEO | mirabile
-
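For a setup like this, the usual alternative to 301ing / is to leave the root as the x-default in the hreflang annotations. A minimal sketch, assuming the /en-us and /de directories from the question (the domain and exact URLs are illustrative placeholders):

```html
<!-- Hypothetical <head> markup for the /en-us page; example.com is a placeholder. -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/en-us/" />
<link rel="alternate" hreflang="de" href="https://www.example.com/de/" />
<!-- x-default marks the auto-detecting root as the fallback for unmatched locales. -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```

Each language version would carry the same set of annotations, including a self-referencing one, so Google can map the relationship between all the directories without / being redirected.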
Dilemma about "images" folder in robots.txt
Hi, hope you're doing well. I am sure you guys are aware that Google has updated their webmaster technical guidelines, saying that users should allow access to their CSS and JavaScript files where possible. It used to be that Google would render web pages as text only; now it claims it can read the CSS and JavaScript. By their own terms, not allowing access to CSS files can result in sub-optimal rankings: "Disallowing crawling of Javascript or CSS files in your site's robots.txt directly harms how well our algorithms render and index your content and can result in suboptimal rankings." (http://googlewebmastercentral.blogspot.com/2014/10/updating-our-technical-webmaster.html)
We have allowed access to our CSS files, and Googlebot now sees our web pages more like a normal user would (tested it in GWT). Anyhow, this is my dilemma, and I am sure a lot of other users face the same situation. Like any other e-commerce company/website, we have a lot of images. Our CSS files used to be inside our images folder, so I have allowed access to that. Here's the robots.txt: http://www.modbargains.com/robots.txt
Right now we are blocking the images folder, as it is very large, very heavy, and some of the images are very high-res. We block it because we feel that Googlebot might spend almost all of its time trying to crawl that "images" folder alone and might not have enough time to crawl other important pages - not to mention a very heavy server load, on Google's side and ours. We do have good, high-quality, original pictures, and we feel we are losing potential rankings by blocking images. I was thinking of allowing ONLY the Google image bot access to it, but I still worry Google might spend a lot of time doing that. I was wondering whether Google makes a decision like "let me spend 10 minutes on the Google image bot, 20 minutes on the Google mobile bot," etc., or whether it has separate "time spending" allocations for all of its bot types. I want to unblock the images folder, for now only for the Google image bot, but at the same time I fear it might drastically hamper indexing of our important pages, as I mentioned before, because of having tons of images and Google spending enough time already just to crawl that folder. Any advice? Recommendations? Suggestions? Technical guidance? Plan of action? I'm pretty sure I answered my own question, but I need confirmation from an expert that I am right in allowing only Google image access to my images folder. Sincerely, Shaleen Shah
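For what the question proposes - letting only Google's image crawler into the folder - robots.txt group matching can do it, since a crawler obeys only the most specific user-agent group that matches it. A hedged sketch (the paths are illustrative, not taken from the live file):

```text
# All other crawlers: the images folder stays blocked.
User-agent: *
Disallow: /images/

# Googlebot-Image matches this more specific group instead of the * group;
# an empty Disallow means everything is allowed for it.
User-agent: Googlebot-Image
Disallow:
```

Because Googlebot-Image ignores the * group once it finds its own, this allows image crawling without opening the folder to every bot.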
Intermediate & Advanced SEO | Modbargains
-
Interlinking sites in multiple languages
I am working on a project where the client has a main .com site and the following additional sites, which are all interlinked:
.com site targeting the US
.com site targeting China
.hk site targeting Hong Kong
All sites contain similar information (although the Chinese site is translated). They are not identical copies, but being shopping sites they contain a lot of similar product information. Webmeup software (now defunct) showed that the inbound links to the main site from the additional domains are considered risky; Linkrisk shows them as neutral. The client wants them interlinked and would not want to remove the additional domains, as they get a good amount of traffic. In addition, the messages and products for each country domain have been tailored to a degree to suit that audience. We can rewrite the content on the other domains, but obviously this is a big job. Can anyone advise whether this could be causing a problem SEO-wise and, if so, whether the best way to resolve it is to rewrite the content on the US and Hong Kong sites? Alternatively, would it be better to integrate the whole lot together? (They will soon be rebuilding the main site, so it would be an appropriate time to do this.)
Intermediate & Advanced SEO | rachelmanning888
-
"noindex, follow" or "robots.txt" for thin content pages
Does anyone have any testing evidence of which is better to use for pages with thin content that are nevertheless important to keep on a website? I am referring to content shared across multiple websites (such as e-commerce, real estate, etc.). Imagine a website with 300 high-quality pages indexed and 5,000 thin product-type pages - pages that would not generate relevant search traffic. The question: does the interlinking value achieved by "noindex, follow" outweigh the negative of Google having to crawl all those "noindex" pages? With robots.txt, Google's crawling focuses on just the important pages that are indexed, which may give rankings a boost. Any experiments with insight into this would be great. I do get the story about "make the pages unique", "get customer reviews and comments", etc., but the above is the important question here.
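For contrast, the two options being compared look like this in practice (the paths are illustrative):

```html
<!-- Option 1: in the <head> of each thin page.
     The page is crawled and its links are followed, but it stays out of the index. -->
<meta name="robots" content="noindex, follow" />

<!-- Option 2: a robots.txt rule instead (shown here as a comment for contrast):
     User-agent: *
     Disallow: /thin-product-pages/
     Blocked pages are never fetched, so they consume no crawl budget -
     but their internal links are never seen by Google either. -->
```

That trade-off - link discovery versus crawl budget - is exactly the question being asked.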
Intermediate & Advanced SEO | khi5
-
Manage Ranking for "Out of Stock" Pages
Hi, I own an e-commerce marketplace where products are sold by third-party sellers and purchased by end users. My problem is that whenever a new product is added, the search engine crawls the website and ranks the new page on the 4th page. When I start optimizing it to gain better rankings, the product goes out of stock and the rankings drop to below 100. To counter that, I started showing other related products on the "Out of Stock" pages, but even then the rankings are dropping. Can someone help me with this problem?
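As a side note - not a fix for the ranking drop itself - if the out-of-stock pages stay live, their availability can be declared in structured data so the listing stays accurate in search. A sketch using schema.org's Offer vocabulary; the product name and price are invented for illustration:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Product",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/OutOfStock"
  }
}
</script>
```

When stock returns, the same markup would switch to https://schema.org/InStock.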
Intermediate & Advanced SEO | RuchiPardal
-
Multiple city network
I'm currently setting up a large network, and my original thought was to target keywords via the city and then set up a website with the domain name being that keyword. Now I'm thinking that in the long run that's going to be a massive pain in my ass. I'm thinking what I should do is something along these lines: "www.companyname.com/cityorkeywordhere". Any thoughts? Thanks for the help.
Intermediate & Advanced SEO | dcstover1
-
Should "View All Products" be the canonical page?
We currently have "view 12" as the default setting when someone arrives at www.mysite.com/subcategory-page.aspx. We have been advised to change the default to "view all products" and make that the canonical page to ensure all of our products get indexed. My concern is that doing this will increase the page load time and possibly hurt rankings. Does it make sense to change all of our subcategory pages to show all the products when someone visits the page? Most sites seem to have a smaller number of products as the default.
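The advice described would be implemented with a canonical tag on each paginated view pointing at the view-all URL. A sketch using the domain from the question; the query parameters are illustrative assumptions, not the site's actual URL scheme:

```html
<!-- On a paginated view such as /subcategory-page.aspx?page=2 (illustrative),
     point the canonical at the view-all version so it consolidates indexing signals. -->
<link rel="canonical" href="http://www.mysite.com/subcategory-page.aspx?view=all" />
```

The page-speed concern is the real trade-off: a canonical only helps if the view-all page is actually serviceable for users and crawlers.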
Intermediate & Advanced SEO | pbhatt
-
Where does "Pages Similar" link text come from?
When I type in a competitor name (in this case "buycostumes"), Google shows several related websites in its "Pages similar to..." section at the bottom of the page.
My question: can anyone tell me where the text comes from that Google uses as the link? Our competitors have nice branded links and ours is just a keyword. I can find nothing on-page that Google is using, so it must be coming from somewhere off-page - but where?
Intermediate & Advanced SEO | costume