Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies. More details here.
One Business-Multiple Services
-
Hello Everyone,
I was looking for some strategies for doing SEO on a site that offers multiple services.
Here is the example:
There is one company with ONE physical address.
They perform the following services:
- Pest Control
- Mold Remediation
- Home Inspections
- Waterproofing
They also handle these services in several surrounding cities.
They want to maintain one website for branding purposes.
Obviously I will create individual pages on their site for each service, but I was wondering how difficult it will be to rank one website for these various services.
Thank you!
-
Hello Bill,
Thanks for coming to Q&A with your question. The NAP is really the key, more so than the website. For the business to be able to treat each specialty as distinct, it would need to become 4 distinct companies, each with a unique legal business name, legit physical street address and local area code phone number. This scenario would enable the owner to have a unique Google Place Page for each of the businesses, instead of just one Place Page for all of his specialties (as well as having unique listings in all of the other local business indexes). As things currently are, he is permitted to have only the one listing per index.
This is the case for most businesses like that of your client and by building out his content on his website, you are doing pretty much what you can do for his organic campaign (plus linkbuilding, social media, video etc., of course).
The tough thing about clients like this one is that they typically not only offer a menu of very varied services, but they also tend to serve a number of surrounding cities. So an SEO/Local SEO campaign typically looks something like this:
1. Get the client listed in the major local indexes.
2. Campaign for reviews in a variety of sources.
3. Get citations for his Google Place Page.
4. Build out a body of service-related content on the website.
5. Build out a body of geographic content on his website.
6. Build links every which way.
7. Engage in additional forms of marketing that will be most effective at reaching the client's audience (email, video, social media, blogging, etc.)
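A small on-site complement to steps 1 through 4 is making that single NAP (name, address, phone) machine-readable with structured data, so it stays consistent with the listings. A minimal sketch using schema.org's LocalBusiness type; every name, address, and number below is a placeholder, not real client data:

```html
<!-- Hypothetical markup for a single-location, multi-service business.
     All values are placeholders; substitute the client's real NAP. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Home Services Co.",
  "telephone": "+1-555-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701"
  }
}
</script>
```

The key point is that the NAP in this markup should match the NAP in the local indexes character for character.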
Now, in entering into all of this work, the client must be informed up front that his chances of ranking above the fold of Google's results are mostly going to revolve around his services in his city of location, in that he may achieve grey pinned local results for these 'service + geo' terms. He may not be able to expect top rankings for all 4 services. In any service city where he isn't physically located, the client should be made to understand that he is most likely to have to rely solely on the organic rankings below the local results, as Google will be viewing his competitors with physical locations in those cities as most relevant.
Clients like these are more complicated than, for example, a dentist with an office in Denver. But, that being said, there are substantial benefits to engaging in the work. Even lower rankings for terms can lead to trickles of monthly traffic and if these convert to phone calls and bookings, it has all been worth it.
Good luck!
-
Yes, but then it's hard to get quality links for each. You can submit each to the local directories, but earning quality links is a bit harder.
-
Thanks, Alan. I can understand why they want to do this from a branding standpoint, but it will be harder to rank for individual terms.
In most cases I would think multiple websites would be called for here: a website for each area of service.
-
It is hard to rank for multiple services, and even harder for multiple locations, but you seem to be doing the right thing: make a page for each target.
Related Questions
-
Can Schema handle two sets of business hours?
I have a client who, due to covid, will have two sets of business hours. Morning hours for business customers, and afternoon hours for general customers. Is it possible to designate this distinction in schema?
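For what it's worth, schema.org's `openingHoursSpecification` accepts multiple entries, so the two time windows can at least be expressed, though there is no standard property that labels one window "business customers" and the other "general customers". A minimal sketch, with all names and times as placeholders:

```html
<!-- Sketch only: this expresses two opening-hours windows, but schema.org
     has no standard way to say *who* each window is for. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Business",
  "openingHoursSpecification": [
    {
      "@type": "OpeningHoursSpecification",
      "dayOfWeek": ["Monday", "Tuesday", "Wednesday", "Thursday", "Friday"],
      "opens": "08:00",
      "closes": "12:00"
    },
    {
      "@type": "OpeningHoursSpecification",
      "dayOfWeek": ["Monday", "Tuesday", "Wednesday", "Thursday", "Friday"],
      "opens": "13:00",
      "closes": "17:00"
    }
  ]
}
</script>
```

The audience distinction itself would have to be explained in the visible page copy.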
Intermediate & Advanced SEO | bherman
-
Best-practice URL structures with multiple filter combinations
Hello, We're putting together a large piece of content that will have some interactive filtering elements. There are two types of filters: topics and object types. The architecture under the hood constrains us so that everything needs to be in URL parameters. If someone selects a single filter, this can look pretty clean: www.domain.com/project?topic=firstTopic
or
www.domain.com/project?object=typeOne
The problems arise when people select multiple topics, potentially across two different filter types: www.domain.com/project?topic=firstTopic-secondTopic-thirdTopic&object=typeOne-typeTwo
I've raised concerns around the structure in general, but it seems to be too late at this point, so now I'm scratching my head thinking of how best to get these indexed. I have two main concerns:
1. A ton of near-duplicate content, and hundreds of URLs being created and indexed with various filter combinations added
2. Over-reacting to the first point above and over-canonicalizing/no-indexing combination pages to the detriment of the content as a whole
Would the best approach be to index each single topic filter individually, and canonicalize any combinations to the 'view all' page? I don't have much experience with e-commerce SEO (which this problem seems to have the most in common with), so any advice is greatly appreciated. Thanks!
Intermediate & Advanced SEO | digitalcrc
-
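If the "canonicalize combinations to the view-all page" route is taken, the mechanics are a single tag in the head of each combination page. A sketch using the question's own example URLs, assuming the unfiltered page is the chosen target:

```html
<!-- Served in the <head> of a combination page such as
     www.domain.com/project?topic=firstTopic-secondTopic&object=typeOne -->
<link rel="canonical" href="http://www.domain.com/project" />
```

Single-filter pages that should rank in their own right would instead self-canonicalize.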
Page title and slug as complements to one another?
When creating a page, is it ever worthwhile to ensure that there's minimal duplication between the keywords in the page title and the slug? Or is it more that the title is a sentence-style description of the page, the slug is a scannable set of keywords describing it, and duplication doesn't really matter?
Intermediate & Advanced SEO | TheaterMania
-
What Wordpress Update Services Should You Be Using on Your Wordpress Blog?
I have been told that pingomatic.com is all that you need; however, yesterday I went to a conference and others were recommending having a good list of pinging services to cover all your bases. Here are 4 that have been recommended: pingomatic, technorati, blogsearch.google.com, feedburner. Any others that should be included on this list? My goal is not to spam these ping lists, but I want to make sure my content is getting indexed quickly.
Intermediate & Advanced SEO | webestate
-
Url structure for multiple search filters applied to products
We have a product catalog with several hundred similar products. Our list of products allows you to apply filters to hone your search, so that in fact there are over 150,000 different individual searches you could come up with on this page. Some of these searches are relevant to our SEO strategy, but most are not. Right now (for the most part) we save the state of each search in the fragment of the URL, or in other words in a way that isn't indexed by the search engines. The URL (without hashes) ranks very well in Google for our one main keyword. At the moment, Google doesn't recognize the variety of content possible on this page. An example is: http://www.example.com/main-keyword.html#style=vintage&color=blue&season=spring
We're moving towards a more indexable URL structure, one that could potentially save the state of all 150,000 searches in a way that Google could read. An example would be: http://www.example.com/main-keyword/vintage/blue/spring/
I worry, though, that giving so many options in our URL will confuse Google and make a lot of duplicate content. After all, we only have a few hundred products, and inevitably many of the searches will look pretty similar. Also, I worry about losing ground on the main http://www.example.com/main-keyword.html page, when it's ranking so well at the moment. So I guess the questions are: Is there such a thing as having URLs be too specific? Should we noindex or set rel=canonical on the pages whose keywords are nested too deep? Will our main keyword's page suffer when it has to share all the inbound links with these other, more specific searches?
Intermediate & Advanced SEO | boxcarpress
-
Magento: URLs for Products in Multiple Categories
I am working in Magento to build out a large e-commerce site with several thousand products. It's a great platform, but I have run into the issue of what it does to URLs when you put a product into multiple categories. Basically, "a book" in two categories would make two URLs for one product: 1) /books/a-book 2) /author-name/a-book. So, I need to come up with a solution for this. It seems I have two options:
1. Found this from a Magento SEO article: 'Magento gives you the ability to add the name of categories to the path for product URLs. Because Magento doesn't support this functionality very well - it creates duplicate content issues - it is a very good idea to disable this. To do this, go to System => Configuration => Catalog => Search Engine Optimization and set "Use categories path for product URLs" to "no".' This would solve the issue and be a quick fix, but I think it's a double-edged sword, because then we lose the SEO value of our well-named categories being in the URL.
2. Use canonical tags. To be fair, I'm not even sure this is possible. Even though it is creating different URLs and, thus, poses a risk of "duplicate content" being crawled, there really is only one page on the admin side. So, I can't go to all of the "duplicate" pages and put a canonical tag, because those duplicate pages don't really exist on the back-end. Does that make sense?
After typing this out, it seems like the best thing to do probably will be to just turn off categories in the URL from the admin side. However, I'd still love any input from the community on this. Thanks!
Intermediate & Advanced SEO | Marketing.SCG
-
Multiple IPs (load balancing) for same domain
Hello, I'm considering moving our main website to multiple servers, perhaps in multiple different datacenters, and using DNS round-robin load balancing by assigning it 4 different IP addresses (probably from 4 different C classes). Example:
ourdomain.com A 1.1.1.1
ourdomain.com A 2.2.2.2
ourdomain.com A 3.3.3.3
ourdomain.com A 4.4.4.4
Every time you ping the domain you will get a response from a different IP in the group, so search engines will see a different IP each time they crawl the site. We have used the main IP for our website for the past 6 years without changing it. We have quite good SEO in our niche, which I don't want to lose, of course. My question is: will adding more IPs to the domain affect the ranking in any way? What is the suggested way to do it anyway? What is recommended to do before and after? Thanks for your attention and help in advance. Dmitry S.
Intermediate & Advanced SEO | maddogx
-
Submitting URLs multiple times in different sitemaps
We have a very dynamic site, with a large number of pages. We use a sitemap index file, that points to several smaller sitemap files. The question is: Would there be any issue if we include the same URL in multiple sitemap files? Scenario: URL1 appears on sitemap1. 2 weeks later, the page at URL1 changes and we'd like to update it on a sitemap. Would it be acceptable to add URL1 as an entry in sitemap2? Would there be any issues with the same URL appearing multiple times? Thanks.
Intermediate & Advanced SEO | msquare
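For reference, the index-plus-child-sitemaps layout described above follows the sitemaps.org protocol; a minimal index file might look like this (filenames and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap1.xml</loc>
    <lastmod>2012-01-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap2.xml</loc>
  </sitemap>
</sitemapindex>
```

Updating `lastmod` on the child sitemap that contains the changed URL is the usual way to signal a change, rather than repeating the URL in a second child.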