Best approach for a client with another site for the same company
-
I have a client who has an old website and company A handles the SEO campaign for this site.
My client wanted us to create a new website with unique content for the same company, aiming to double his chances of ranking on the first page of the SERPs and eventually dominating it.
So we created the new site for him and handled its SEO campaign. So far we are ranking decently on the search engines, but we feel we could do better. The site we are optimizing for him uses the same company name, a call tracking number, and a virtual address in the same city.
Do you think Google has a problem with this set up?
We have listed the new site in the citation directories, but I'm worried that we are sending Google mixed signals. The company has two listings in each directory: one for the old site and another for the new site.
Another thing: the Google+ Local page for the new site is created and verified but is not showing up in the local pack.
What is the best way to approach this mess?
We are looking to rank in both the local and organic results. -
Hi Adam,
Good thinking. Hope it works out well. Situations like these can be layers deep and murky - tough to sort out. Wishing you the best!
-
Thank you Miriam.
You are right, I would need to have a heart-to-heart talk with my client to sort out these issues.
-
First, if the tracking number that you use for the citations varies from the number listed on the new website, the old website, and/or the old citations, this is a problem. Trust me: I am currently handling a client who wanted to rank locally for a city where they did not have a business location. Without consulting us, they set up an ad for that city with a tracking number, BUT it was still associated with the same business name and address. I don't know if you are aware, but these directories scrape information from each other, and as a result new listings get created with inconsistent business information. It's not pretty: not only is this confusing to a potential customer looking for your business, but you have also significantly decreased your chances of appearing in Google's 7-pack.
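To make that consistency problem concrete, here is a rough sketch (not anything Google actually runs, and with entirely hypothetical business details) of how you might normalize and compare NAP data pulled from several directories to flag the kind of mismatch described above:

```python
import re

def normalize_phone(raw):
    """Reduce a phone number to digits only, so formatting differences don't matter."""
    return re.sub(r"\D", "", raw)[-10:]  # keep the last 10 digits (US-style numbers)

def normalize(text):
    """Lowercase and strip punctuation for loose name/address comparison."""
    return re.sub(r"[^a-z0-9 ]", "", text.lower()).strip()

def nap_key(listing):
    """A listing's identity for comparison purposes: normalized name, address, phone."""
    return (
        normalize(listing["name"]),
        normalize(listing["address"]),
        normalize_phone(listing["phone"]),
    )

def find_inconsistencies(listings):
    """Group directory listings by normalized NAP; more than one group = mixed signals."""
    groups = {}
    for listing in listings:
        groups.setdefault(nap_key(listing), []).append(listing["source"])
    return groups

# Hypothetical scraped listings: two agree, one carries a tracking number.
listings = [
    {"source": "directory-a", "name": "John the Plumber",
     "address": "123 First St, San Diego, CA", "phone": "(619) 555-0100"},
    {"source": "directory-b", "name": "John The Plumber",
     "address": "123 First St., San Diego CA", "phone": "619-555-0100"},
    {"source": "directory-c", "name": "John the Plumber",
     "address": "123 First St, San Diego, CA", "phone": "800-555-0199"},
]

groups = find_inconsistencies(listings)
print(len(groups))  # more than 1 distinct NAP variant means the directories disagree
```

Formatting-only differences (punctuation, capitalization) collapse into one group, while the tracking number surfaces as a genuinely conflicting record.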
Here are a few resources that I refer back to, but I would start with the Google Places Guidelines:
http://getlisted.org/resources/why-citations-are-important.aspx
http://www.davidmihm.com/local-search-ranking-factors.shtml
- Pay attention to questions 5 & 7
http://www.seomoz.org/blog/40-important-local-search-questions-answered
- This is a follow-up to a Mozinar that is totally worth watching too.
Hope this helps!
-
Hi Adam,
We need to pedal back here to this:
"The site we are optimizing for him uses the same company, tracking number and a virtual address in the same city. "
and this:
"We have listed the new site in the citation directories...Another thing, Google+ Local for the new site is created and verified but is not showing up in local pack."
You have some root issues going on here. Both virtual offices and call tracking numbers for Local businesses are taboo in Google's local products. For legitimate participation in Google+ Local, your client needs to have:
-
Face-to-face transactions with customers either at the place of business (like a restaurant) or at the customers' locations (like a plumber).
-
A unique, physical street address (not a virtual office, P.O. box or shared address)
-
A unique local area code phone number (not a toll-free, call tracking, or shared number).
If the client cannot meet all 3 of the above criteria, then he is not suitable for inclusion in Google's local products, and he is not appropriate for a local citation building campaign.
So, this is actually the issue that needs to be sorted out first. Whether your client's failure to show up in the local results is due to a penalty stemming from Google considering the listing to be spam or stems from other issues is sort of moot, here, because the business model you are describing does not sound truly local to me.
For further reading, I recommend that both you and the client study Google's Places Quality Guidelines:
http://support.google.com/places/bin/answer.py?hl=en&answer=107528
You will see precisely why the business model you are describing is problematic in the above guidelines.
Regarding the call tracking phone number element, read:
http://searchengineland.com/for-local-seo-lack-of-call-tracking-solution-spawns-cloaking-70198
Read that post and all of the links in it, as well, for full information on the history of issues surrounding call tracking numbers in the world of Local SEO.
My feeling is that this client's opportunities may have been wrongly estimated here, and that Local SEO is being pursued in vain until he can meet those requirements. Having 2 websites in existence for the client is only going to compound the issues.
I never recommend double sites for local business owners, but where there is some reason why they feel they MUST have more than one website, I advise them to make sure that their NAP (name-address-phone) is only published on one of the websites. Everything hangs on NAP in Local, and if you're telling Google that both www.johntheplumber.com and www.sandiegoplumber.com are located at 123 First St., San Diego, CA, this will confuse Google and potentially lead to duplicated listings and ranking drops.
Total clarity and consistency of data are vital to any Local SEO campaign, but in this case, your first step is going to be to assess the client's actual business model and then determine whether he has a legitimate place in the local index, or needs to pursue purely organic SEO due to a lack of the elements essential to local inclusion.
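One way to make the "NAP on one website only" advice explicit to search engines is structured data. The sketch below generates a schema.org LocalBusiness JSON-LD block for the single site that carries the canonical NAP; the business details reuse the hypothetical plumber example above, and this is just one possible markup approach, not a requirement:

```python
import json

# Hypothetical canonical NAP, borrowed from the johntheplumber.com example above.
canonical_nap = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "John the Plumber",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 First St",
        "addressLocality": "San Diego",
        "addressRegion": "CA",
    },
    "telephone": "+1-619-555-0100",
    "url": "http://www.johntheplumber.com",
}

# Embed this on ONE site only; the second site should omit NAP data entirely.
snippet = '<script type="application/ld+json">\n%s\n</script>' % json.dumps(
    canonical_nap, indent=2
)
print(snippet)
```

Keeping the markup (and the visible NAP it mirrors) on exactly one domain avoids telling Google that two different websites share the same physical location.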
Hope this helps!
-
-
Hi Amber,
We use the same company name but a different location for the citations, which is a virtual address, along with a local tracking number.
Do you think Google could tell whether it's a virtual address or not, and if it could, is that going to have a negative effect on our rankings?
Also, what I meant by "local pack" is the local map listings that show up alongside the organic search results.
As for Google Maps, I searched my client's company name and we aren't showing up there either.
-
Our company also has two websites. We launched the second website for different reasons than your client: we sell power tools and power tool parts, and on the second website we want to focus more on parts.
The phone number and address are the same on both sites. We also have many products that are available on both sites.
When we decided to launch the second website, we thought that having the same address and phone number could be a problem for our rankings. Now, 4 months later, I don't think it has caused any problems. The first website that we've had for years is still ranking well and improving, and our new website is also doing well and progressing fast.
We tried to avoid duplicate content, so the product descriptions, about us page, and blog entries are different for each website. Also, we didn't include the new site in the local directories.
In my humble opinion, as long as you have different content on each website, the phone number and address won't be a problem.
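If you want a rough sanity check that two sites' product descriptions really do differ (rather than eyeballing them), a simple word-shingle comparison works; this is just an illustrative sketch with made-up descriptions, not a tool any search engine documents using:

```python
def shingles(text, k=3):
    """Break text into overlapping k-word shingles."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(max(1, len(words) - k + 1))}

def jaccard(a, b):
    """Jaccard similarity of two shingle sets: 1.0 = identical, 0.0 = fully distinct."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb)

# Hypothetical descriptions of the same product, rewritten for each site.
desc_site_1 = "This cordless drill delivers 500 watts of power with a two speed gearbox"
desc_site_2 = "A two speed gearbox and 500 watts of power make this cordless drill a solid choice"

score = jaccard(desc_site_1, desc_site_2)
print(round(score, 2))
```

A score near 1.0 means the two pages are effectively duplicates; genuinely rewritten copy lands much lower even when the facts overlap.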
-
Hi Adam,
When you say you have listed the new site in the citation directories, are you using the same business name, address, and phone number? What is the difference in the business information on these listings?
From my understanding, if these citations have mixed information for the same business, you have set yourself up for duplicate listings. Google uses the information from these sources to validate the business name, address, phone number, and business details. If Google sees two different listings for the same company on these citation sources, how will it know which one to trust enough to show in its local/map results?
In the end, I think your ranking power is going to be (or already is being) divided greatly for Local.
Also, when you say the G+ Local listing that you created for the new site is not showing up in the local pack: what are you considering the "local pack," or are you saying that the listing is not even appearing when you search for it in Maps?
Best,
Amber
-
Hi Adam,
This is my first time hearing about this approach. In my opinion, it seems like your client is trying to game the search engines by creating two sites and hoping one of them will rank better, and I don't think search engines will favor anyone gaming the system.
Why doesn't your client spend that time and money on the existing site instead?
That time and money could go toward building more quality links and producing more content; then they will definitely rank higher, instead of starting a new website and doing everything from scratch.
My 2 cents. Hope this helps.