Duplicate Contact Information
-
My client has had a website for many years and has run his business for decades. He has also always had a second domain that is essentially a shopping module for obtaining information, comparisons, and quotes for tires. This tire module had no informational pages or contact info of its own; until recently, we pulled that information in through iframes.
Now, however, the tire module is too complex to bring in through iframes, and because of the way the module (or its website framework) is configured, we are told it cannot be placed in a subdirectory.
So the tire module now resides on another domain (similar to the client's main-site domain) with some duplicate informational pages (I am working through those with the client), but my main concern is the duplicate contact info: the address and phone number.
Should I worry that this tire website duplicates the client's phone number and address, identical to the main website's?
And would having a subdomain (tires.example.com) work better for Google and SEO considering the duplicate contact info?
Any help is much appreciated.
ccee bar
(Also: the client is directing AdWords campaigns to this tire website while, under the same AdWords account, directing other campaigns to the main site. I have advised an entirely separate AdWords account for the tire domain. By the way, the client does NOT have separate social media accounts for each site; all social media efforts and links are for the main site.)
-
Laura,
Yes, thank you for your reply; this helps greatly.
Right now for the client, because they lack a good strategy for organic SEO, AdWords generates their greatest traffic. I hope to leverage this with a better organic approach for SEO, and help create a better AdWords strategy.
But all that said, I just wasn't sure about the contact info and address... now I can move on. Thanks again!
-
First of all, backlinks from AdWords campaigns do not help your organic search rankings at all.
Secondly, this kind of duplicate content issue may not be as big a problem as you think. If Google detects that two pages have the same or very similar content, it will choose the best one to display in search results and filter the other one out. So you may not need to do anything.
On the other hand, if you are particular about which website should appear in search results for that content, you'll want to use the rel="canonical" tag to let Google know which page you prefer. You'll find more info about the canonical tag at the two links below.
- https://support.google.com/webmasters/answer/139066?hl=en
- https://moz.com/learn/seo/canonicalization
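For illustration, here is a minimal sketch of the canonical link element, assuming the main site's page is the preferred version (the URLs are placeholders):

```html
<!-- Placed in the <head> of the duplicate page -->
<link rel="canonical" href="https://www.example.com/contact/" />
```

Google treats the canonical as a strong hint rather than a directive, so it may still choose a different URL if other signals disagree.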
I hope that helps!
-
Laura,
Thanks so much for your response.
I guess what I was thinking is: if online directories have duplicate info, that would be expected. But if duplicate content, information, and business name were on two different websites (each set up as a service or consulting business), would it look like the two websites were trying to capitalize on search results, especially if some inbound links (like AdWords) were pointing to one site (tires, say) and also to the main site (brakes, and some tires)?
Do you still think this is OK?
ccee bar
-
Having the same phone number and address on two websites is not a duplicate content issue. It's very common because of business directories all over the web. If that's the only duplicate content you're worried about, then you're fine.
-
A subdomain is better than a separate domain if you cannot use a subdirectory.
As for the duplicate content: regardless of whether you use a separate domain, a subdomain, or a subdirectory, I would canonical any duplicate pages to the authoritative content. If that's the main site, then I would canonical the other domain's pages to it. I'm not sure why you would prefer the other domain to be the authoritative source, but if that is the case, then you would canonical the main site's pages to the other domain.
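As a sketch, with placeholder domain names: if the main site is the authoritative source, a duplicate page on the tire domain would declare a cross-domain canonical, using an absolute URL:

```html
<!-- In the <head> of https://www.example-tires.com/about-us/ (hypothetical) -->
<link rel="canonical" href="https://www.example.com/about-us/" />
```

Google supports cross-domain canonicals, but use absolute URLs here; a relative href would resolve against the wrong domain.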
Related Questions
-
Are backlinks within duplicate content ignored or devalued?
From what I understand, Google no longer has a "duplicate content penalty"; instead, duplicate content simply isn't shown in the search results. Does that mean that any links in the duplicate content are completely ignored, or are they devalued as far as the backlink profile of the site they are linking to? An example would be an article published on two or three major industry websites. Are only the links from the first website Googlebot discovers the article on counted, or are all the links counted and you just won't see the article itself come up in search results for the second and third websites?
Intermediate & Advanced SEO | Consult1901
-
Duplicate content on URL trailing slash
Hello. Some time ago, we accidentally made changes to our site which modified the way URLs in links are generated. At once, trailing slashes were added to many URLs (only in links). Links that used to point to example.com/webpage.html were now linking to example.com/webpage.html/. URLs in the XML sitemap remained unchanged (no trailing slash). We started noticing duplicate content (because our site renders the same page with or without the trailing slash). We corrected the problematic PHP URL function so that all links on the site now point to URLs without a trailing slash. However, Google had time to index these pages. Is implementing 301 redirects required in this case?
Intermediate & Advanced SEO | yacpro13
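On the trailing-slash question above: 301 redirects are the usual cleanup once the slashed URLs have been indexed. If the site runs on Apache, a minimal .htaccess sketch (assuming the slashed URLs are files, not real directories) might look like this:

```apache
RewriteEngine On
# Leave real directories alone; they legitimately end in a slash
RewriteCond %{REQUEST_FILENAME} !-d
# 301-redirect /webpage.html/ to /webpage.html
RewriteRule ^(.+)/$ /$1 [R=301,L]
```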
-
Duplicate currency page variations?
Hi guys, I have duplicate category pages across an ecommerce site: http://s30.postimg.org/dk9avaij5/screenshot_160.jpg. For the currency-based pages, would it be best (or easier) to exclude them in robots.txt or to use a rel canonical? If using robots.txt (which would be much easier to implement than rel canonical) to exclude the currency versions from being indexed, what would the correct exclusion be? Would it look something like: Disallow: */?currency/ ? Google is also indexing the currency-based pages: http://s4.postimg.org/hjgggq1tp/screenshot_161.jpg. Cheers, Chris
Intermediate & Advanced SEO | jayoliverwright
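On the currency question above: keep in mind that robots.txt only blocks crawling, so URLs that are already indexed can linger in the index, which is why rel=canonical is generally the safer fix. That said, assuming the currency variants use a query parameter (e.g. ?currency=USD, a guess based on the pattern in the question), a robots.txt sketch would be:

```txt
User-agent: *
# Block URLs carrying a "currency" query parameter (hypothetical pattern)
Disallow: /*?currency=
Disallow: /*&currency=
```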
-
Duplicate Content Dilemma for Category and Brand Pages
Hi, I have an online shop with categories such as trousers, shirts, shoes, etc., but now I'm having a problem with further development. I'd like to introduce brand pages. In this case I would create new categories for Brand 1, Brand 2, etc. The text on the category and brand pages would be unique, but there would be an overlap in products. How do I deal with this from a duplicate content perspective? I appreciate your suggestions. Best, Robin
Intermediate & Advanced SEO | soralsokal
-
Schema.org mark-up to avoid duplicate issue?
Hey there, I was wondering: does product mark-up help to avoid penalization due to duplicate content? Here is the example: one of my clients doesn't supply unique content, because the major part of the content is technical descriptions of products made by a couple of manufacturers. Do you think it would help to link the official manufacturer webpage in schema.org product mark-up? I know this is the right procedure for adding mark-up, but an outbound link will then show up on my client's pages, so I want to tell him this is the only way to carry that duplicate content without incurring penalization. I'd like to give him more than one solution, as I'm pretty sure he will never supply us with unique content. Thanks, Pierpaolo
Intermediate & Advanced SEO | madcow78
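On the schema.org question above: product mark-up describes a product to search engines, but it is not documented as a way to avoid duplicate-content filtering, so treat the idea as speculative. For reference, a minimal Product mark-up in JSON-LD with a sameAs link to the manufacturer's page might look like this (all values are placeholders):

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget 3000",
  "brand": { "@type": "Brand", "name": "Example Manufacturer" },
  "description": "Technical description supplied by the manufacturer.",
  "url": "https://www.example-shop.com/widget-3000",
  "sameAs": "https://www.example-manufacturer.com/widget-3000"
}
```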
-
Same content pages in different versions of Google: is it duplicate?
Here's my issue: I have the same page twice, with the same content but on a different URL for each country. For example, www.example.com/gb/page/ and www.example.com/us/page/, one for Great Britain and one for the USA. (It could also be a subdomain: gb., us., etc.) Is it duplicate content if the US version of Google indexes one page and the UK version indexes the other (same content, different URLs)? The UK search engine will only see the UK page, and the US engine the US page. Is this bad for the Panda update, or does it escape it? People suggest it is OK, and good for localised search on an international website, but I'm not so sure. I'd really appreciate advice.
Intermediate & Advanced SEO | pauledwards
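On the country-versions question above: the usual way to tell Google that such pages are localized alternates rather than duplicates is hreflang. A sketch using the URLs from the question (the x-default URL is a hypothetical fallback):

```html
<!-- Placed on both the /gb/ and /us/ versions of the page -->
<link rel="alternate" hreflang="en-gb" href="http://www.example.com/gb/page/" />
<link rel="alternate" hreflang="en-us" href="http://www.example.com/us/page/" />
<link rel="alternate" hreflang="x-default" href="http://www.example.com/page/" />
```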
-
Is there a way to contact Google besides the Google product forum?
Our traffic from Google has dropped more than 35% and continues to fall. We have been on this forum and Google's webmaster forum trying to get help. We received great advice and have waited months, but instead of improving, our traffic has worsened. We are being penalized by Google for many keywords, such as trophies, trophies and awards, and countless others; we were on page one previously. We filed two reconsideration requests and were told both times that there were no manual penalties. Some of our pages continue to rank well, so it is not across the board (but all of our listings went down a bit). We have made countless changes (please see below). Our busy season was from March to May and we got clobbered. Google, as most people know, is a monopoly when it comes to traffic, so we are getting killed. At first we thought it was Penguin, but it looks like we started getting hit late last year. Lots of unusual things happened: we had a large spike in traffic for two days, then lost our branded keywords, then our main keywords. Our branded keywords came back pretty quickly, but nothing else did. We have received wonderful advice and made most of the changes. We are a very reputable company and have a feeling we are being penalized for something other than spamming. For example, we added a mobile site late last year, and a wholesale system was added around the same time. Since the date does not coincide with Penguin, we think there is some major technical driver, but we have no idea what to do at this point. The webmasters have all been helpful, but nothing is working. We are trying to find out what one does in a situation like this, as we are trying to avoid closing our business. Thank you!
Changes made:
1. We had many crawl errors, so we reduced them significantly.
2. We had introduced a mobile website in January, which we thought may have been the cause (splitting traffic, duplicate content, etc.), so we had our mobile provider add the site to their robots.txt file.
3. We were told by a webmaster that there were too many links from our search provider, so we had them put the search pages in a robots.txt file.
4. We were told that we had too much duplicate content. This was / is true, as we have hundreds of legitimate products that are similar: for example, trophies and certificates that are virtually the same but are for different sports or have different colors and sizes. Still, we added more content and added noindex tags to many products. We compared our percentage of duplicates to competitors' and it is far less.
5. At the recommendation of another webmaster, we changed many pages that might have been splitting traffic.
6. Another webmaster told us that too many people were linking into our site with the same text, namely "Trophy Central", and that it might have appeared we were trying to game the system somehow. We have never bought links and don't even have a webmaster, although over the last 10 years we have worked with programmers and SEO companies (but we don't think any have done anything unusual).
7. At the suggestion of another webmaster, we have tried to improve our link profile. For example, we found Yahoo was not linking to our URL.
8. We were told to set up a 404 page, so we did.
9. We were told to ensure that all of the similar domains were pointing to www.trophycentral.com/, so we set up redirects.
10. We were told that one site was linking to us from too many places, so we reduced it to one link.
Our key pages have A rankings from SEOmoz for the selected keywords. We have made countless other changes recommended by experts but have seen no improvements (it actually got worse). I am the president of the company and have made most of the above recent changes myself. Our website is trophycentral.com.
Intermediate & Advanced SEO | trophycentraltrophiesandawards
-
Duplicate content ramifications for country TLDs
We have a .com site here in the US that is ranking well for targeted phrases. The client is expanding its sales force into India and South Africa, and wants to duplicate the site entirely, twice: once for each country. I'm not well versed in international SEO. Will this trip a duplicate content filter? Would google.co.in and google.co.za look at google.com's index for duplication? Thanks. Long-time lurker, first-time question poster.
Intermediate & Advanced SEO | Alter_Imaging
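On the country-TLD question above: rather than relying on Google's duplicate filtering, the three sites can declare each other as localized alternates with hreflang annotations in the XML sitemap. A sketch, with placeholder URLs following the question's scenario:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://www.example.com/page/</loc>
    <!-- Each site's sitemap lists every alternate, including itself -->
    <xhtml:link rel="alternate" hreflang="en-us" href="https://www.example.com/page/"/>
    <xhtml:link rel="alternate" hreflang="en-in" href="https://www.example.co.in/page/"/>
    <xhtml:link rel="alternate" hreflang="en-za" href="https://www.example.co.za/page/"/>
  </url>
</urlset>
```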