Does posting an article on multiple sites hurt SEO?
-
A client of mine creates thought leadership articles and pitches the same article to multiple sites to reach different audiences.
The sites that pick it up are places such as AdAge and MarketingProfs, and most of the time we do get link juice from these sources.
Does having the same article on these sites as well as your own hurt your SEO efforts in any way? Could it be recognized as duplicate content?
I know the links are great; I'm just wondering if there are any other side effects, especially when no links are provided!
Thank you!
-
It depends. If the article goes on your site first, it gets indexed and receives the credit. If someone republishes it for their own use and does not link back to you, it can hurt them. If they syndicate your article and link back to the original (i.e., the first version indexed), they will not be penalized.
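In practice, one way a syndicating site can point back to the original is a cross-domain canonical tag in the republished page's head, plus a visible attribution link. Here is a minimal sketch of generating that markup, with hypothetical URLs; it's an illustration, not a claim about what any particular publication will agree to:

```python
# Minimal sketch: the markup a syndicating site could add to credit the
# original article. The URL and site name below are hypothetical examples.

def syndication_markup(original_url: str, source_name: str) -> str:
    """Return a cross-domain canonical tag for the <head> plus a
    visible attribution line for the article body."""
    canonical = f'<link rel="canonical" href="{original_url}" />'
    attribution = (
        f'<p>This article originally appeared on '
        f'<a href="{original_url}">{source_name}</a>.</p>'
    )
    return canonical + "\n" + attribution

print(syndication_markup(
    "https://www.example.com/articles/thought-leadership-piece",
    "Example.com",
))
```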
-
There is a larger issue at play here.
Submitting the same article to multiple outlets is a sure way of pissing off editors and destroying relationships. It could be seen as less than exemplary conduct. I speak as a former editor.
If your client is a thought leader, the best bet is to submit one article to one outlet. Which is not to say you can't write another article for another publication that is a variation on the theme.
I work with thought leaders in several fields. Guest blogging is a hugely effective technique. The outlets are thrilled to get a free article from a leading expert that is far more authoritative than what they usually publish.
But you must insist on a link back or there is no SEO benefit. (There may be a marketing and branding benefit.) Often the link back can be done in the author's note. Even better is getting it in the text in a natural way. And you have to be relentless in ensuring the links actually appear. Not infrequently, you have to follow up post-publication.
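That follow-up is easier to do systematically than by eye. Here is a rough sketch of an automated check, assuming the requests library and placeholder URLs; a stricter version would parse the anchor tags and reject nofollow links, since those pass no SEO value:

```python
# Rough sketch: confirm each published guest post still links back to us.
# The domain and URLs are hypothetical placeholders.
import requests

OUR_DOMAIN = "www.example.com"

guest_posts = [
    "https://publication-one.example/guest-article",
    "https://publication-two.example/expert-column",
]

for url in guest_posts:
    try:
        html = requests.get(url, timeout=10).text
    except requests.RequestException as exc:
        print(f"FETCH FAILED  {url}  ({exc})")
        continue
    # Crude substring check; a stricter version would parse <a> tags
    # and verify the link is not marked rel="nofollow".
    if OUR_DOMAIN in html:
        print(f"link present  {url}")
    else:
        print(f"LINK MISSING  {url} -- follow up with the editor")
```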
My strategy is to time the guest blogging activity to coincide with the release of research or an e-book. We target 5-7 leading publications. Each gets an original and unique article that focuses on one aspect of the material. The articles on the third-party sites point back to the full version on our own site.
Just to be clear: we're not talking about cutting and pasting. We're talking about an original article customized to the third-party site and its audience that may go through several drafts.
It's quite a bit of work, but it pays off. Big time.
These days, I call myself a web strategist. But sometimes I also act as a content strategist. I really think this is the future of our industry, post-Panda and Penguin.
-
If the text is exactly the same in each article, then yes: Google looks for large chunks of duplicate text. The usual way to avoid this is to rewrite the article for each site.
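As a rough gauge of whether a rewrite is different enough, you can measure how many word-level chunks two versions share. The sketch below uses 5-word shingles and Jaccard similarity; it is a toy approximation of "large chunks of duplicate text", not how Google actually scores duplication, and the sample strings are made up:

```python
# Toy sketch: estimate chunk-level overlap between two article versions
# with 5-word shingles and Jaccard similarity. A rough proxy only --
# not how any search engine actually measures duplicate content.
import re

def shingles(text: str, n: int = 5) -> set[str]:
    words = re.findall(r"[a-z]+", text.lower())
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap(a: str, b: str) -> float:
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)  # Jaccard: 0.0 (none) to 1.0 (identical)

original = ("Our research shows that guest blogging is a hugely "
            "effective technique for thought leaders.")
rewrite = ("Guest blogging is a hugely effective technique, our "
           "latest research suggests, for any expert.")

print(f"Shingle overlap: {overlap(original, rewrite):.0%}")
```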