Best practice?
-
Hi there,
I recently wrote an article that has been published on an online newspaper website. I'd also like to put the same article on my blog, where it will serve as the landing page for my email marketing activities.
Would it be best practice to disallow Google from crawling the blog copy, or to put a rel="canonical" tag on the article on my blog pointing to the article on the online newspaper website?
Thanks for any suggestions
-
Try this tag instead, since you are going to use exactly the same content on your website –
However, you are free to set up a Google cross-domain canonical. For more information, see:
http://googlenewsblog.blogspot.co.uk/2010/11/credit-where-credit-is-due.html
-
Hi Jonathan,
Thanks for your feedback.
Even though the content is the same, some of the links are slightly different. Would that be a problem?
Thanks
-
Hello Gary,
Personally, I would recommend putting a rel="canonical" tag on the blog article and, if possible, adding a small link back to the newspaper website, something like: "As seen on xyz.com".
This will help the rankings of the newspaper article as well as build trust (a newspaper published it).
I have done this on several blog articles and it has always worked out well.
Good luck!
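In case it helps, a cross-domain canonical is just a single link element in the head of the blog copy of the article. A minimal sketch, with a placeholder URL standing in for the newspaper article:

```html
<!-- In the <head> of the blog copy of the article. -->
<!-- Tells search engines the newspaper version is the original; the URL is a placeholder. -->
<link rel="canonical" href="https://www.newspaper-example.com/path/to/original-article" />
```

Slightly different internal links in the body do not matter for this; the tag consolidates the two URLs as duplicates regardless.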
Related Questions
-
What's the best way to use redirects on a massive site consolidation?
We are migrating 13 websites into a single new domain, and with that we have certain pages that will be terminated or moved to a new folder path, so we need custom 301 redirects built for these. However, we have a huge database of pages that will NOT be changing folder paths, and it's way too many to write custom 301s for. One idea was to use domain forwarding or a wildcard redirect so that all the pages would be redirected to the same folder path on the new URL. The problem this creates, though, is that we would then need to build the custom 301s for content that is moving to a new folder path, hence creating 2 redirects on these pages (one for the domain forwarding, and then a second for the custom 301 pointing to a new folder). Any ideas on a better solution to this?
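One way to avoid the two-hop chain is to put the custom rules before the catch-all, so each URL is redirected exactly once. A minimal sketch for one of the old domains, assuming Apache with mod_rewrite (all domain and folder names are placeholders):

```apache
# Hypothetical .htaccess on one of the 13 old domains.
# Specific moves are listed first so they win over the catch-all,
# avoiding a chain for pages that change folder paths.
RewriteEngine On

# Custom 301s for pages moving to a new folder path
RewriteRule ^old-folder/page-a\.html$ https://www.new-domain.example/new-folder/page-a.html [R=301,L]

# Catch-all: everything else keeps its existing path on the new domain
RewriteRule ^(.*)$ https://www.new-domain.example/$1 [R=301,L]
```

The [L] flag stops processing at the first matching rule, so pages covered by a custom rule never reach the wildcard.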
Intermediate & Advanced SEO | MJTrevens
-
Best server-side XML sitemap generator?
I have tried xml-sitemaps, which tends to crash when spidering my site(s) and requires multiple manual resumes, which aren't practical for our businesses. Please let me know if there are any other server-side generators that could be used on multiple enterprise-sized websites and would be a good fit. Image sitemaps would also be helpful. One with multiple starting URLs would help with spidering/indexing the most important sections of our sites. Also, has anyone heard of or used Dyno Mapper? It also looks like a good solution for us, but I was wondering if anyone has had any experience with this product.
Intermediate & Advanced SEO | recbrands
-
What would be the best way to transition from a mobile website to responsive?
We have a mobile website (mobile.website.com) that mirrors our desktop site (www.website.com) with 100,000+ pages. We have an alternate tag on our desktop site pointing to our mobile site, and a user-agent detect that redirects mobile traffic to our mobile site. Our mobile site is noindexed and has a canonical to our desktop site. Everything works pretty well: the mobile website is not indexed and only shows up in the SERPs when a user makes a search from a mobile device. Our main website is now responsive and we would like to kill our mobile site without compromising our traffic. We know that a slight speed change or content change can affect our traffic, so what would be the best way to do that?
Big bang: redirect all mobile URLs to desktop, remove the user-agent detect, and remove the alternate tag on desktop.
Semi big bang: remove the user-agent detect and the alternate tag on desktop, and see how the traffic reacts before redirecting.
Progressive: remove the user-agent detect and the alternate tag on some sections of the website to see how the traffic reacts.
Other?
Anyone has any experience with that? Thanks, and let me know if anything is not clear.
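Whichever rollout is chosen, the final state is usually a path-preserving 301 from every mobile URL to its desktop equivalent. A minimal sketch, assuming Apache and using the domains from the question as placeholders:

```apache
# Hypothetical Apache vhost for retiring mobile.website.com.
# Each mobile URL 301s to the equivalent desktop URL, preserving the path,
# so existing mobile rankings and links consolidate onto the responsive site.
<VirtualHost *:80>
    ServerName mobile.website.com
    RewriteEngine On
    # The leading slash is part of the matched path, so $1 maps 1:1 onto the new URL.
    RewriteRule ^(.*)$ https://www.website.com$1 [R=301,L]
</VirtualHost>
```

The alternate tags and user-agent detection on the desktop site should be removed at the same time the redirects go live, so crawlers never see an alternate tag pointing at a redirecting URL.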
Intermediate & Advanced SEO | Digitics
-
Best method for blocking a subdomain with duplicated content
Hello Moz Community. Hoping somebody can assist. We have a subdomain, used by our CMS, which is being indexed by Google:
http://www.naturalworldsafaris.com/
https://admin.naturalworldsafaris.com/
The pages are identical, so we can't add a noindex or nofollow to the subdomain alone. I have both set up as separate properties in Webmaster Tools. I understand the best method would be to update the robots.txt with a user disallow for the subdomain, but the robots.txt file is only accessible on the main domain: http://www.naturalworldsafaris.com/robots.txt. Will this work if we add the subdomain exclusion to this file? It means it won't be accessible at https://admin.naturalworldsafaris.com/robots.txt (where we can't create a file), and therefore won't be seen within that specific Webmaster Tools property. I've also asked the developer to add password protection to the subdomain, but this does not look possible. What approach would you recommend?
Intermediate & Advanced SEO | KateWaite
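For what it's worth, robots.txt is per-host, so an exclusion in the main domain's file will not apply to the admin subdomain. When the subdomain can't serve its own robots.txt or modified markup, one server-level option is an X-Robots-Tag response header scoped to that host. A sketch, assuming Apache with mod_headers (the vhost details are placeholders):

```apache
# Hypothetical vhost for the admin subdomain.
# robots.txt on www.naturalworldsafaris.com cannot block this host, but an
# X-Robots-Tag header noindexes every response served from it, without
# touching the shared page markup the CMS produces.
<VirtualHost *:443>
    ServerName admin.naturalworldsafaris.com
    Header set X-Robots-Tag "noindex, nofollow"
</VirtualHost>
```

Note that a robots.txt disallow alone would only stop crawling, not remove already-indexed URLs; the noindex header handles that once Google recrawls the pages.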
Best Practices
Okay, this would be a piece of cake for most of you out there. What are the best practices once you add a page or piece of content to your website with a new keyword that you have never used before but plan to use on every relevant new page you add? How do you ensure that Google will crawl that page? Secondly, if you add the new keyword to the old pieces of content/pages you have already published by editing the content to suit that keyword, how would you ensure that they get crawled by Google? Thanks in advance.
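The standard way to prompt a crawl of new or edited pages is to list them in an XML sitemap with a fresh lastmod date and resubmit it in Webmaster Tools. A minimal sketch (the URL and date are placeholders):

```xml
<!-- Hypothetical sitemap.xml fragment. Listing the new or freshly edited page
     with an updated <lastmod> signals that it should be (re)crawled. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/new-keyword-page/</loc>
    <lastmod>2013-05-01</lastmod>
  </url>
</urlset>
```

Internal links from already-indexed pages to the new page also help crawlers discover it without waiting on the sitemap.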
Intermediate & Advanced SEO | LaythDajani
-
Best practices on setting up multi country Magento store
We run Magento and we're in the process of redesigning our site. We want the site to have separate storefronts for different countries; however, we won't have the site language translated initially. We're thinking we'll use the Magento multi-store feature and have sites like /fr, /de, /en-us, /en-au, etc. Is the best practice to use hreflang for the non-English stores which haven't yet been translated? For example, for French users, set them as something essentially saying: the page is aimed at French people, but is in English. The separate storefronts will have things like currency and tax localised to each country and will gradually be translated, especially the more generic strings like "Add to Cart", "Checkout", etc. Or should it be targeted at French language and country, despite not all being translated into French? Or is there a better way to do this?
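For reference, hreflang values take a language code optionally followed by a region code, so an untranslated French storefront can be annotated as English-language content intended for users in France. A sketch for one product page (all URLs are placeholders):

```html
<!-- Hypothetical hreflang annotations for a single product page. -->
<!-- en-fr = English-language page intended for users in France, which fits a
     French storefront that has not yet been translated; switch it to fr-fr
     once the store's content is actually in French. -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/en-us/product" />
<link rel="alternate" hreflang="en-fr" href="https://www.example.com/fr/product" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/product" />
```

Each store's pages need the full set of alternates, including a self-reference, for the annotations to be honoured.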
Intermediate & Advanced SEO | seanmccauley
-
How do I best handle Duplicate Content on an IIS site using 301 redirects?
The crawl report for a site indicates the existence of both www and non-www content, which I am aware is duplicate. However, only the www pages are indexed**, which is throwing me off. There are no 'noindex' tags on the non-www pages, nothing in robots.txt, and I can't find a sitemap. I believe a 301 redirect from the non-www pages is what is in order. Is this accurate? I believe the site is built using ASP.NET on IIS, as the pages end in .asp (not very familiar to me). There are multiple versions of the homepage, including 'index.html' and 'default.asp'. Meta refresh tags are being used to point to 'default.asp'.
What has been done:
1. I set the preferred domain to 'www' in Google's Webmaster Tools, as most links already point to www.
2. The WordPress blog, which sits in a /blog subdirectory, has been set with rel="canonical" to point to the www version.
What I have asked the programmer to do:
1. Add 301 redirects from the non-www pages to the www pages.
2. Set all versions of the homepage to redirect to www.site.org using 301 redirects, as opposed to meta refresh tags.
Have all bases been covered correctly? One more concern: I notice the canonical tags in the source code of the blog use a trailing slash. Will this create a problem of inconsistency? (And why is rel="canonical" the standard for WordPress SEO plugins while 301 redirects are preferred for SEO?) Thanks a million!
**To clarify regarding the indexation of non-www pages: a search for 'site:site.org -inurl:www' returns only 7 pages without www, which are all blog pages without content (code 200, not 404 - maybe deleted or moved - which is perhaps another 301 redirect issue).
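On IIS, both requested redirects are typically done in web.config. A sketch of what the programmer might add, assuming the IIS URL Rewrite module is installed (the rule names are placeholders; site.org comes from the question):

```xml
<!-- Hypothetical web.config fragment; assumes the IIS URL Rewrite module. -->
<!-- Rule 1: 301 non-www requests to www, preserving the path. -->
<!-- Rule 2: 301 the alternate homepages to the root, replacing the meta refresh tags. -->
<system.webServer>
  <rewrite>
    <rules>
      <rule name="Redirect non-www to www" stopProcessing="true">
        <match url="(.*)" />
        <conditions>
          <add input="{HTTP_HOST}" pattern="^site\.org$" />
        </conditions>
        <action type="Redirect" url="http://www.site.org/{R:1}" redirectType="Permanent" />
      </rule>
      <rule name="Canonical homepage" stopProcessing="true">
        <match url="^(index\.html|default\.asp)$" />
        <action type="Redirect" url="http://www.site.org/" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

As for the trailing-slash question: as long as the canonical URLs resolve with a 200 and any non-canonical variant 301s to them, a consistent trailing slash is not a problem.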
Intermediate & Advanced SEO | kimmiedawn
-
Best way to find all URL parameters?
In reference to http://googlewebmastercentral.blogspot.com/2011/07/improved-handling-of-urls-with.html, what is the best way to find all of the parameters that need to be addressed? Thanks!
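One practical approach is to pull the distinct parameter names out of a crawl export or server log. A minimal sketch in Python (the URLs are made-up examples); each name it surfaces can then be reviewed in Webmaster Tools' URL parameter handling:

```python
# Hypothetical sketch: extract every distinct query-parameter name from a list
# of crawled or logged URLs, with a count of how often each appears.
from collections import Counter
from urllib.parse import urlsplit, parse_qsl

def count_parameters(urls):
    """Return a Counter mapping each query-parameter name to its frequency."""
    counts = Counter()
    for url in urls:
        # keep_blank_values=True so parameters like "?debug=" are still counted
        for name, _value in parse_qsl(urlsplit(url).query, keep_blank_values=True):
            counts[name] += 1
    return counts

urls = [
    "https://www.example.com/shoes?color=red&size=9",
    "https://www.example.com/shoes?color=blue&sessionid=abc123",
    "https://www.example.com/search?q=boots&page=2",
]
print(count_parameters(urls).most_common())
```

Frequent parameters that don't change page content (session IDs, tracking tags) are the ones worth configuring first.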
Intermediate & Advanced SEO | nicole.healthline