Does having a few URLs pointing to another URL via 301 "create" duplicate content?
-
Hello!
I have a few URLs all related to the same business sector.
Can I point them all at my home domain, or should I point them to different relevant content within it?
Ioan
-
Ioan,
It seems like you're asking two different questions here: do multiple 301s pointing to a single page create duplicate content, and is it better to point 301s to your home page or to a more relevant internal page? Is that right?
The answer to the first question is no: you don't have to worry about your 301 redirects causing duplicate content, because a redirected URL no longer serves any content of its own for Google to index.
As for the second question, it's usually best to point a 301 redirect at the page on your site that is most relevant to the URL being redirected. However, if the page you're thinking about redirecting isn't getting any search traffic and/or doesn't have any external links pointing to it, the redirect won't really have an impact on your SEO either way.
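To make that second point concrete, here is a minimal .htaccess sketch, assuming the extra domains are parked on the same Apache server as the main site; every domain name and path below is hypothetical, not taken from the question:
RewriteEngine On
# Hypothetical: a domain about widget repairs redirects to the main-site page on
# the same topic rather than to the home page. Every path on the old domain
# collapses into this single URL.
RewriteCond %{HTTP_HOST} ^(www\.)?widget-repairs\.example$ [NC]
RewriteRule ^(.*)$ http://www.main-site.example/services/widget-repair/ [L,R=301]
# Hypothetical: a second domain about widget parts maps to the parts category.
RewriteCond %{HTTP_HOST} ^(www\.)?widget-parts\.example$ [NC]
RewriteRule ^(.*)$ http://www.main-site.example/shop/widget-parts/ [L,R=301]
If no internal page is a genuinely close match, pointing an old domain at the home page is still fine; it simply passes less topical relevance than a well-matched internal page would.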
Related Questions
-
How to unrank your content by following expert advice [rant]
Hi, as you can probably see from the title, a massive rant is coming up. I must admit I no longer understand SEO and I just wanted to see if you have any ideas about what might be wrong.
So, I read this blog post on Moz https://moz.com/blog/influence-googles-ranking-factor - where the chap is improving the ranking of content that is already ranking reasonably well. I've got two bits of news for you. The good news is - yes, you can change your articles' ranking in an afternoon. The bad news - your articles drop out of the Top 100. I'll give you a bit more detail hoping you can spot what's wrong. Disclaimer - I'm not calling out BS; I'm sure the blogger is a genuine person and he probably has had success implementing this.
The site is in a narrow but popular ecommerce niche where the Top 20 results are taken by various retailers who have simply copy/pasted product descriptions from the manufacturers' websites. The link profile strength is varied, and I'm not making this up: the Top 20 sites range from DA 4 to DA 56. When I saw this I said to myself, it should be fairly easy to rank, because surely the backlink ranking factor doesn't carry as much weight in this niche as it does in other niches. My site is DA 18, which is much better than DA 4. So, even if I make my pages a tiny, tiny bit better than this DA 4 site, I should outrank it, right? Well, I managed to outrank it with really crap content, and I got two high-traffic keywords ranking at #8 or #9 with very little effort. And I wish I'd stayed there, because what followed completely ruined my rankings.
I won't repeat what was written in the blog. If you're interested, go and read it, but I used it as a blueprint and bingo, Google indeed changed my rankings in just a couple of hours. Wait, I lost more than 90 positions!!!! I'm now outside the Top 100. Now even irrelevant sites in Chinese and Russian are in front of me. They don't even sell the products. No, they're in different niches altogether, but they still outrank me. I now know exactly what Alice in Wonderland felt like. I want out please!!!!
Algorithm Updates | GiantsCauseway
-
Canonicals from subdomain to main domain: How much does content relevancy matter? Any backlink impact?
Hi Moz community, I have a slightly different scenario for using canonicals to solve a duplicate content issue on our site. Our subdomain and main domain have similar landing pages on the same topics, with content relevancy of about 50% to 70%. Both pages show up in the SERPs, confusing users and possibly search engines too. We would like to solve this with canonicals on the subdomain pages pointing to the main domain pages, since our intention is to show only the main domain pages in the SERPs. I wonder how Google handles this. Will the canonicals be respected with this level of content relevancy? What happens if they aren't respected - will Google just ignore them, or penalise us for trying this? Thanks
Algorithm Updates | vtmoz
-
What happens when most of the website visitors end up at a "noindex" login page?
Hi all, as most of our users visit the website to log in, we are planning to deindex the login page. Since they won't be able to find it in the SERPs, they will visit the website and log in from there; I just wonder what happens when most visitors end up on the homepage and then browse into a "noindex" page. Obviously it increases bounce rate and exit rate, as they just disappear. Is this going to push us down in the rankings? What other concerns should we check? Thanks
Algorithm Updates | vtmoz
-
Google creating its own content
I am based in Australia, but a US-based search on 'sciatica' shows an awesome answer on the RHS of the SERP: https://www.google.com/search?q=sciatica&oq=sciatica&aqs=chrome.0.69i59.3631j0j7&sourceid=chrome&ie=UTF-8 The download on sciatica is a PDF created by Google. Firstly, is this common in the US? Secondly, any input on where this is heading in terms of rollout would be appreciated. Is Google now creating its own content to publish?
Algorithm Updates | ClaytonJ
-
Duplicate content advice
I'm looking for a little advice. My website has always done rather well on the search engines, although it has never ranked well for my top keywords on my main site, as they are very competitive; it does rank for lots of obscure keywords that contain my top keywords or my top keywords + city/area. We have over 1,600 pages on the main site, most with unique content on them, which is what I attribute our ranking for the obscure keywords to. Content also changes daily on several main pages. Recently we have made some updates to the usability of the site which our users are liking (page views are up by 100%, time on site is up, bounce rate is down by 50%!).
However, it looks like Google did not like the updates... and has started to send us fewer visitors (down by around 25%, across several sites; the sites I did not update (kind of like my control) have been unaffected!). We went through the Panda and Penguin updates unaffected (visitors actually went up!). So I have joined SEOmoz (and I'm loving it, just like McDonald's). I am now going through all my sites and making changes to hopefully improve things above and beyond what we used to do.
However, out of the 1,600 pages, 386 are being flagged as duplicate content (within my own site). Most/half of this is down to the fact that we are a directory-type site split into all major cities in the UK. Cities that don't have listings, or cities that have the same/similar listing (as our users provide services to several cities), are being flagged as duplicate content. Some of the duplicate content is due to dynamic pages that I can correct (i.e. out.php?***** - I will noindex these pages if that's the best way?).
What I would like to know is: are these duplicate content flags going to cause me problems, keeping in mind that the Penguin update did not seem to affect us? If so, what advice would people here offer? I cannot redirect the pages, as they are for individual cities (and are also dynamic = only one physical page using URL rewriting). I can, however, remove links to cities with no listings, although Google already has these pages listed, so I doubt removing the links from my pages and sitemap will affect this. I am not sure if I can post my URLs here, as the sites do have adult content, although it is not porn (we are an Escort Guide/Directory, with some partial nudity). I would love to hear opinions.
Algorithm Updates | jonny512379
-
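The question above asks whether noindexing the dynamic out.php pages is the best way to handle them. Here is a minimal sketch of one common approach, assuming an Apache server with mod_headers enabled; only the out.php filename comes from the question, everything else is hypothetical:
# Send a noindex header with every response from out.php, whatever the query string.
<Files "out.php">
  Header set X-Robots-Tag "noindex"
</Files>
Unlike blocking the URLs in robots.txt, this still lets Google crawl them and see the noindex, so they can drop out of the index rather than sit there blocked but indexed.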
Regarding site URL structure
OK, so there are already some answers to questions similar to this, but mine might be a little more specific. The website is www.bestlifeint.com. Most of our product pages are structured like this: http://www.bestlifeint.com/products-soy.html, for instance. However, I was trying to help the SEO for certain pages (namely two) with the URLs, and had some success with another page, our Soy Meal Replacement. I changed the URL of this page from www.bestlifeint.com/products-meal to www.bestlifeint.com/Soy-Amazing-Meal-Replacement-with-Omega-3s.html (notice I dropped the /product part of the URL and made it more SEO friendly; the old page was something like www.bestlifeint.com/products-meal).
The issue is with this new page and another page I have changed, http://www.bestlifeint.com/Whey-Milk-Alternative.html, where I also dropped the "/product" part of the URL even though they are both products. The new Meal Replacement page used to be ranked around 6th on Google at the beginning of the month and is now around 48th or something. The new "whey milk" page (http://www.bestlifeint.com/Whey-Milk-Alternative.html) is ranked around 45th or something for "Whey Milk", when the old page, "products/wheyrice.html", was ranked around 18th or so at the beginning of the month.
Have I hurt these two pages by not following the www.bestlifeint.com/product.... site structure and focusing more on URL SEO? Both NEW pages receive all the link juice inside the website, so they are the live pages (you cannot get to the old pages), and recently, seeing that Google has pretty much dropped the old pages in the search rankings, I have deleted those two pages. Do I just need to wait and see? According to my research we should rank much higher for "Whey Milk" - we should be on the first page, according to Google's own statements about searchers finding good, relevant material. Any advice moving forward? Thanks, Brian
Algorithm Updates | SammisBest
-
Google Dropped 3,000+ Pages due to 301 Moved !! Freaking Out !!
We may be the only people stupid enough to accidentally prevent the Google bot from indexing our site. In our htaccess file someone recently wrote the following statement:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^mysite.com$ [NC]
RewriteRule ^(.*)$ http://www.mysite.com/$1 [L,R=301]
It's almost funny, because it was a rewrite that rewrites back to itself... We found in Webmaster Tools that the site was not able to be indexed by the Google bot due to it not detecting the robots.txt file. We didn't have one before, as we didn't really have much that needed to be excluded; however, we have added one now, for kicks really. The robots.txt file, though, was never the problem with regard to the bot accessing the site. Rather, it was the rewrite statement above that was blocking it.
We tested the site, not knowing what the deal was, so we went under Webmaster Tools, then Health, and selected "Fetch as Google" to fetch the website. This was our way of manually requesting the site be re-indexed so we could see what was happening. After doing so we clicked on Status and it provided the following:
HTTP/1.1 301 Moved Permanently
Content-Length: 250
Content-Type: text/html
Location: http://www.mysite.com/
Server: Microsoft-IIS/7.5
MicrosoftOfficeWebServer: 5.0_Pub
MS-Author-Via: MS-FP/4.0
X-Powered-By: ASP.NET
Date: Wed, 22 Aug 2012 02:27:49 GMT
Connection: close
<title>301 Moved Permanently</title> Moved Permanently The document has moved here.
We changed the screwed-up rewrite mistake in the htaccess file that found its way in there, but now our issue is that all of our pages have been severely penalized with regard to where they now rank compared to just before the incident. We are essentially freaking out because we don't know the real-time consequences of this, or if or how long it will take for the affected pages to regain their prior ranks. Typical pages went down anywhere between 9-40 positions on high-volume search terms. So, to say the least, our company is already discussing the possibility of fairly large layoffs based on what we anticipate with regard to the drop in traffic. This sucks because this is people's lives, but then again a business must make money, and if you sell less you have to cut the overhead, and the easiest one is payroll. I'm on a team with three other people that I work with to keep the SEO side up to snuff as much as we can, and we sell high-ticket items, so the potential effects if Google doesn't restore matters could be significant.
My question is: what would you guys do? Is there any way we can contact Google about such a matter? If there is, I've never seen such a thing. I'm sure the pages that are missing from the index now might make their way back in, but what will their rank look like next time, and with that type of rewrite, has it permanently affected every page site-wide, including those that are still in the index but severely affected by the incident? Would love to see things bounce back quickly, but I don't know what to expect and neither do my counterparts. Thanks for any speculation, suggestions or insights of any kind!!!
Algorithm Updates | David_C
-
301 redirect question
I have an employer who owns a retail site, and his category URLs are horrible. So I am suggesting that he create a new page with a pretty URL and 301 redirect the old page to the new page. I am suggesting this because it should help increase CTR for the targeted keyword and help him rank higher for the term. He is apprehensive because he thinks this will cause him to drop in the rankings. Does anybody know of any resources, or have any past experiences, that will back up my suggestion - or his, for that matter?
Algorithm Updates | Cyle
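To illustrate the suggestion in the question above, here is a minimal .htaccess sketch, assuming an Apache server; both the old and the new URLs are hypothetical, not taken from the question:
# Hypothetical: an ugly category URL 301s to a new, keyword-friendly one, so
# existing links and rankings follow the redirect to the pretty URL.
Redirect 301 /catalog/cat_17.html /womens-running-shoes/
# If the old category URLs carry query strings, mod_alias can't match those;
# a mod_rewrite rule can (hypothetical parameter name and value).
RewriteEngine On
RewriteCond %{QUERY_STRING} ^cat=17$ [NC]
RewriteRule ^catalog\.php$ /womens-running-shoes/? [L,R=301]
Done this way, the old URL generally passes most of its link value to the new one, which is why a move to cleaner URLs usually helps rather than hurts, provided the redirects stay in place.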