Duplicate content?
-
I am not understanding this - I see a duplicate content warning. When I look into it, I see these two URLs:
http://search-engine-upgrade.com
http://search-engine-upgrade.com/default.asp
(NOT a blog)
-
Thanks to both you and Keri - Even though my hype barks loudly of SEO, I am a web designer who has had great luck with organic SEO via good page construction and my copywriting skills. I don't really "tune" others' sites - I rebuild them, usually in hand-coded classic ASP. Ergo, I am not actually a classic SEO service like many here.
I am finding that I am facing my next tech upgrade here with off-page issues like this one.
I got rid of the htaccess and added the above code in the head. I have never used htaccess like this before and use a format tailored to the server this site resides on; adapting it may have caused the gaffe.
Again, many thanks.
-
When I go to http://search-engine-upgrade.com/default.asp it's not removing the default.asp or rewriting the URL - just wanted to let you know, as you might want to double-check the htaccess.
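For anyone who wants to verify a redirect from a script rather than the browser, here is a small Python sketch (standard library only, a hypothetical helper not tied to this particular server) that fetches just the response headers:

```python
import http.client

def fetch_head(host, path="/"):
    """Request only the headers for a URL and return the status code
    and the Location header (None unless the server redirects)."""
    conn = http.client.HTTPConnection(host, timeout=10)
    try:
        conn.request("HEAD", path)
        resp = conn.getresponse()
        return resp.status, resp.getheader("Location")
    finally:
        conn.close()
```

If the 301 is working, /default.asp should report status 301 with a Location pointing at the bare domain; a 200 on both URLs means the duplicate is still being served.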
-
I had the server admin do a 301 on www.search-engine-upgrade.com to the non www, but I have to admit, I had no idea I had to do similar with the default page....Jeeze!
-
Well shiver me timbers. I never figgered your homepage could compete with itself! I guess it's time for me to stop depending on my old-school wordsmithery (old school here only goes back to the mid 90's) and get a new pocket protector ,-]
Thanks for the help & prompt reply. I have already deployed the htaccess.
-
A 301 permanent redirect (as ninjamarketer mentioned) will work. If that's too technical, you can also look at adding a canonical link to your default.asp.
The canonical link will not remove the "default.asp", but it will communicate to Google and the other search engines that the correct URL for this content is http://search-engine-upgrade.com/
In the header, put:
<link rel="canonical" href="http://search-engine-upgrade.com/" />
Here is Google's page on this:
http://www.google.com/support/webmasters/bin/answer.py?answer=139394
You may also want to check your server settings - sometimes the server will set a default homepage or can be configured to do the 301 for you.
:>) good luck!
dan
-
I checked the headers for both URLs and they are indeed returning a 200 OK status code, which means this is a page canonicalization issue.
Canonicalization issues happen when multiple URLs serve the same page: www vs. non-www, or a directory URL vs. its default document (index.php, default.asp, and so on).
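For illustration, the normalization these redirects aim for can be sketched in Python (a hypothetical helper for reasoning about the rules, not code that runs on the site):

```python
from urllib.parse import urlsplit, urlunsplit

# Default documents that should collapse to the bare directory URL.
DEFAULT_DOCS = {"default.asp", "index.php", "index.html"}

def canonical_url(url):
    """Return the canonical form of a URL: strip a leading 'www.'
    from the host and drop a trailing default-document file name."""
    scheme, netloc, path, query, fragment = urlsplit(url)
    if netloc.startswith("www."):
        netloc = netloc[len("www."):]
    # Collapse /default.asp (etc.) to the directory it lives in.
    head, _, tail = path.rpartition("/")
    if tail.lower() in DEFAULT_DOCS:
        path = head + "/"
    if path == "":
        path = "/"
    return urlunsplit((scheme, netloc, path, query, fragment))
```

Every variant (www, non-www, with or without the default document) maps to a single canonical URL - which is exactly what the 301 rules below enforce server-side.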
The solution is to 301 redirect http://search-engine-upgrade.com/default.asp to
http://search-engine-upgrade.com/ using the htaccess file.
Add this line to your htaccess file:
Redirect 301 /default.asp http://search-engine-upgrade.com/
Or you can use the following rewrite rules (dots escaped, and pointing at the non-www host since that is the canonical version you 301'd to):
RewriteEngine On
RewriteBase /
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /default\.asp\ HTTP/
RewriteRule ^default\.asp$ http://search-engine-upgrade.com/ [R=301,L]
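One thing worth checking if the htaccess has no visible effect: .htaccess files only work on Apache, and a classic ASP site is usually hosted on IIS, which ignores them entirely. If that's the case here (an assumption about this server, which would also need the URL Rewrite module installed), a roughly equivalent rule would go in web.config:

```xml
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <!-- 301 /default.asp to the bare root URL -->
        <rule name="RemoveDefaultAsp" stopProcessing="true">
          <match url="^default\.asp$" />
          <action type="Redirect" url="/" redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```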