SEOmoz Internal Duplicate Content & Possible Coding Issues
-
SEOmoz Community!
I have a relatively complicated SEO issue that has me pretty stumped...
First and foremost, I'd appreciate any suggestions that you all may have. I'll be the first to admit that I am not an SEO expert (though I am trying to be). Most of my expertise is with PPC. But that's beside the point.
Now, the issues I am having:
- I have two sites: http://www.federalautoloan.com/Default.aspx and http://www.federalmortgageservices.com/Default.aspx
A lot of our SEO efforts thus far have worked well for Federal Auto Loan, and we are seeing positive impacts from them. However, we recently did a server transfer (which may or may not be related), and since then a significant number of INTERNAL duplicate-content pages have appeared in the SEOmoz crawl. The number is around 20+ for both Federal Auto Loan and Federal Mortgage Services (see attachments).
I've tried to include as much as I can via the attachments. What you will see is all of the content pages (articles) with duplicate-content issues, along with a screen capture of the articles being flagged as duplicates for the pages:
-
Car Financing How It Works
-
A Home Loan is Possible with Bad Credit
(Please let me know if you could use more examples)
At first I assumed it was simply an issue with SEOmoz; however, I am now worried it is impacting my sites (I wasn't originally, because Federal Auto Loan has great Quality Scores and is climbing in organic presence daily). That said, we recently launched Federal Mortgage Services for PPC, and my Quality Scores are relatively poor. In fact, we are not even ranking (scratch that, not even showing that we have content) for "mortgage refinance," even though we have unique, original content built specifically around "mortgage refinance" keywords.
All things considered, Federal Mortgage Services should be tighter in the SEO department than Federal Auto Loan... but it is clearly not!
I could really use some significant help here...
- Both of our sites have a number of access points:
http://www.federalautoloan.com/Default.aspx and http://www.federalmortgageservices.com/Default.aspx are both the designated home pages. And I have rel=canonical tags stating such.
However, my sites can also be reached via the following:
http://www.federalautoloan.com
http://www.federalautoloan.com/default.aspx
http://www.federalmortgageservices.com
http://www.federalmortgageservices.com/default.aspx
Should I incorporate code that redirects traffic as well? Or is it fine with just the rel=canonical tags?
I apologize for such a long post, but I wanted to include as much as possible up-front. If you have any further questions... I'll be happy to include more details.
Thank you all in advance for the help! I greatly appreciate it!
-
Hey Cyrus,
Thank you very much for the detailed response!
-
Hi Colt,
Looks to me like you're getting duplicate content errors because your page templates are so large that they trip the SEOmoz duplicate content filter, which fires when more than 95% of the code on two pages is similar.
For example, take a look at these 2 URLs.
http://www.federalautoloan.com/Why-Shopping-for-an-Auto-Loan-is-Good.aspx
http://www.federalautoloan.com/Regarding-Dealer-Financing.aspx
With the gazillions of links at the bottom of the two pages, they share 98% similar code. (You can check it yourself with a duplicate content comparison tool.) The good news is the TEXT content similarity is less than 40%.
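If you want to reproduce this kind of check yourself, here's a rough sketch using Python's standard-library `difflib`. The two "pages" below are hypothetical strings standing in for fetched HTML; the point is just to show how a shared boilerplate template inflates full-source similarity even when the article text differs:

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Return a 0-1 similarity ratio between two strings."""
    return SequenceMatcher(None, a, b).ratio()

# Hypothetical pages sharing a large boilerplate template (e.g. a huge footer link block)
template = "<html><head><title>t</title></head><body><nav>" + "link " * 200 + "</nav>"
page_a = template + "<p>Unique article text about auto loans.</p></body></html>"
page_b = template + "<p>Different article text about dealer financing.</p></body></html>"

# Full-source comparison: dominated by the shared template
print(f"Full-source similarity: {similarity(page_a, page_b):.0%}")

# Text-only comparison: shows how different the articles really are
text_a = "Unique article text about auto loans."
text_b = "Different article text about dealer financing."
print(f"Text-only similarity: {similarity(text_a, text_b):.0%}")
```

The gap between the two numbers is exactly the effect Cyrus describes: trimming the template (fewer footer links) shrinks the shared code and pulls the full-source ratio back under the filter's threshold.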
1. Google is more sophisticated than Moz, but it would still be a good idea to remove some of those links and group them into categories. If you could get those 100 or so links down to 20, that would be closer to ideal.
2. Just a recommendation: most of your text is in a scroll box. I'd reformat the page so that all of the text is visible without the box. I'm not sure whether this is hurting you, but it seems contrary to good user experience, so I'd be inclined to think Google doesn't look too favorably on it.
3. I noticed you blocked a lot of files in your robots.txt, including your CSS. Unless you have a very specific reason for keeping Google out of those files, I'd let them be crawled, as Google uses CSS to render your page and see what content is above and below the fold.
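As a minimal sketch, a robots.txt along these lines would let crawlers fetch CSS and JavaScript while still fencing off a private area (the /admin/ path here is purely hypothetical; keep whatever genuinely sensitive paths you have):

```text
User-agent: *
# Let crawlers fetch the assets needed to render pages
Allow: /*.css$
Allow: /*.js$
# Hypothetical example of a directory worth keeping blocked
Disallow: /admin/
```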
4. Best practice is to redirect non-www to www versions of your site (or vice versa). If you can't do this, a canonical tag will do just as well. But how about also redirecting everything to the version WITHOUT the /Default.aspx? That would look cleaner in search results.
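For what it's worth, on an IIS/ASP.NET site both redirects can be expressed in web.config. This is a sketch, assuming the IIS URL Rewrite module is installed; the rule names are arbitrary and the federalautoloan.com host is just illustrative (the same pattern applies to the mortgage site):

```xml
<system.webServer>
  <rewrite>
    <rules>
      <!-- 301 non-www requests to the www host -->
      <rule name="NonWwwToWww" stopProcessing="true">
        <match url="(.*)" />
        <conditions>
          <add input="{HTTP_HOST}" pattern="^federalautoloan\.com$" />
        </conditions>
        <action type="Redirect" url="http://www.federalautoloan.com/{R:1}"
                redirectType="Permanent" />
      </rule>
      <!-- 301 /Default.aspx (any casing) to the bare root -->
      <rule name="DefaultAspxToRoot" stopProcessing="true">
        <match url="^default\.aspx$" ignoreCase="true" />
        <action type="Redirect" url="/" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

With rules like these in place, the canonical tags become a backstop rather than the only signal consolidating those access points.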
Hope this helps. Best of luck with your SEO!
-
Hello Colt.
You have the joys of a Microsoft webserver.
That means all of these URLs work - and many more:
http://www.federalmortgageservices.com/A-Home-Loan-Is-Possible-with-Bad-Credit.aspx
http://www.federalmortgageservices.com/A-Home-Loan-Is-Possible-with-Bad-CRedit.aspx
http://www.federalmortgageservices.com/A-Home-Loan-Is-Possible-with-Bad-CREdit.aspx
http://www.federalmortgageservices.com/A-Home-Loan-Is-Possible-with-Bad-CREDit.aspx
It does exactly the same thing if you remove the www.
- duplicate content issues.
Also, if you do this:
http://www.federalmortgageservices.com/A-Home-Loan-Is-Possible-with-Bad-Credit.aspxZ
- the server returns a 200 response code and serves up the front page.
I couldn't find any way to make your server return a 404.
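On IIS, one common mitigation for the casing variants is a rewrite rule that 301-redirects any URL containing capital letters to its lowercase form, collapsing all those duplicates into one canonical URL. A sketch, again assuming the URL Rewrite module is available:

```xml
<system.webServer>
  <rewrite>
    <rules>
      <rule name="LowercaseUrls" stopProcessing="true">
        <!-- ignoreCase="false" so the pattern only matches URLs containing capitals -->
        <match url="[A-Z]" ignoreCase="false" />
        <action type="Redirect" url="{ToLower:{URL}}" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

Note that this would also redirect the site's existing Title-Case article URLs, so you'd want to standardize on lowercase URLs everywhere (or scope the rule more narrowly) before enabling it. The missing-404 behavior is a separate application-level issue worth fixing on its own.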
-
As an aside: please feel free to extend any other SEO suggestions you may have my way! I am doing my best to learn the SEO trade... and ANY advice is appreciated.