Duplicate Content
-
I have a question about duplicate content (auto-generated text).
Will Google consider page 1 and page 2 as duplicate content?

Page 1:
You will find all the Amazon coupon codes and Amazon discount codes currently available listed below, if Amazon doesn't currently have any coupons available you may want to check for Amazon deals or find related coupon codes or promotional codes for similar online stores selling the same products as amazon.
We always have the latest coupon codes for Amazon which are updated daily, so if you can't find any Amazon coupons here then you won't find them anywhere else.
Shop online today at Amazon, and take advantage of the coupon codes that Amazon currently has on offer, these coupon codes, offer codes, and promo codes for Amazon may never be available again.

Page 2:
You will find all the Target coupon codes and Target discount codes currently available listed below, if Target doesn't currently have any coupons available you may want to check for Target deals or find related coupon codes or promotional codes for similar online stores selling the same products as Target.
We always have the latest coupon codes for Target which are updated daily, so if you can't find any Target coupons here then you won't find them anywhere else.
Shop online today at Target, and take advantage of the coupon codes that Target currently has on offer, these coupon codes, offer codes, and promo codes for Target may never be available again.
-
Sent you a PM
-
Hi,
Thanks so much!
Is it possible to get in touch with you by email or Skype?
-
"Will each page with 300 unique words be fine in Google's eyes?"

If you have 300 words on each page, as long as it's useful content that people are sticking around to read, then you should be okay. Your end goal should be to provide value to your visitors. If 300 words is plenty of content for the subject of your pages, then you're okay. If you have a blog about quantum physics and you only write 300 words per page... you might not be so okay anymore.

"After the text is removed, is there any chance to recover from Panda?"

If your site is penalized by Panda, and you make adjustments to fix the issues you were once penalized for, yes, you can certainly recover. It's possible that duplicate content isn't your only issue, and there may be more to fix. Again, this is assuming you're penalized by Panda. I found a really good post about Panda recovery a couple weeks ago. Lucky for you, I bookmarked it! http://www.ventureharbour.com/panda-recovery-a-guide-to-recovering-googles-panda-update/
"What about the page title and page meta description?"

I wouldn't personally write my titles and meta descriptions like that. It's probably a good idea to vary them and make them a bit more unique from one another. If I'm being totally honest, I think your example title tags might work for Google. That would be up to you, though, if you're willing to take that chance. If everything else on your site is fantastic, and your only issue is those types of title tags, I really don't think Google would give you a problem. Either way, the best thing to do (obviously) is make them more unique. I'm not a personal fan of them being too similar, but I have seen it done like that on a site before and the pages ranked just fine (they were pretty low-competition keywords, though).

Edit: This is the only question I'm not that sure about... your examples might be okay, but I don't want to give you bad advice.
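If it helps, here is a hypothetical sketch (my own illustration, not an official recommendation) of one way to make templated titles and descriptions less identical: fold store-specific details into each one, so pages differ by more than the brand name. The store data below is invented for illustration.

```python
# Hypothetical example: build titles and descriptions from store-specific
# facts so each page's tags differ by more than the store name.
# The coupon counts and categories here are made up for illustration.
stores = {
    "Amazon": {"coupons": 12, "top_category": "electronics"},
    "Target": {"coupons": 7, "top_category": "home goods"},
}

for name, info in stores.items():
    title = f"{name} Coupons: {info['coupons']} Codes for March 2014"
    description = (f"Save on {info['top_category']} at {name} with "
                   f"{info['coupons']} verified coupon codes, updated daily.")
    print(title)
    print(description)
```

The point is only that each tag carries a detail unique to that store, rather than being a find-and-replace of the same sentence.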
This is my second question on Moz,
and you answered both of them.
Hooray! I hope I'm helping you out. I've made it a goal of mine to make it into the top 50 in Moz Points before the end of 2014.
-
Thanks Philip,
So I need to get rid of this kind of text (it was an example).
Will each page with 300 unique words be fine in Google's eyes? After the text is removed, is there any chance to recover from Panda?
What about the page title and page meta description?
Amazon Coupons and Discount Codes for March 2014
Latest Amazon coupons, promo codes and discounts for you to save! Last updated: March 2014.

Target Coupons and Discount Codes for March 2014
Latest Target coupons, promo codes and discounts for you to save! Last updated: March 2014.

Is this still duplicate content?
This is my second question on Moz,
and you answered both of them.
-
The answer is a big, fat, juicy YES. That is the epitome of duplicate content.
You need to write content that is completely unique from the other page. You cannot trick Google; the Panda will bite you hard.
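To see just how close those two pages are, here is a rough sketch (my own illustration, using only Python's standard library; the 0.8 "near-duplicate" threshold is an assumption for illustration, not a figure Google publishes) that compares trimmed versions of the two example passages:

```python
from difflib import SequenceMatcher

# Trimmed versions of the two example pages: page 2 is the exact same
# template with only the store name swapped.
page1 = ("You will find all the Amazon coupon codes and Amazon discount "
         "codes currently available listed below. We always have the latest "
         "coupon codes for Amazon which are updated daily.")
page2 = page1.replace("Amazon", "Target")

# SequenceMatcher.ratio() returns a 0..1 similarity score; the 0.8 line
# drawn below is an illustrative assumption, not an official threshold.
ratio = SequenceMatcher(None, page1, page2).ratio()
print(f"similarity: {ratio:.2f}")
if ratio > 0.8:
    print("near-duplicate: rewrite one of the pages")
```

Because only the six-letter brand name changes, the pages score very close to 1.0, which is exactly why swapping "Amazon" for "Target" doesn't make the second page unique.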