Is it true that Google will not penalize duplicated content found in UL and LI tags?
-
I've read in a few places now that if you absolutely have to use a key term several times in a piece of copy, then it is preferable to put it in li and ul tags, as Google will not penalise an excessive density of keywords found in these tags. Does anyone know if there is any truth in this?
-
lol... thanks for that report.
Should we go back and read it for the laughs?
-
I just read several more articles on that site. Overall junk. I would find a new blog to get your info from.
-
**In that case you can use "li" and "ul" tag, moreover Google doesn't penalize for repeating words under these tags.**
Ha ha... that is B.S. The author of that does not know how Google handles ul and li tags.
I can imagine Matt Cutts telling people... "It's OK to stuff the li tag, guys."
-
Thanks for the response,
I've found it here http://www.dailytechpost.com/index.php/8-best-tips-for-css-for-seo/#comment-69311 amongst several other places. I'm not into stuffing keywords and am fully aware that writing natural prose is the way to go; it was more a question about cases where there is an excessive number of keywords coincidentally, such as when using technical terms which cannot be substituted and form part of every element of a text, or perhaps when you are writing about a concept and natural prose starts to feel a little repetitive, such as writing about infographics.
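For illustration, here is a hypothetical list of the kind I mean, where the key term repeats simply because it is part of each item's name:

```html
<!-- Hypothetical example: "infographics" repeats because it is part of each
     item's name, not because it was added for search engines. -->
<ul>
  <li>Static infographics</li>
  <li>Interactive infographics</li>
  <li>Animated infographics</li>
  <li>Video infographics</li>
</ul>
```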
-
Maybe they are not today; I'm not too sure about this, and like the others I'm asking myself who told you this.
I recommend you do not try to fool the big G. Duplicate content is, in the best case, not very valuable content. You should put your effort into building great content instead of trying to duplicate.
Because even if they are not penalising it right now, they probably will one day.
From my experience, duplicate is duplicate anywhere you put it!
-
Exactly. **Content is written for the visitors, not the search engines.**
If you are familiar with the subject and are writing naturally, the content will do just fine with all of the search engines, and more importantly your visitors.
-
Where did you hear this? That makes no sense, and I have never heard anything like it.
And do not stuff keywords, or even try to see if you can get away with it. That's poor optimization and does not read well for users. Write and design for your users and you should be fine.
-
I have never heard that ul and li tags are safe for anything.
Don't bet on the behavior of Google.
Also, I don't pay any attention to the number of times that I use a word in copy. None. I try to write naturally without regard for search engines.
-
Related Questions
-
Duplicate content in Shopify reported by Moz
On-Page Optimization | | ycnetpro101
According to the Moz crawl report, there are hundreds of duplicate pages in our Shopify store ewatchsale.com. The main duplicate pages are:
https://ewatchsale.com/collections/seiko-watches?page=2
https://ewatchsale.com/collections/all/brand_seiko
(the canonical page should be https://ewatchsale.com/collections/seiko-watches)
https://ewatchsale.com/collections/seiko-watches/gender_mens
(the canonical page should be https://ewatchsale.com/collections/seiko-watches/mens-watches)
Also, I want to exclude indexing of page URLs with "filter parameters" like https://ewatchsale.com/collections/seiko-watches/color_black+mens-watches+price_us-100-200. Shopify advised we can't access our robots.txt file.
How can we exclude search engine crawling of the page URLs with filter names?
How can we access the robots.txt file?
How can we add canonical code to the preferred collection pages? Which templates, and what code should we add? Thanks for your advice in advance!
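For reference, a minimal sketch of what a canonical tag looks like; this is generic HTML for a page's head, not Shopify-specific template code, using the preferred collection URL from the question:

```html
<!-- Illustration only: a canonical link element placed in the <head> of the
     duplicate pages, pointing at the preferred collection URL. How this gets
     into a Shopify theme depends on your template setup. -->
<link rel="canonical" href="https://ewatchsale.com/collections/seiko-watches">
```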
-
How to organise subpages for good SEO content without duplicate text?
On-Page Optimization | | Lukas-ST
We are working on many subpages for our services. We have original content for each page; however, there are a few blocks of text which we always need to duplicate, like the contact sales window, the "why choose us" window, supported files, etc. What's the best way to do this so it is not considered duplicated text? Should we redirect it, or add it as a picture and always change the name of the picture? Thank you, Lukas
-
Duplicate Title Tags/Meta Tags for Website with Multiple Locations
On-Page Optimization | | MainstreamMktg
I currently have an insurance website that has over 40 offices in Ontario. The site also provides online quoting.
In some of our programming, we have implemented variables in the URLs that allow the quotes to go to the specific city offices. The issue I am having is that the quote itself is the same quote form (same title, same meta) because it's technically one page on the website. We did it this way to avoid having to update 40 forms if a field on the form were to change. Is there any way I can relieve my site of this duplicate title tag/meta tag issue? Any insight would be really appreciated - thanks so much!
-
How to remove duplicate content issues for thin pages (containing "Oops, no results found")
On-Page Optimization | | surabhi6
In this scenario we have multiple different URLs, but the page renders the same content (containing the Oops message), which is why the duplicate content issue arises. As soon as content for these URLs is available, the duplicate issue for those pages will be removed. So we want to remove the duplicate issue, not the pages or the page URLs.
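One common way to handle thin placeholder pages like this (a sketch, not a definitive fix) is to keep the URLs live but ask search engines not to index them until real content exists:

```html
<!-- Illustration only: emitted in the <head> of "Oops, no results found" pages
     until real content is available; the page stays reachable but is kept out
     of the index. -->
<meta name="robots" content="noindex, follow">
```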
-
Not sure if I need to be concerned with duplicate content plus too many links
On-Page Optimization | | Niamh2
Someone else supports this site in terms of making changes, so I want to make sure that I know what I am talking about before I speak to them about changes. We seem to have a lot of duplicate content and duplicate titles. This is an example of a duplicate: http://www.commonwealthcontractors.com/tag/big-data-scientists/. Do I need to get things changed? The other problem that crops up on reports is too many on-page links. I am going to get shot of the block of tags but need to keep the news. Is there much else I can do? Many thanks.
-
Duplicate content when using "visibility classes" in responsive design layouts - an SEO problem?
On-Page Optimization | | inlinear
I have text in the right column of my responsive layout which shows up below the principal content on small devices. To do this I use visibility classes on DIVs. So I have a DIV with unique text that is visible only on large screen sizes, and I copied the same text into another DIV which shows up only on small devices, while the first DIV is hidden at that point. Technically I have the same text twice on my page. Might this be detected as duplicate content or spam? I'm concerned because hidden on-page text (as in expandable/collapsible text blocks) is read by bots, and in my case they will detect it twice. Does anybody have experience with this issue?
Best, Holger
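A minimal sketch of an alternative that avoids having the text twice in the HTML at all: keep one copy and let CSS move it into the right column on large screens (class names here are invented for the example):

```html
<!-- Hypothetical sketch: one copy of the side text, repositioned with CSS
     instead of being duplicated into a second, hidden DIV. -->
<div class="main-content">Principal content…</div>
<div class="side-note">Unique side-column text…</div>

<style>
  /* Large screens: show the note as a right column next to the content */
  @media (min-width: 769px) {
    .main-content { float: left;  width: 70%; }
    .side-note    { float: right; width: 28%; }
  }
  /* Small screens: no floats apply, so the note simply stacks below the content */
</style>
```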
-
Duplicate title tag
On-Page Optimization | | JohnHuynh
Hello,
My site has problems with duplicate titles; they were reported in Google Webmaster Tools. For example:
/extra-services/car-pick-up-service-146.html (1)
/extra-services/transportation--car-rails--146.html (2)
According to my sitemap the first URL (1) is right, but the second URL is wrong; I don't know why it occurs.
-
Duplicate Content
On-Page Optimization | | EcomLkwd
Part of a site I am working on features many different bags in all thicknesses, colors, and sizes. I'm getting an error when some pages have different content, like different thicknesses. The only difference between the pages is a single digit - but in trash bags that makes it a whole different product! I can't do a canonical because it's not the same product. For example: http://www.plasticplace.net/index.php?file=productdetail&iprod_id=274 and http://www.plasticplace.net/index.php?file=productdetail&iprod_id=268 Any ideas?