Is it true that Google will not penalize duplicated content found in UL and LI tags?
-
I've read in a few places now that if you absolutely have to use a key term several times in a piece of copy, then it is preferable to put it in li and ul tags, as Google will not penalise an excessive density of keywords found in these tags. Does anyone know if there is any truth in this?
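To make the question concrete, the sort of markup those articles are talking about would look something like this (a made-up illustration using "infographic" as the key term, not markup I'm actually planning to publish):

```html
<!-- Hypothetical example of the pattern the claim describes:
     the same key term repeated across several <li> items. -->
<h2>Infographic services</h2>
<ul>
  <li>Custom infographic design</li>
  <li>Infographic templates for social media</li>
  <li>Animated infographic production</li>
</ul>
```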
-
lol... thanks for that report.
Should we go back and read it for the laughs?
-
I just read several more articles on that site. Overall junk. I would find a new blog to get your info from.
-
**In that case you can use “li” and “ul” tag, moreover Google doesn’t penalize for repeating words under these tags.**
ha ha... that is B.S.
The author of that does not know how Google handles `<ul>` and `<li>` tags.
I can imagine Matt Cutts telling people ... "It's OK to stuff the `<li>` tag, guys."
-
Thanks for the response,
I found it here http://www.dailytechpost.com/index.php/8-best-tips-for-css-for-seo/#comment-69311, amongst several other places. I'm not into stuffing keywords, and I'm fully aware that writing natural prose is the way to go. It was more a reference for cases where there is coincidentally an excessive number of keywords, such as when using technical terms which cannot be substituted and form part of every element of a text. Or perhaps when you are writing about a concept and natural prose feels a little repetitive, such as writing about infographics.
-
Maybe they are not today. I'm not too sure about this; like the others, I'm asking myself who told you this.
I do recommend you not try to fool the big G. Duplicate content is not-so-valuable content in the best case. You should spend your effort building great content instead of trying to duplicate.
Because even if they are not penalizing it right now, they probably will one day.
From my experience, duplicate is duplicate anywhere you put it!
-
Exactly. **Content is written for the visitors, not the search engines.**
If you are familiar with the subject and are writing naturally, the content will do just fine with all of the search engines and, more importantly, your visitors.
-
Where did you hear this? That makes no sense, and I have never heard anything like it.
And do not stuff keywords or even try to see if you can get away with it. That's poor optimization and does not read well for users. Write and design for your users and you should be fine.
-
I have never heard that `<ul>` and `<li>` tags are safe for anything.
Don't bet on the behavior of Google.
Also, I don't pay any attention to the number of times that I use a word in copy. None. I try to write naturally without regard for search engines.
-