Is it true that Google will not penalize duplicated content found in UL and LI tags?
-
I've read in a few places now that if you absolutely have to use a key term several times in a piece of copy, then it is preferable to use li and ul tags, as google will not penalise excessive density of keywords found in these tags. Does anyone know if there is any truth in this?
-
lol... thanks for that report.
Should we go back and read for the laughs?
-
I just read several more articles on that site. Overall, junk. I would find a new blog to get your info from.
-
**In that case you can use “li” and “ul” tag, moreover Google doesn’t penalize for repeating words under these tags.**
ha ha... that is B.S.
The author of that does not know how Google handles <ul> and <li> tags.
I can imagine Matt Cutts telling people... "It's OK to stuff the <li> tag, guys."
-
Thanks for the response,
I found it here http://www.dailytechpost.com/index.php/8-best-tips-for-css-for-seo/#comment-69311 amongst several other places. I'm not into stuffing keywords, and I'm fully aware that writing natural prose is the way to go. It was more a reference for cases where there is an excessive amount of keywords coincidentally, such as when using technical terms which cannot be substituted and form part of every element of a text. Or perhaps when you are talking about a concept and natural prose feels a little repetitive, such as writing about infographics.
-
Maybe they are not today. I'm not too sure about this; like the others, I'm asking myself who told you this.
I recommend you do not try to fool the big G. Duplicate content is, in the best case, not-so-valuable content. You should put your effort into building great content instead of trying to duplicate.
Because even if they are not penalising it right now, they probably will one day.
From my experience, duplicate is duplicate anywhere you put it!
-
Exactly. **Content is written for the visitors, not the search engines.**
If you are familiar with the subject and are writing naturally, the content will do just fine with all of the search engines, and more importantly your visitors.
-
Where did you hear this? That makes no sense, and I have never heard anything like it.
And do not stuff keywords, or even try to see if you can get away with it. That's poor optimization and does not look good to users. Write and design for your users and you should be fine.
-
I have never heard that <ul> and <li> tags are safe for anything.
Don't bet on the behavior of Google.
Also, I don't pay any attention to the number of times that I use a word in copy. None. I try to write naturally without regard for search engines.
-
Related Questions
-
Tag Clouds in Google Despite Canonical Links for Single Tags/Articles
I am frustrated to see a lot of tag clouds in Google even though I programmed my tagged pages to display a canonical link to the linked article if there is only one result for the tag. The goal is to make sure that the article, which is of better quality than the tag page, ends up in Google without a bunch of thin tag pages getting in there. For instance, this article should be in Google and this tag should not be, because that tag has a canonical URL pointing to that article.
I do not have a lot of experience with tag cloud SEO because I prefer to limit such pages to categories, but I have found tag clouds to be important for aggregating information about specific issues, people, or places that are not already a site category. Some tags I have used to power social media pages that update automatically from RSS feeds for their related tag archives, which is quite useful for pages like that.
Should I start using meta noindex for those instead of rel canonical? I have already done that for author profiles, because author profiles get a lot of on-site links compared to individual articles, as my gridviews use JavaScript for paging. The same is true for the tags: if a tag is used in 30 articles it will have links from 30 articles, but if those articles are not in the latest 20 for that tag, only the latest 20 will have links back from the tag archive. I also suspect having a lot of tag pages with little content negatively impacts my indexing rate. I will see a number of recent tag pages added before new articles.
On-Page Optimization | CopBlaster.com
-
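For reference, the meta noindex option raised in that question is just a robots meta tag in the head of each thin tag page (a generic HTML sketch, not tied to any particular CMS):

```html
<head>
  <!-- Keep the thin tag page out of the index, but let crawlers follow its links -->
  <meta name="robots" content="noindex, follow">
</head>
```

Unlike rel=canonical, which Google treats as a hint it may ignore when pages aren't near-duplicates, noindex is a directive, so it is a more reliable way to keep thin tag archives out of the index.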
When making content pages to a specific page; should you index it straight away in GSC or let Google crawl it naturally?
On-Page Optimization | Jacksons_Fencing
-
Duplicate Content for Event Pages
Hi folks, I have event pages for specific training courses running on certain dates. The problem I have is that Moz indicates 1,040 duplicate content issues because I'm serving pages like this: https://purplegriffon.com/event/2521/mop-practitioner. I'm not sure how best to go about resolving this as, of course, although each event is unique in terms of its start date, the courses and locations could be identical. Will Google penalise us for these types of pages, or will they even index them? Should I add a canonical link to the head of the document pointing to the related course page, such as https://purplegriffon.com/courses/project-management/mop-management-of-portfolios/mop-practitioner? Will this solve the issue? I'm a little stuck on what to do for the best. Any advice would be much appreciated. Thanks. Kind regards, Gareth Daine
On-Page Optimization | PurpleGriffon
-
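If the event pages really are near-identical, the canonical approach the poster describes would look something like this in the head of each event page (a sketch using the URLs from the question; whether it's the right call depends on whether individual events need to rank on their own):

```html
<head>
  <!-- Consolidate near-duplicate event pages onto the main course page -->
  <link rel="canonical"
        href="https://purplegriffon.com/courses/project-management/mop-management-of-portfolios/mop-practitioner">
</head>
```

With this in place, Google will generally index and rank the course page rather than the individual event pages, while users can still reach the events through the site.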
Duplicate Content - But it isn't!
Hi all, I have a site that releases alerts for particular problems/events/happenings. Due to legal requirements we keep the majority of the content the same on each of these event pages. The URLs are all different, but it keeps coming back as duplicate content. I don't think the canonical tag is right for this. Examples: http://www.holidaytravelwatch.com/alerts/call-to-arms/egypt/coral-sea-waterworld-resort-sharm-el-sheikh-egypt-holiday-complaints-july-2014 http://www.holidaytravelwatch.com/alerts/call-to-arms/egypt/hotel-concorde-el-salam-sharm-el-sheikh-egypt-holiday-complaints-may-2014
On-Page Optimization | Astute-Media
-
Duplicate Content
I'm currently working on a site that sells appliances. Currently there are thousands of "issues" with this site, many of them dealing with duplicate content. The product pages can be viewed in "List" or "Grid" format; as lists, they have very little in the way of content. My understanding is that the duplicate content arises from different URLs serving the same page. For instance, the site might have a different URL when told to display 9 items than when told to display 15. This could then be solved by inserting rel=canonical. Is there a way to take a site and get a list of all possible duplicates? This would be much easier than slogging through every iteration of the options and copying down the URLs. Also, is there anything I might be missing in terms of why there is duplicate content? Thank you.
On-Page Optimization | David_Moceri
-
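The rel=canonical fix mentioned in that question means every display variant points back at one preferred URL. A sketch with made-up paths and parameter names (the real site's URLs will differ):

```html
<head>
  <!-- Served identically on /products?view=grid&per_page=9,
       /products?view=list&per_page=15, and every other variant -->
  <link rel="canonical" href="https://www.example.com/products">
</head>
```

The key detail is that the canonical tag itself must be identical on every variant, so all the parameterized URLs consolidate onto the one clean URL.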
tagged as duplicate content?
Hello folks, I'm new to SEOmoz. I was looking at our Crawl Diagnostics and found that some of our blog posts that have been commented on were tagged as duplicate content. For example: http://thankyouregistry.com/blog/remarriages-and-gift-registries/ http://thankyouregistry.com/blog/remarriages-and-gift-registries/comment-page-1/ I'm unsure how to fix these, so any ideas would be appreciated. Thanks a lot!
On-Page Optimization | GiftReg
-
Duplicate content problem
I am having an issue with duplicate content that I can't seem to figure out. I got rid of the www.mydomain.com version by modifying the .htaccess file, but I can't figure out how to fix the problem of mydomain.com/ versus mydomain.com.
On-Page Optimization | ayetti
-
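The trailing-slash side of this is usually handled the same way as the www redirect, with another rewrite rule in .htaccess. A sketch of the common pattern (test on a staging copy first, since the exact conditions depend on the rules already in the file):

```apache
RewriteEngine On

# The www -> non-www redirect the poster already has in place
RewriteCond %{HTTP_HOST} ^www\.mydomain\.com$ [NC]
RewriteRule ^(.*)$ http://mydomain.com/$1 [R=301,L]

# Strip a trailing slash from URLs that are not real directories
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)/$ /$1 [R=301,L]
```

One caveat: for the bare domain itself, http://mydomain.com and http://mydomain.com/ are the same URL by definition (the path is "/" either way), so crawlers already treat those two identically; the slash issue only matters on inner pages.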
How can I stop google reading a certain section of text with my H1 tag?
Hey Mozzers, I'm wondering if anybody knows of a way to stop Google reading a certain part of the text within my H1s. My issue is that I have individual office pages on my site, but many offices are based in the same city, such as 'London'. I want to keep 'London' within the H1 tag for user experience, but I do not want it to be picked up by the search engines and create a duplicate content issue. I've seen some people say to use document.write or use an image. Does anybody know of a correct way of doing this? Many thanks.
On-Page Optimization | Lakeside