Is it true that Google will not penalize duplicated content found in UL and LI tags?
-
I've read in a few places now that if you absolutely have to use a key term several times in a piece of copy, it's preferable to put it inside li and ul tags, as Google supposedly will not penalise an excessive density of keywords found within these tags. Does anyone know if there is any truth to this?
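To make the claim concrete: it's about a repeated term sitting inside list items rather than in running prose. A quick sketch of what "keyword density inside these tags" would even measure, using the standard-library HTML parser (the snippet and the term are made up for illustration):

```python
from html.parser import HTMLParser

class LiText(HTMLParser):
    """Collect the text content found inside <li> elements."""
    def __init__(self):
        super().__init__()
        self.depth = 0    # current <li> nesting level
        self.chunks = []  # text fragments seen inside <li>

    def handle_starttag(self, tag, attrs):
        if tag == "li":
            self.depth += 1

    def handle_endtag(self, tag):
        if tag == "li" and self.depth:
            self.depth -= 1

    def handle_data(self, data):
        if self.depth:
            self.chunks.append(data)

snippet = """
<ul>
  <li>infographics for marketing</li>
  <li>designing infographics</li>
  <li>sharing infographics online</li>
</ul>
"""
parser = LiText()
parser.feed(snippet)
words = " ".join(parser.chunks).lower().split()
print(words.count("infographics"), "of", len(words))  # 3 of 8
```

A parser sees the same words either way; the markup only changes where they sit, not how often the term occurs.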
-
lol... thanks for that report.
Should we go back and read it for the laughs?
-
I just read several more articles on that site. Overall, they're junk. I would find a new blog to get your info from.
-
**In that case you can use "li" and "ul" tag, moreover Google doesn't penalize for repeating words under these tags.**
ha ha... that is B.S.
The author of that does not know how Google handles `<li>` tags, and I can imagine Matt Cutts telling people... "It's OK to stuff the `<li>` tag, guys!"
-
Thanks for the response,
I found it here http://www.dailytechpost.com/index.php/8-best-tips-for-css-for-seo/#comment-69311 amongst several other places. I'm not into stuffing keywords, and I'm fully aware that writing natural prose is the way to go. It was more a reference for cases where a keyword coincidentally appears an excessive number of times, such as when using technical terms which cannot be substituted and form part of every element of a text, or when you are writing about a concept and natural prose feels a little repetitive, such as writing about infographics.
-
Maybe they are not penalizing it today. I'm not too sure about this; like the others, I'm asking myself who told you this.
I recommend you do not try to fool the big G. Duplicate content is, in the best case, not-so-valuable content. You should put your efforts into building great content instead of trying to duplicate.
Even if they are not penalizing it right now, they probably will one day.
From my experience, duplicate is duplicate anywhere you put it!
-
Exactly. **Content is written for the visitors, not the search engines.**
If you are familiar with the subject and are writing naturally, the content will do just fine with all of the search engines and, more importantly, your visitors.
-
Where did you hear this? That makes no sense, and I have never heard anything like it.
And do not stuff keywords, or even try to see if you can get away with it. That's poor optimization and does not look good to users. Write and design for your users and you should be fine.
-
I have never heard that li and ul tags are safe for anything.
Don't bet on the behavior of Google.
Also, I don't pay any attention to the number of times that I use a word in copy. None. I try to write naturally without regard for search engines.
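For anyone who does want to check a count, the arithmetic is trivial: density is just the term's share of the words in the copy. A minimal sketch, with made-up copy:

```python
import re

def keyword_density(text: str, term: str) -> float:
    """Fraction of words in `text` that equal `term`, case-insensitive."""
    words = re.findall(r"[a-z']+", text.lower())
    return words.count(term.lower()) / len(words) if words else 0.0

copy = ("Infographics summarise data visually. Good infographics are "
        "shared widely, and designing infographics takes planning.")
print(round(keyword_density(copy, "infographics"), 2))  # 0.21
```

If the number comes out conspicuously high, the copy probably reads as stuffed to human visitors too, which is the real problem.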
-