Best Way to Incorporate FAQs into Every Page - Duplicate Content?
-
Hi Mozzers,
We want to incorporate a 'Dictionary' of terms onto quite a few pages on our site, similar to an FAQ system.
The 'Dictionary' has 285 terms in it, with about 1 sentence of content for each one (approximately 5,000 words total).
The content is unique to our site and not keyword stuffed, but I am unsure what Google will think about us having all this shared content on these pages.
I have a few ideas about how we can build this, but my higher-ups really want the entire dictionary on every page. Thoughts?
Image of what we're thinking here - http://screencast.com/t/GkhOktwC4I
Thanks!
-
Me too! Where all my mozzers at?
-
After talking to our dev team here, we try to never use iFrames, for a few reasons:
Bad for SEO
Linking/bookmarks break
Difficulty with debugging
No real performance gains
I would really consider a separate page instead, which may well have some real SEO value with a few good terms and explanations on it.
Would like to hear other opinions on this...
-
Another option I am considering is including this section in an iFrame since, as far as I know, iFrames are not read by search engines.
What do you think about that solution?
-
OK, I see, it may be more useful to have them as a separate page, but that is probably a whole different debate and highly subjective.
So what I have looked at is this...
...since there are so many legitimate uses for hiding content with
display: none;
when creating interactive features, sites aren't automatically penalised for content that is hidden this way (so long as it doesn't look algorithmically spammy). Google's Webmaster guidelines also make clear that a good practice, when using content that is initially hidden for legitimate interactivity purposes, is to also include the same content in a <noscript> tag, and Google recommends that if you design and code for users, including users with screen readers or JavaScript disabled, then 9 times out of 10 good, relevant search rankings will follow (though their specific advice seems more written for cases where JavaScript writes new content to the page).
"JavaScript: Place the same content from the JavaScript in a <noscript> tag. If you use this method, ensure the contents are exactly the same as what's contained in the JavaScript, and that this content is shown to visitors who do not have JavaScript enabled in their browser."
So, best practice seems pretty clear.
What I can't find out, however, is the simple factual matter of whether hidden content is indexed by search engines (but with potential penalties if it looks 'spammy'), whether it is ignored, or whether it is indexed but with a lower weighting (like <noscript> content is, apparently: http://webmasters.stackexchange.com/questions/1685/is-content-inside-a-noscript-tag-indexed-by-search-indexes).
That was from another SEO site. What I would say is that Google doesn't 'penalise' for duplicate content, so would it be a disaster to try it, see if it is picked up as dupe, and then change it if necessary?
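To make the show/hide idea above concrete, here is a minimal sketch of a display: none toggle. All names here are illustrative assumptions, not from the thread, and the plain object stands in for a real DOM element so the sketch runs outside a browser.

```javascript
// Minimal sketch of the show/hide approach discussed above.
// toggleDictionary and the "panel" element are hypothetical names.
function toggleDictionary(panel) {
  // Flip between hidden (display: none) and visible (display: block).
  panel.style.display = panel.style.display === "none" ? "block" : "none";
}

// Stand-in for a DOM element such as document.getElementById("dictionary"):
const panel = { style: { display: "none" } };

toggleDictionary(panel); // panel.style.display is now "block"

// Per the Google guideline quoted above, a <noscript> fallback would carry
// the same content for visitors without JavaScript, e.g.:
// <noscript><div class="dictionary">...the same 285 terms...</div></noscript>
```

In a real page the toggle would be wired to a click handler on each term, with the same dictionary markup duplicated inside a <noscript> tag as the quoted guideline suggests.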
-
Thanks for the response.
Here is a crude image of what we're thinking - http://screencast.com/t/GkhOktwC4I
The text would be hidden/displayed via JavaScript, so it would not negatively affect the user's experience.
-
Hi,
How would it be displayed? Wouldn't it be just as useful to have it open in a new window that the user could keep open? If you need to display 250+ terms, plus a sentence for each, the user would not be able to see the content they were interested in.
You could then have a link to it on each page....
Do your 'higher-ups' embrace user experience and how it affects people's browsing? Maybe it's an education job... Good luck!
Not sure if that helped, but just my opinion.