Potential issue: Page design might look like keyword stuffing to a web crawler
-
We have an interesting design element we might try on our home page. Here's a mockup: https://codepen.io/dsbudiac/pen/Bwrgjd
I'm worried web crawlers will interpret this as keyword stuffing and that it will hurt our rankings. It features:
- Mostly transparent/hidden text
- Repeating keyword list
I could try a few methods to get around the crawling concerns:
- Load keywords through an iframe
- Make the keywords an image (would significantly increase page load)
- Inject keywords into a container with JavaScript after page load (probably not effective, since crawlers are only getting better at rendering and indexing JavaScript)
- Load the keywords into an svg element
- Load the keywords into a canvas element via javascript
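To make the JavaScript-injection option concrete, here's a rough sketch of what I mean - the container id and keyword list are made up for illustration, and as noted above, Googlebot does render JavaScript, so this may not actually keep the text out of the index:

```javascript
// Pure helper: build the repeating keyword string (easy to test in isolation).
function buildKeywordWall(keywords, repeats) {
  var row = keywords.join(' \u2022 ') + ' ';
  // Array(n + 1).join(row) repeats `row` n times.
  return new Array(repeats + 1).join(row).trim();
}

// In the browser, fill the container only after 'load', so the text is
// absent from the initial HTML response that crawlers fetch.
// (Caveat: rendering crawlers will still see it.)
if (typeof window !== 'undefined') {
  window.addEventListener('load', function () {
    var el = document.getElementById('keyword-bg'); // hypothetical container id
    if (el) {
      el.textContent = buildKeywordWall(['design', 'branding', 'development'], 50);
    }
  });
}
```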
I have a few questions:
- Should I be concerned about any potential keyword stuffing / SEO issues with this design?
- Can you comment on the effectiveness (with proof) of the above strategies?
- Am I better off just abandoning this type of design?
-
Ah, a very interesting question!
I'd not be too concerned; you're loading the content in through a data attribute rather than directly as text. However, there are definitely a few options you could consider:
- Render via SVG feels like the safest bet, though that's going to be a pretty large, complex set of vectors.
- Save + serve as an image (and overcome the file-size concerns by using WebP, HTTP/2, a CDN like Cloudflare, etc.)
- Serve the content via a dedicated JavaScript file, which you could block access to via robots.txt (a bit fudgey!)
I'd be keen to explore #2 - it feels like you should be able to achieve the effect you're after with an image that isn't ridiculously huge.
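For anyone curious about option #3, the robots.txt rule would look something like this - the path is hypothetical, and bear in mind Google generally discourages blocking resources it needs to render the page:

```
# Hypothetical path to the dedicated keyword script
User-agent: *
Disallow: /js/keyword-wall.js
```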
-
I never said the image option was hard. It's just not ideal, as it increases page load time and is less flexible. A noindex'd iframe seems like the best option. We already have a working proof of concept - thanks.
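For anyone following along, a minimal sketch of that setup might look like this (the URL and markup are illustrative, not our actual code):

```html
<!-- Parent page: frame in the decorative keyword block -->
<iframe src="/decor/keywords.html" title="Decorative keyword background"
        aria-hidden="true" tabindex="-1"></iframe>

<!-- Inside /decor/keywords.html: keep the framed document itself
     out of the index (or serve an X-Robots-Tag: noindex header) -->
<meta name="robots" content="noindex">
```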
-
As long as you don't use that text inside a header, a link, or some other relevant piece of content, you shouldn't have to worry about it. As I understand it, the h1 is one of the main signals Google uses to determine the primary keyword of a page.
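In other words, something along these lines - keep the page's real topic in the h1 and the decorative text in a plain, non-semantic element (content is illustrative):

```html
<!-- Real topic in the heading; keyword wall outside headings and links -->
<h1>Web Design Services</h1>
<div class="keyword-wall" aria-hidden="true">
  design branding development design branding development
</div>
```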
-
I thought about using googleon/googleoff tags, but apparently those only work with the Google Search Appliance, not regular Google search/indexing: https://webmasters.stackexchange.com/questions/54735/can-you-use-googleon-and-googleoff-comments-to-prevent-googlebot-from-indexing-p
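For reference, the syntax looks like this - but again, only a Google Search Appliance respects these comments; regular Googlebot ignores them:

```html
<!--googleoff: index-->
<div class="keyword-wall">design branding development</div>
<!--googleon: index-->
```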