Static homepage content and JavaScript - is this technique obsolete?
-
Hi
Apologies beforehand for any minor forum transgressions - this is my first post.
I'm redesigning my blog and I have a question re static homepage content.
It used to be common practice in the online gambling sector (and possibly others) to have a block of 'SEO copy' at the footer of the homepage.
To 'trick Google' into thinking it sat directly underneath the header, web devs would use JavaScript to make the div containing the SEO copy load first in the HTML.
The logic was that this allowed for the prime real estate of the page to be used for conversion and sales, while still having a block of relevant copy to tell the spiders what the page was about, and to provide deep links into the site.
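For anyone unfamiliar with the pattern, a minimal sketch of how it was typically implemented might look something like this (the ids and structure are entirely hypothetical - this is not the actual Everest or PokerStars code):

```html
<body>
  <header>...</header>
  <!-- The SEO copy sits here in the source, right after the header,
       so spiders reading the raw HTML encounter it first -->
  <div id="seo-copy">
    <p>Keyword-rich copy describing the site, with deep links...</p>
  </div>
  <div id="hero">Conversion and sales content (the prime real estate)</div>
  <div id="footer-slot"></div>
  <script>
    // For human visitors, relocate the SEO copy to the bottom of the page
    document.getElementById('footer-slot')
            .appendChild(document.getElementById('seo-copy'));
  </script>
</body>
```

The copy is never hidden - it is visible at the foot of the rendered page - but its position in the source differs from its position on screen.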
I attended a seminar just over a year ago at which some notable SEOs said that Google had probably worked this one out but it was impossible to tell. However, I've recently noticed that Everest Poker has what I think is the code commented out, and on PokerStars I can't find it at all (even in the includes).
I would be happy to post the Everest code but, while I've read the etiquette, I'm not 100% sure whether this is allowed.
So my question is... for the blog I'm redesigning, do I still need to follow this practice? I would prefer search engines saw some static intro text describing the site, rather than the blog posts, the excerpts of which will probably be canonicalized to the actual post pages to avoid duplication issues. But I would prefer this static content to appear below the fold.
What is current best practice here?
Alex
-
Thanks Edward
-
It would be possible to have the text at the beginning of the HTML document but then display it further down the page using CSS, not JavaScript.
I don't think there is a massive need to do something like this. In the past, Google may not have indexed all of the content on a page, especially if the document was very large, and this positioning trick ensured that the important SEO-focused content got indexed. If you build your site properly and take into account document size, page load speed, clean code and so on, there should be no need to move the content around.
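To make the CSS-only approach concrete, here's a minimal hypothetical sketch (class names are my own invention). One way to do it is with the flexbox `order` property: the intro text comes first in the source, but is displayed below the other content:

```html
<body>
  <style>
    /* Visually swap the two blocks without touching the markup order */
    .page  { display: flex; flex-direction: column; }
    .intro { order: 2; } /* first in source, displayed second */
    .posts { order: 1; } /* second in source, displayed first */
  </style>
  <div class="page">
    <!-- First in source order, so it's the first body copy crawlers see -->
    <div class="intro">Static intro text describing the site...</div>
    <div class="posts">Blog post excerpts...</div>
  </div>
</body>
```

Older implementations achieved the same effect with floats or absolute positioning, but the idea is identical: source order for crawlers, visual order for visitors, with no JavaScript involved.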
-
Hi Vahe
Thanks for the response, and the article link - I'll take a look at that later.
However, I think you've misunderstood the situation. The content is not hidden - it's clearly visible and crawlable at the bottom of the page. It's simply placed in a div, and that div is loaded immediately after the header through the use of JavaScript.
I'm no JavaScript expert, but Everest Poker appears to have commented the function out, and PokerStars appears to have removed it altogether.
If that is, in fact, what they've done (and I'm not misreading the code, which is possible), then my question is: does this 'trick' of placing text lower in the page but telling spiders to crawl it first no longer work?
Hope that clears things up.
Alex
-
Hi Alex,
In my view, unless it is served as alternative content, any hidden content is unethical SEO.
Have a go at content stacking - http://www.dummies.com/how-to/content/move-up-your-web-page-content-for-better-search-en.html
Hope this helps,
Vahe