Content within JavaScript code
-
I know that it is not good practice to include SEO content within JavaScript, but are there exceptions to what Google can spider, or is it best to avoid it completely?
-
Thank you for the quick responses.
Thanks,
Matthew
-
From my perspective, whenever possible you want to make your JavaScript content degrade gracefully. Basically, if a visitor has JavaScript turned off, they still have access to the content they would see if JavaScript were turned on. This also eliminates the need to worry about whether searchbots can or cannot spider such content.
As an example, consider a JavaScript-based image carousel. With JavaScript turned on, the user can cycle through the images in a (usually) user-friendly way via the carousel controls. With JavaScript turned off, the user can still see the images, but may have to click on them to see them displayed in the browser window - not as slick, but not bad either.
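A minimal sketch of that idea (the markup, file names, and class names here are illustrative, not from any particular library): the images live in plain HTML, so users without JavaScript and searchbots both see them, and the script only enhances the list into a carousel.

```html
<!-- All images exist in the markup, so they are crawlable
     and visible even when JavaScript is disabled. -->
<ul id="gallery">
  <li><a href="photo-1.jpg"><img src="thumb-1.jpg" alt="First photo"></a></li>
  <li><a href="photo-2.jpg"><img src="thumb-2.jpg" alt="Second photo"></a></li>
</ul>

<script>
  // Enhancement only: if JavaScript runs, turn the plain list
  // into a carousel; otherwise the list above still works.
  var gallery = document.getElementById('gallery');
  if (gallery) {
    gallery.className = 'carousel'; // CSS hook for carousel styling
    // ...attach previous/next controls here...
  }
</script>
```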
One key aspect of this type of development is writing well-organized markup, code, and scripts that allow your JavaScript to be separated from your HTML.
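In practice, that separation means no inline event handlers in the markup: behavior is attached from a script after the content is parsed. A sketch (the id and href are illustrative; in real use the script would live in an external file):

```html
<!-- index.html: content first; no inline onclick attributes.
     Without JavaScript, the link still works as a normal link. -->
<a id="show-more" href="archive.html">More articles</a>

<!-- This would normally be an external behavior.js file: -->
<script>
  document.getElementById('show-more').addEventListener('click', function (e) {
    e.preventDefault(); // stop the full-page navigation when scripting is available
    // ...load and insert the extra articles here...
  });
</script>
```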
A great resource for learning about this is DOM Scripting by Jeremy Keith.
-
Hi Mjmorse,
What do you mean by SEO content? Do you mean that your content is only targeted for search engine spiders?
If your content is also targeting the actual real users who will visit your website, I suggest you avoid JavaScript for content, since some of them may be using mobile devices with limited JavaScript support.
Plain text content right in the HTML is always preferable to content loaded dynamically from JavaScript. Using JavaScript is usually a way to hide content from search engines, not the opposite.
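To make the contrast concrete (a sketch with made-up content): the first version is visible to every user agent; the second only exists if the script runs.

```html
<!-- Preferable: the content is in the HTML itself -->
<p>Spring sale: 20% off all tickets this weekend.</p>

<!-- Riskier: the paragraph only exists if the script executes -->
<div id="promo"></div>
<script>
  document.getElementById('promo').innerHTML =
    '<p>Spring sale: 20% off all tickets this weekend.</p>';
</script>
```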
Best regards,
Guillaume Voyer.
-
A simple JavaScript that does something like document.write is fine. Google can execute a lot of JavaScript now. The key is to go to Webmaster Tools and fetch the page as Googlebot. Then you can see what Google can see on your page. If you have SEO content in JavaScript and Google cannot see it, I would change it. If Google can see it and index it just fine, no problem.
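For reference, the kind of simple case this answer describes (an illustrative snippet; the noscript fallback is a common belt-and-braces addition, not something the answer requires):

```html
<!-- A simple document.write like this is the sort of thing Google can
     usually render, but verify with Fetch as Googlebot in Webmaster Tools
     before relying on it for SEO content. -->
<script>
  document.write('<p>Opening hours: Mon-Fri 9am-5pm</p>');
</script>
<noscript>
  <p>Opening hours: Mon-Fri 9am-5pm</p>
</noscript>
```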
Related Questions
-
Canonical: Same content but different countries
I'm building a website that has content made for specific countries. The URL format is: MyWebsite.com/<country name>/<specific url>. Some of the <specific url> pages are the same for different countries; the only difference is the <country name>. How do I deal with canonical issues to avoid Google thinking I'm presenting the same content?
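Not an answer from the thread, but the two mechanisms usually weighed for this situation can be sketched as follows (the URLs and language codes are hypothetical, based on the question's URL pattern):

```html
<!-- Option 1: point the duplicate country pages at one preferred version -->
<link rel="canonical" href="http://mywebsite.com/usa/specific-url">

<!-- Option 2: keep every country version indexable, but annotate
     which audience each one targets -->
<link rel="alternate" hreflang="en-us" href="http://mywebsite.com/usa/specific-url">
<link rel="alternate" hreflang="en-gb" href="http://mywebsite.com/uk/specific-url">
```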
On-Page Optimization | newbyguy
-
Consolidating a Large Site with Duplicate Content
I will be restructuring a large website for an OEM. They provide products and services for multiple industries, and the product/service offering is identical across all industries. I was looking at the site structure and ran a crawl test, and learned they have a LOT of duplicate content out there because of the way they set up their website.
They have a page in the navigation for "solution", aka what industry you are in. Once that is selected, you are taken to a landing page, and from there, given many options to explore products, read blogs, learn about the business, and contact them. The main navigation is removed. The URL structure is set up with folders, so no matter what you select after you go to your industry, the URL will be "domain.com/industry/next-page". The product offerings, blogs available, and contact us pages do not vary by industry, so the content found on "domain.com/industry-1/product-1" is identical to the content found on "domain.com/industry-2/product-1", and so on.
This is a large site with a fair amount of traffic because it's a pretty substantial OEM. Most of their content, however, is competing with itself because most of the pages on their website have duplicate content. I won't begin my work until I can dive into their GA and have more in-depth conversations with them about what kind of activity they're tracking and why they set up the website this way. However, I don't know how strategic they were in this setup, and I don't think they were aware that they had duplicate content. My first thought would be to work towards consolidating the way their site is set up, so we don't spread the link equity of "product-1" content, direct all industries to one page, and track conversion paths a different way.
However, I’ve never dealt with a site structure of this magnitude and don’t want to risk messing up their domain authority, missing redirect or URL mapping opportunities, or ruin the fact that their site is still performing well, even though multiple pages have the same content (most of which have high page authority and search visibility). I was curious if anyone has dealt with this before and if they have any recommendations for tackling something like this?
On-Page Optimization | cassy_rich
-
Duplicate Content on our own website
Our website sells tickets for events. We also have a news articles section with information about events, artists, and venues. From time to time we release a product page and a related news article on a separate page. Some of the content in the news article would be perfect for our product page. Essentially, it's our product page we want to rank. Would it harm our SEO if we had some of the same content on both of these pages?
On-Page Optimization | Alexogilvie
-
Duplicate content in the title
Good morning, I am developing an application that searches for offers in the press. The problem I have is the following one:
When I find an offer that I have already posted, I can't use the same URL because it generates duplicate content, as the URL is generated from the title. If I find two offers in different stores (for example a Thomson TV), I am studying two options. The first would be to add a number at the end of the URL:
http://www.offertazo.com/televisor-thomson
http://www.offertazo.com/televisor-thomson1
http://www.offertazo.com/televisor-thomson2
Another option I propose would be to add semantic data to provide value (such as the date). For example:
http://www.offertazo.com/01-12-12/televisor-thomson
I appreciate your help.
On-Page Optimization | ofuente
-
Does Google still see masked domains as duplicate content?
Older articles state that domain forwarding or masking will create duplicate content, but Google has evolved quite a bit, and I'm wondering if that is still the case. I'm not suggesting that a 301 is not the proper way to redirect something, but my question is: does Google still see masked domains as duplicate content? Is there any viable use for domain masking other than for affiliates?
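For context, the 301 alternative the question alludes to can be sketched in an Apache .htaccess rule (the domains are hypothetical; this is one common pattern, not the only way to issue a 301):

```apacheconf
# Instead of masking olddomain.com over the new site,
# permanently redirect every request to the new domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?olddomain\.com$ [NC]
RewriteRule ^(.*)$ https://www.newdomain.com/$1 [R=301,L]
```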
On-Page Optimization | TracyWeb
-
Duplicate Page Content Issue
For one of our campaigns, we have 164 errors for Duplicate Page Content. We have a website where much of the same content lives in two different places. The information needs to be accessible from both areas. What is the best way to tackle this problem? Is there anything that can be done so these pages are not competing against one another? If the only solution is to edit the content on one of the pages, how much of the content has to be different? Is there a certain percentage to go by? Here is an example of what I am referring to: 1.) http://www.valleyorthopedicassociates.com/services/foot-center/preventing-sprains-and-strains 2.) http://www.valleyorthopedicassociates.com/patient-resources/service/foot-and-ankle-center/preventing-sprains-and-strains
On-Page Optimization | cmaseattle
-
Duplicate content issue with dynamically generated url
Hi, for those who have followed my previous question, I have a similar one regarding dynamically generated URLs. From this page http://www.selectcaribbean.com/listing.html the user can make a selection according to various criteria. Six results are presented, and then the user can go to the next page. I know I should probably rewrite URLs such as these: http://www.selectcaribbean.com/listing.html?pageNo=1&selType=&selCity=&selPrice=&selBeds=&selTrad=&selMod=&selOcean= but since all the results presented are basically generated on the fly for the convenience of the user, I am afraid Google may consider this an attempt to generate more pages, as there are already pages for each individual listing. What is my solution for this? Nofollow these pages? Block them through robots.txt?
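For reference, one common way to keep such faceted result pages out of the index while still letting crawlers follow links to the individual listings is a robots meta tag on the generated pages (a sketch; whether this beats a robots.txt block depends on the site):

```html
<!-- On pages generated from filter selections, e.g.
     listing.html?pageNo=1&selCity=...&selPrice=...
     tell crawlers not to index the page but still follow its links. -->
<meta name="robots" content="noindex, follow">
```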
On-Page Optimization | multilang
-
The SEO and CRO Value of an Image Next to Page Content
If given the choice to add an attractive stock photo to a conversion-focused page, do the pros outnumber the cons in terms of SEO and CRO? Some pros are that you can include the keyword in the image filename and image alt tag. It can also improve user experience by making the page more attractive. Some cons might be that it increases page load time, which can have a negative impact on SEO and user experience. Also, the visitor might get distracted away from the lead form button.
On-Page Optimization | SparkplugDigital