Script tags and SEO
-
Hi,
I have a page on my site with a Google Maps embed and a path drawn on the map. The path is built from a long string of coordinates. For ease, I have placed the coordinates in a script tag at the foot of the page, amongst my JavaScript.
My question is: will this script tag hurt the SEO for the page? I've read that inline JS and 'data islands' can be bad, so I've been careful to keep it out of the main body of the page. Thanks, any help appreciated!
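For illustration, a minimal sketch of this kind of setup, assuming the standard Google Maps JavaScript API (the element ID, coordinates, and API key placeholder are all hypothetical):

```html
<div id="map" style="height: 400px;"></div>

<!-- At the foot of the page: route data kept out of the main body -->
<script>
  // Hypothetical coordinates; the real page would have a much longer list.
  var routeCoords = [
    { lat: 51.5074, lng: -0.1278 },
    { lat: 51.5101, lng: -0.1340 },
    { lat: 51.5136, lng: -0.1390 }
  ];

  function initMap() {
    var map = new google.maps.Map(document.getElementById('map'), {
      zoom: 14,
      center: routeCoords[0]
    });
    // Draw the route over the map as a polyline.
    new google.maps.Polyline({ path: routeCoords, map: map });
  }
</script>
<script async
  src="https://maps.googleapis.com/maps/api/js?key=YOUR_API_KEY&callback=initMap">
</script>
```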
-
Inline scripts aren't bad per se; search engines just can't always understand them. Worst-case scenario: you have extra code that Google has to crawl but can't interpret, which uses up bandwidth without adding value. But it won't lower your rankings.
So, do whatever you need to do to deliver the best user experience you can on your site with this map and route, and assume that Google will ignore the script (Google is trying to understand it, though, so it may become helpful in the long run). Then, for search engines, include some text content describing the map and the route so they can send the right searchers to your page.
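For example, crawlable text like this could sit alongside the embed (a hypothetical sketch; the actual wording would describe the real route):

```html
<h2>Walking route: Trafalgar Square to Covent Garden</h2>
<p>
  This 1.2-mile route heads east along the Strand from Trafalgar
  Square, passing Charing Cross station before turning north into
  Covent Garden.
</p>
<div id="map" style="height: 400px;"></div>
```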
Good luck!
Kristina
-
-
Okay, great, that's very helpful.
What if I wanted to have multiple scripts, say for points of interest along the route, and had multiple (20+) script tags at the bottom of the page? Would that be an ugly way of doing it, or is it considered totally fine in the eyes of Google?
-
Yes, that's still an inline script (putting it at the top or bottom of the page doesn't change that), but as I said, if only one page uses the script, you're good to go. There's nothing wrong with inline scripts as long as they aren't reused across other pages.
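For instance, rather than 20+ separate tags, the point-of-interest data could live in a single inline script that loops over one array (a sketch only; the names and coordinates are hypothetical, and it assumes the `map` object from the route code is in scope):

```html
<script>
  // Hypothetical points of interest; a real page might have 20+.
  var pointsOfInterest = [
    { name: 'Viewpoint',  lat: 51.5080, lng: -0.1281 },
    { name: 'Cafe stop',  lat: 51.5103, lng: -0.1335 },
    { name: 'Old bridge', lat: 51.5139, lng: -0.1388 }
  ];

  // One marker per point, instead of one script tag per point.
  pointsOfInterest.forEach(function (poi) {
    new google.maps.Marker({
      position: { lat: poi.lat, lng: poi.lng },
      map: map,
      title: poi.name
    });
  });
</script>
```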
-
Thanks Federico.
Since my script is being called at the bottom of the page, I assumed it didn't count as 'inline'?
Yes, the scripts are only used once, on specific pages.
-
Inline scripts are bad if you are including them on every page; in that case, move the code into an external file so users don't have to download the script every time they view a page (the browser can cache it instead).
But if the inline script is used only on a specific page and never reused, there's no reason to load it as an external file. In my opinion, that would just add an extra server call to fetch code that only works on that page.
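To make that trade-off concrete, a quick sketch (the file name is hypothetical):

```html
<!-- Reused across many pages: load externally so the browser caches it. -->
<script src="/js/site-wide.js"></script>

<!-- Used on this page only: inline it and skip the extra request. -->
<script>
  var routeCoords = [ /* coordinates for this page's map only */ ];
</script>
```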
Hope that helps!