Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
I am using <noscript> on every webpage and Google is not crawling my site automatically. Any solution?
-
<noscript>
<meta http-equiv="refresh" content="0;url=errorPages/content-blocked.jsp?reason=js">
</noscript>
Please tell me whether this has an effect on SEO or not.
-
Also, some more information I can gather from your question:
- That <noscript> is telling non-JS users/bots to meta refresh to an error page.
- Google shouldn't be confused by that, but Screaming Frog will be (and potentially some other search engines too).
- It is probably also not the best experience for non-JS users: you can display an error message without redirecting to another URL (see the rough sketch below).
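For example, a rough sketch of a <noscript> block that shows a message instead of redirecting; the wording and markup here are just an illustration, not your exact page:
<noscript>
  <!-- Shown only to visitors and crawlers with JavaScript disabled; no meta refresh, no extra URL -->
  <p>This site requires JavaScript. Please enable it in your browser and reload the page.</p>
</noscript>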

Hope that's helpful...
-
Thanks for the question!
It sounds like you are concerned about Google being able to crawl your site, and you think the <noscript> tag on every page might be the cause? In your example it looks like anyone who tries to access your page with JavaScript disabled would be redirected to an error page.
Is there any way you can share your domain so I can better assist? Thanks!
-
Manual index
-
I got your site in your PM. I went to Google and typed site:yourdomain.com and saw that Google reports over 400 pages from your site are indexed.
-
I sent the site to you in a private message.
-
Can you share your site?
-
I have content on every page, but Google can't crawl my site. I checked with a Screaming Frog crawl, but it can't find any pages.
-
So there is no content between the noscript tags?
Related Questions
-
My news site not showing in "In the news" list on Google Web Search
I got a news website (www.tapscape.com) which is 6 years old and has been on Google News since 2012. However, whenever I publish a news article, it never shows up in the "In the news" list on Google Web Search. I have already added schema.org/NewsArticle markup to the website and checked whether it's working in the Google structured data testing tool; everything shows up fine there. The site already has a news sitemap (http://www.tapscape.com/news-sitemap.xml) and has been added to Google Webmaster Tools. News articles show perfectly fine in the News tab, but why aren't the articles being shown in the "In the news" list on Google web search? My site already has a strong backlink profile, so I don't think I need to work on the backlinks. Please let me know what I'm doing wrong and how I can get the news articles into the "In the news" list. Below is a screenshot that I have attached to this question to help you understand what I mean.
Web Design | hakhan2010
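For reference, a minimal sketch of schema.org/NewsArticle markup in JSON-LD; every value below is a placeholder for illustration and is not taken from tapscape.com:
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "NewsArticle",
  "headline": "Placeholder headline",
  "datePublished": "2016-01-01T08:00:00+00:00",
  "author": { "@type": "Person", "name": "Placeholder Author" },
  "publisher": {
    "@type": "Organization",
    "name": "Placeholder Publisher",
    "logo": { "@type": "ImageObject", "url": "https://www.example.com/logo.png" }
  },
  "image": "https://www.example.com/article-image.jpg"
}
</script>
-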
Hiding content until user scrolls - Will Google penalize me?
I've used: "opacity:0;" to hide sections of my content, which are triggered to show (using Javascript) once the user scrolls over these sections. I remember reading a while back that Google essentially ignores content which is hidden from your page (it mentioned they don't index it, so it's close to impossible to rank for it). Is this still the case? Thanks, Sam
Web Design | Sam.at.Moz0
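For illustration, a rough sketch of the technique described above, with made-up class names: content starts at opacity:0 and a small script reveals it once the visitor scrolls near it.
<style>
  .reveal { opacity: 0; transition: opacity 0.3s; }
  .reveal.visible { opacity: 1; }
</style>
<section class="reveal">
  <p>Hidden until the visitor scrolls down to this section.</p>
</section>
<script>
  // Add the .visible class to each .reveal section once it enters the viewport
  window.addEventListener('scroll', function () {
    document.querySelectorAll('.reveal').forEach(function (el) {
      if (el.getBoundingClientRect().top < window.innerHeight) {
        el.classList.add('visible');
      }
    });
  });
</script>
-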
Interlinking using Dynamic URLs Versus Static URLs
Hi Guys, Could you kindly help us choose the best approach out of the 2 cases mentioned below.
Case 1 - What we are using: we interlink our static pages (www.abc.com/jobs-in-chennai) through the footer, navigation & by showing related searches. Self-referential canonical tags have been implemented.
Case 2 - What we plan to use: we interlink our dynamic pages (www.abc.com/jobs-in-chennai?source=footer) through the footer, navigation & by showing related searches. Canonical tags have been implemented on the dynamic URLs, pointing to the corresponding static URLs.
Query 1. Which one is better & expected to improve rankings?
Query 2. Will shifting to Case 2 negatively affect our existing rankings or traffic?
Regards
Web Design | vivekrathore0
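For clarity, a minimal sketch of the canonical tag described in Case 2, placed in the <head> of the dynamic URL and pointing at the corresponding static URL (using the abc.com examples from the question):
<!-- On www.abc.com/jobs-in-chennai?source=footer -->
<link rel="canonical" href="http://www.abc.com/jobs-in-chennai">
-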
Privacy Policy: index it? And where to place it?
Hi Everyone, Two questions. First: should you allow Google to index your privacy policy? Second: for a service-based site (not e-commerce, not selling anything), should you put the policy in the footer so it's site-wide, or just on the "contact us" form page? Best, Ruben
Web Design | KempRugeLawGroup0
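If the decision were to keep the privacy policy out of the index, a minimal sketch of the usual mechanism would be a robots meta tag in the page's <head>; this is shown only as an illustration, not as the answer to the question:
<!-- In the <head> of the privacy policy page -->
<meta name="robots" content="noindex, follow">
-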
Anyone using CloudFlare on multiple sites?
We are considering using CloudFlare as a CDN for a large group of sites. The fees are $5 to $200 depending on many factors. We tried the free trial on one site and were impressed with the results. I am wondering if any of you have any longer term experience with this and performance metrics, etc.
Web Design | RobertFisher1 -
What's the point of an EU site?
Buongiorno from 18 degrees C Wetherby UK 🙂 On this site http://www.milwaukeetool.eu/ the client wants to hold on to the EU site despite there being multiple standalone country sites, e.g. http://www.milwaukeetool.fr & http://www.milwaukeetool.co.uk. Why would you ever need an EU site? I mean, who ever searches for an EU site? If the client holds on to the .eu site despite my position, it's a waste of time from a search perspective. Is the following the best appeasement? When a user enters the .eu URL, it redirects to the detected country: e.g. I'm in Paris, I enter www.milwaukeetool.eu and it redirects to http://www.milwaukeetool.fr. My feeling is this would be the most pragmatic thing to do? Any ideas please,
Ciao,
David
Web Design | Nightwing
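One common pattern for keeping a pan-European site alongside country sites is hreflang annotations with the .eu version as the fallback; a minimal sketch, assuming the domains mentioned in the question (this pattern is an aside for illustration, not something proposed in the question itself):
<!-- In the <head> of each version of a given page -->
<link rel="alternate" hreflang="fr" href="http://www.milwaukeetool.fr/">
<link rel="alternate" hreflang="en-gb" href="http://www.milwaukeetool.co.uk/">
<link rel="alternate" hreflang="x-default" href="http://www.milwaukeetool.eu/">
-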
Google also indexed trailing slash version - PLEASE HELP
Hi Guys, We redesigned the website and somehow our canonical extension decided to add a trailing slash to all URLs. Previously our canonical URLs didn't have a trailing slash. We didn't change the URLs during the redesign; they remained the same, but we now have two versions indexed: one with a trailing slash, one without. I've now fixed the issue and removed the trailing slash from the canonical URLs. Is this the correct way of fixing it? Will our rankings be affected in a negative way? Is there anything else I need to do? The website went live last Tuesday. Thanks
Web Design | Jvalops0
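For illustration, a minimal sketch of a canonical tag that standardises on the version without a trailing slash; example.com is a placeholder, not the poster's domain:
<!-- Served on both /some-page and /some-page/ -->
<link rel="canonical" href="https://www.example.com/some-page">
-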
The use of foreign characters and capital letters in URLs?
Hello all, We have 4 language domains for our website, and a number of our Spanish landing pages are written using Spanish characters, most notably ñ and ó. We have done our research around the web and realised that many of the top competitors for keywords such as Diseño Web (web design) and Aplicación iPhone (iPhone application) DO NOT use these special characters in their URL structure. Here is an example of our URLs: http://www.twago.es/expert/Diseño-Web/Diseño-Web However, when I simply copy and paste a URL that contains a special character it is automatically translated and encoded, e.g. http://www.twago.es/expert/Aplicación-iPhone/Aplicación-iPhone (when written out longhand it appears: http://www.twago.es/expert/Aplicación-iPhone/Aplicación-iPhone).
My first question is: seeing how the overwhelming majority of website URLs DO NOT contain special characters (and even for Spanish/German keywords these are simply written using the standard English Latin alphabet), is there a negative effect on our SEO rankings/efforts because we are using special characters? When we write anchor text for backlinks to these pages we USE the special characters in the anchor text (so do most other competitors). Does the anchor text have to match exactly? I know most web browsers can understand the special characters, especially when returning search results to users that either type the special characters within their search query (or not). But we seem to think that if we were doing the right thing, then why does everyone else do it differently?
My second question is the same, but focusing on the use of capital letters in our URL structure.
NOTE: When we do a broken link check with some link tools (such as Xenu) the URLs that contain the special Spanish characters are marked as "broken". Is this a related issue? Any help anyone could give us would be greatly appreciated! Thanks, David from twago
Web Design | wdziedzic0
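For reference, a minimal sketch of how the ñ in such a URL ends up percent-encoded; the twago.es link is the example from the question, and in UTF-8 the character ñ encodes to %C3%B1:
<!-- The link as authored, with the special character -->
<a href="http://www.twago.es/expert/Diseño-Web/Diseño-Web">Diseño Web</a>
<!-- What the browser actually requests after percent-encoding the path -->
<!-- http://www.twago.es/expert/Dise%C3%B1o-Web/Dise%C3%B1o-Web -->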