But Google is still free to index a link/page even if it is not included in the XML sitemap.
Posts made by sjunaidali
- RE: Robots.txt: how to exclude sub-directories correctly?
- RE: Robots.txt: how to exclude sub-directories correctly?
I am using WordPress with the Enfold theme (from ThemeForest).
I want some files to be accessible to Google, but they should not be indexed.
Here is an example: http://prntscr.com/h8918o
I have currently blocked some JS directories/files using robots.txt (see the screenshot).
But because of this, I am not able to pass Google's Mobile-Friendly Test: http://prntscr.com/h8925z (see the screenshot).
Is it possible to allow access but use something like a noindex directive in the robots.txt file? Or is there another way around this?
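A rough sketch of what I have in mind is below (the /wp-content/themes/enfold/... paths are only placeholders for wherever the theme's assets actually live). As far as I know, robots.txt has no supported noindex directive, so the idea would be to let Googlebot crawl the JS/CSS so the page renders in the Mobile-Friendly Test, and keep those files out of the index with an X-Robots-Tag response header set in .htaccess instead:

robots.txt (allow the rendering resources to be crawled):

    # Let crawlers fetch the theme's JS and CSS so pages render correctly
    User-agent: *
    Allow: /wp-content/themes/enfold/js/
    Allow: /wp-content/themes/enfold/css/

.htaccess (keep the files out of the index; requires mod_headers on Apache):

    # Send a noindex header for JS/CSS responses instead of blocking them in robots.txt
    <FilesMatch "\.(js|css)$">
      Header set X-Robots-Tag "noindex"
    </FilesMatch>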