Login webpage blocked by robots
-
Hi, the SEOMOZ crawl diagnostics shows that this page:
www.tarifakitesurfcamp.com/wp-login.php is blocked (noindex, nofollow)
Is there any problem with that?
-
thanks!
-
Unless the login page contains information relevant to your users (i.e. it's for your private use only), it's probably a good idea not to index it!
-
Nope, that's perfectly fine, since that's your login page for WordPress.
If you're linking to the page from anywhere on your site (which you really shouldn't be), you could update the meta robots tag to (noindex, FOLLOW), but since it looks like the page has no inbound links, that shouldn't be necessary.
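For reference, the directive being discussed lives in the page's head section. A minimal sketch of the (noindex, follow) variant looks like this (WordPress emits something similar on wp-login.php by default):

```html
<!-- Hypothetical meta robots tag: keep the page out of the index,
     but still let crawlers follow any links on it -->
<meta name="robots" content="noindex, follow" />
```

With (noindex, nofollow), crawlers neither index the page nor follow its links; swapping nofollow for follow only matters if the page actually links somewhere you want crawled.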
Related Questions
-
What is robots.txt file issue?
I hope you are well. Moz often sends me a notification that my website can't be crawled and tells me to check the robots.txt file. The question is: how can I solve this problem, and what should I write in the robots.txt file? Here is my website: https://www.myqurantutor.com/ I need your help, brothers... Thanks in advance
On-Page Optimization | matee.usman
Are there detrimental effects of having multiple robots tags?
Hi All, I came across some pages on our site that have multiple robots meta tags, but they all have the same directives. Two are identical, while one is for Google only. I know there aren't any real benefits to having it set up this way, but are there any detrimental effects, such as slowing down the bots crawling these pages? <meta name="googlebot" content="index, follow, noodp" /> Thanks!
On-Page Optimization | STP_SEO
When you add a robots.txt file to a website to block certain URLs, do they disappear from Google's index?
I have seen several websites recently that have far too many webpages indexed by Google, because for each blog post they publish, Google might index the following: www.mywebsite.com/blog/title-of-post www.mywebsite.com/blog/tag/tag1 www.mywebsite.com/blog/tag/tag2 www.mywebsite.com/blog/category/categoryA etc. My question is: if you add a robots.txt file that tells Google NOT to index pages in the "tag" and "category" folders, does that mean that the previously indexed pages will eventually disappear from Google's index? Or does it just mean that newly created pages won't get added to the index? Or does it mean nothing at all? Thanks for any insight!
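As an aside, rules like the ones described above can be sanity-checked locally with Python's standard urllib.robotparser before deploying them (the rules and example.com URLs below are hypothetical, matching the paths in the question):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt blocking the "tag" and "category" archives
robots_txt = """
User-agent: *
Disallow: /blog/tag/
Disallow: /blog/category/
""".splitlines()

rp = RobotFileParser()
rp.parse(robots_txt)

# Blog posts themselves remain crawlable...
print(rp.can_fetch("*", "https://www.example.com/blog/title-of-post"))  # True
# ...while tag and category archives are blocked
print(rp.can_fetch("*", "https://www.example.com/blog/tag/tag1"))       # False
```

Note that Disallow only stops crawling; it is not by itself a removal request for URLs already in the index.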
On-Page Optimization | williammarlow
Solve duplicate content issues by using robots.txt
Hi, I have a primary website, and besides that I also have some secondary websites that have the same content as the primary website. This leads to duplicate content errors. Because there are many duplicate-content URLs, I want to use the robots.txt file to prevent Google from indexing the secondary websites, to fix the duplicate content issue. Is that OK? Thanks for any help!
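For what it's worth, a site-wide block like the one described would be a two-line robots.txt served at each secondary domain's root (a sketch; note that Disallow prevents crawling, but does not by itself remove URLs that are already indexed):

```
# robots.txt at the secondary site's root:
# block all crawlers from the entire site
User-agent: *
Disallow: /
```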
On-Page Optimization | JohnHuynh
One Webpage per Topic or splitting up for better reading...?
What is better from an SEO point of view? I am currently building a website whose principal topic is Renewable Energies. There will be a menu listing all the energy types: Biogas CSP Biomass etc. And now my question: each topic has about 800-1000 words of unique content with sub-topics. I think it's certainly good to have one separate page for each energy type. But I don't think it's a good idea to also split the subtopics up into further sub-pages like: www.energy.com/renewable-energies-biomass.html www.energy.com/renewable-energies-biomass-eficiency.html www.energy.com/renewable-energies-biomass-market.html www.energy.com/renewable-energies-biomass-industries.html as 1000 words on one page may look like higher-quality content than 3-4 pages with just 200 words each, all talking about Biomass but from several points of view. So I think it's better to put everything about Biomass on one single page and use a menu to jump to the subtopics via anchor tags. Right? 🙂 Thanks Kate and Charles! Meanwhile I found out what the right term for my question is: "pagination". I read about using the rel="next" and rel="prev" attributes when paginating an article over different pages.
MY DOUBT: Sometimes you see a single page paginated using JavaScript that hides text (although it is all in the page source) for better reading. Does Google like that, or might it think it is hidden text with a spamming purpose? So I think using old-school named anchors to divide the text into topics (for a text of about 1000 words) is better than using JavaScript that reveals text via pagination or expand/collapse.
On-Page Optimization | inlinear
Robots.txt file
Does it serve any purpose if we omit the robots.txt file? I wonder, if a spider has to read all the pages anyway, why do we insert a robots.txt file?
On-Page Optimization | seoug_2005
Photogallery and Robots.txt
Hey everyone, SEOmoz is telling us that there are too many on-page links on the following page: http://www.surfcampinportugal.com/photos-of-the-camp/ Should we stop it from being indexed via robots.txt? Best regards and thanks in advance... Simon
On-Page Optimization | Rapturecamps
How can I reduce my webpage load time?
According to Google: 'On average, pages in your site take 4.6 seconds to load (updated on Apr 3, 2011). This is slower than 71% of sites. These estimates are of low accuracy (fewer than 100 data points). The chart below shows how your site's average page load time has changed over the last few months. For your reference, it also shows the 20th percentile value across all sites, separating slow and fast load times.' My website: http://ablemagazine.co.uk I've installed cache plugins and minify plugins, and reduced the number of posts on my main page. But my website is still taking too long to load, and I'm afraid I'm being penalised for it. Any tips?
On-Page Optimization | craven22