How to fix these unwanted URLs?
-
Right now I have a WordPress one-page website, but Google also shows wp-content URLs. Kindly check the query below in Google:
site:http://baltimoreelite.com/
How can I fix this issue?
-
Great job, Mark! I can see from this end that nearly all of those unwanted URLs have already dropped out of the results. That's far quicker than even I expected! And the ones that aren't gone are leading to a 403 Forbidden page, which is great.
One last thing you can do if you want. Because you are on HostGator, they are displaying their custom 403 error page, which has their branding all over it (nasty, kinda ugly). You could create your own simple 403 error page, add your own basic branding to it, and for instance add a line that says something like "You don't have permission to view this page or it is blocked for security reasons. Drop by the home page [link to home page] to find what you're looking for, or to conduct a search."
This basic page can be used to replace the one that HostGator provides by default so any visitors that hit it by accident will still feel like they are on your site, and will have a suggestion for what to do next. Your hosting control panel will have instructions for how & where to provide your own custom error pages.
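Just as an illustration (the filename here is only an example, and your control panel's error-page tool may handle this step for you), on an Apache host you can usually drop a simple 403.html file in your site root and point to it from .htaccess:

# Serve our own branded 403 page instead of the host's default
ErrorDocument 403 /403.html

The page itself only needs to be a few lines of plain HTML with your logo, the short message above, and a link back to the home page.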
Hope that last little tweak's useful.
Paul
-
Thank you, Paul. All is well now.
-
If I were you, Mark, I'd add it right at the top of your htaccess file. I'd also add in a descriptive comment to make the reason for the directive clear. So:
# BEGIN Remove ability to read directory indexes
Options -Indexes
# END Remove ability to read directory indexes
These lines would be inserted right at the top of the htaccess file. I would also warn, though, that I've had situations where caching plugins have overwritten such directives when they update the htaccess themselves. If that happens, you may need to try inserting it after # END WPSuperCache and before # BEGIN WordPress.
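Purely as an illustration of the layout (the plugin-generated rules are omitted here, and the block markers will vary with whatever plugins you run), the top of the file might end up looking something like this:

# BEGIN Remove ability to read directory indexes
Options -Indexes
# END Remove ability to read directory indexes

# BEGIN WPSuperCache
# ... rules written by the caching plugin ...
# END WPSuperCache

# BEGIN WordPress
# ... rewrite rules written by WordPress ...
# END WordPress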
Hope that works for you?
Paul
-
Andy and Nishada - don't forget... Adding robots.txt disallows will do nothing to get already indexed URLs out of the search index after the fact.
Paul
-
You have a much bigger problem than what can be solved just with a robots.txt file, Mark.
All of those URLs are showing up because a misconfiguration of your theme installation (likely caused by the theme developer) is allowing full display of the contents of each of those directories. In addition to polluting your search results, as you've noticed, it's also a pretty major security risk. You can see this in action by going to http://baltimoreelite.com/wp-content/themes/sintia/wpv_theme/assets/css/ What should happen when you go to that URL is that you see a blank page or receive a 403 Forbidden warning. Instead, you're seeing a full listing of the directory contents - bad news.
Since I don't know your hosting configuration, the easiest way to fix this issue is to add a line to your .htaccess file at the root of your site. This should correct all such instances. You need to add this line:
Options -Indexes
If you're not familiar, the .htaccess file is a text file which you can edit with any text editor. You'll need to use an FTP program or the file manager in your hosting control panel to access it at the root of your site. (You may also need to enable "show hidden files" in your program). I always recommend backing up the existing file before editing just in case. You'll add this new line on its own line with a blank line between it and any other lines in the file. It can go near or at the top of your htaccess file.
Now getting those URLs out of the search index is going to take a bit more work. You'll want to implement the robots.txt exclusions like Andy suggested, and then you'll need to go into Google Webmaster Tools and use the Remove URLs tool to specifically request removal of the directories you blocked with the robots.txt and that you also want removed from the search results. (The robots.txt is a critical part of this process, as Google requires it be in place in order to process the removal requests.)
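For example (the directory below is just the one visible in your results - adjust the Disallow lines to whichever directories you actually want blocked), the robots.txt at the root of the site could look something like:

User-agent: *
Disallow: /wp-content/

Each directory you block there is then what you'd submit through the Remove URLs tool.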
This, combined with the htaccess edit mentioned above, should keep those URLs from showing up again in the future.
Hope that all makes sense. If not, be sure to ask!
Paul
-
Hi Mark,
I block the same on my site (which is also a single page). Here is the content of my Robots.txt file.
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/
Sitemap: http://www.inetseo.co.uk/sitemap.xml.gz
-Andy
-
Hi Mark,
You need to tell crawlers not to index that content by modifying the robots.txt file. Below is a good link with some examples and instructions:
http://stackoverflow.com/questions/17029811/how-to-set-up-robots-txt-file-for-wordpress