How to fix these unwanted URLs?
-
Right now I have a WordPress one-page website, but Google also shows wp-content. Kindly check below in Google.
site:http://baltimoreelite.com/
How can I fix this issue?
-
Great job, Mark! I can see from this end that nearly all of those unwanted URLs have already dropped out of the results. That's far quicker than even I expected! And the ones that aren't gone are leading to a 403 Forbidden page, which is great.
One last thing you can do if you want. Because you are on HostGator, they are displaying their custom 403 error page, which has their branding all over it (nasty, kinda ugly). You could create your own simple 403 error page, add your own basic branding to it, and for instance add a line that says something like "You don't have permission to view this page or it is blocked for security reasons. Drop by the home page [link to home page] to find what you're looking for, or to conduct a search."
This basic page can be used to replace the one that HostGator provides by default so any visitors that hit it by accident will still feel like they are on your site, and will have a suggestion for what to do next. Your hosting control panel will have instructions for how & where to provide your own custom error pages.
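If you go that route, wiring up the custom page is usually a one-line .htaccess addition on an Apache host like HostGator. A minimal sketch, assuming you save your page as 403.html in the site root (that filename is just an example, not something HostGator requires):

# Serve our own branded error page instead of the host's default 403
ErrorDocument 403 /403.html

Once that's in place, the same page is shown for any forbidden request, including the blocked directory listings.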
Hope that last little tweak's useful.
Paul
-
Thank You Paul. All is well now.
-
If I were you, Mark, I'd add it right at the top of your htaccess file. I'd also add in a descriptive comment to make the reason for the directive clear. So:
# BEGIN Remove ability to read directory indexes
Options -Indexes
# END Remove ability to read directory indexes
These lines would be inserted right at the top of the htaccess file. I would also warn, though, that I've had situations where caching plugins have overwritten such directives when they update the htaccess themselves. If that happens, you may need to try inserting it after #END WPSuperCache and before # BEGIN WordPress, as in the sketch below.
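For illustration, here's roughly how that fallback placement might look; the marker lines are the ones mentioned above, and they can vary between plugins and versions:

#END WPSuperCache

# Remove ability to read directory indexes
Options -Indexes

# BEGIN WordPress

The key point is simply that the directive sits outside any block a plugin rewrites, so it survives plugin updates.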
Hope that works for you?
Paul
-
Andy and Nishada - don't forget... Adding robots.txt disallows will do nothing to get already indexed URLs out of the search index after the fact.
Paul
-
You have a much bigger problem than what can be solved just with a robots.txt file, Mark.
All of those URLs are showing up because a misconfiguration of your theme installation (likely caused by the theme developer) is allowing full display of all of the content of each of those directories. In addition to polluting your search results, as you've noticed, it's also a pretty major security risk.
You can see this in action by going to http://baltimoreelite.com/wp-content/themes/sintia/wpv_theme/assets/css/ - what should happen when you go to that URL is that you see a blank page or receive a 403 Forbidden warning. Instead, you're seeing a full listing of the directory contents - bad news.
Since I don't know your hosting configuration, the easiest way to fix this issue is to add a line to your .htaccess file at the root of your site. This should correct for all such instances. You need to add this line:
Options -Indexes
If you're not familiar, the .htaccess file is a text file which you can edit with any text editor. You'll need to use an FTP program or the file manager in your hosting control panel to access it at the root of your site. (You may also need to enable "show hidden files" in your program.) I always recommend backing up the existing file before editing, just in case. You'll add this new line on its own line, with a blank line between it and any other lines in the file. It can go near or at the top of your htaccess file - see the sketch below.
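To make that concrete, here's roughly what the top of a typical WordPress .htaccess might look like after the edit. This is only a sketch - the rewrite block shown is the stock WordPress one, and your file's existing contents may differ:

# Remove ability to read directory indexes
Options -Indexes

# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
# END WordPress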
Now getting those URLs out of the search index is going to take a bit more work. You'll want to implement the robots.txt exclusions like Andy suggested, and then you'll need to go into Google Webmaster Tools and use the Remove URLs tool to specifically request removal of the directories you blocked with the robots.txt and that you also want removed from the search results. (The robots.txt is a critical part of this process, as Google requires it be in place in order to process the removal requests.)
This, combined with the htaccess edit mentioned above, should keep those URLs from showing up again in the future.
Hope that all makes sense. If not, be sure to ask!
Paul
-
Hi Mark,
I block the same on my site (which is also a single page). Here is the content of my robots.txt file.
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/

Sitemap: http://www.inetseo.co.uk/sitemap.xml.gz
-Andy
-
Hi Mark,
You need to tell crawlers not to index that content by modifying the robots.txt file. Below is a good link with some examples and instructions:
http://stackoverflow.com/questions/17029811/how-to-set-up-robots-txt-file-for-wordpress