Disallow a spammed sub-page from robots.txt
-
Hi,
I have a sub-page on my website with a lot of spam links pointing to it. I was wondering whether Google will ignore those spam links if I hide the page using robots.txt.
Will that get the page off Google's radar, or is it useless?
-
Does it rank for anything worthwhile?
Does it have any legitimate / valuable links pointing to it?
If the answer to both of those questions is no, just delete the page, recreate it at a new URL, and request removal of the old URL from Google's index (and obviously don't 301 redirect it).
-
Hi, my personal opinion is that if the links were unintentional or not placed by you, then Google will ignore them and not penalise the site (see Rand's Whiteboard Friday video on negative SEO).
However, if it is a page that is not very important to you, then maybe you should consider removing this page from Google's index (use Google Webmaster Tools for this) and then getting Google to re-index a new page that has no spam links pointing to it?
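For what it's worth, hiding a single sub-page via robots.txt looks like the sketch below (the path /spammed-page/ is a hypothetical placeholder, not from the question). Keep in mind that Disallow only stops crawling; a URL that is already indexed can still show up in results, which is why the delete-and-request-removal route above is usually suggested alongside or instead of this.

```
# robots.txt at the site root
# Block all crawlers from one sub-page (path is illustrative)
User-agent: *
Disallow: /spammed-page/
```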
Related Questions
-
Should we rename and update a page or create a new page entirely?
Hi Moz Peoples! We have a small site with simple site navigation and only a few links on the nav bar. We have been doing some work to create a new page, which will eventually replace one of the links on the nav bar. The question we are having is: is it better to rename the existing page, replace its content, and then wait for the great indexer to do its thing, or to permanently delete the page and replace it with the new page and content? Or is this a case where it really makes no difference as long as the redirects are set up correctly?
On-Page Optimization | Parker8180
Unique Pages with Thin Content vs. One Page with Lots of Content
Is there anyone who can give me a definitive answer on which of the following situations is preferable from an SEO standpoint for the services section of a website?
1. Many unique and targeted service pages with the primary keyword in the URL, title tag and H1, but with the tradeoff of having thin content on the page (i.e. 100 words of content or less).
2. One large service page listing all services in the content. The primary keyword for the URL, title tag and H1 would be something like "(company name) services" and each service would be in an H2 title. In this case, there is lots of content on the page.
Yes, the ideal situation would be to beef up content for each unique page, but we have found that this isn't always an option based on the amount of time a client has dedicated to a project.
On-Page Optimization | RCDesign741
123 keywords for a page
Hey Moz fans, I have a site that has 130 keywords. Can I target this amount and just incorporate them as Ryan discussed before?
On-Page Optimization | atakala0
Pages or Blog posts?
Hi, I am currently building content for a customer's website. There are approximately 50 new content pages I am building about the business, the products they serve, how-tos, and tips and advice. The website is built on WordPress, so my question is: would it be best to post this content as separate blog posts, or as separate pages in WordPress linked up to a 'hub page', as mentioned in this post about How to rank (point 16)? Thanks for any advice.
On-Page Optimization | btiffin0
Pages exclusion
Hi there! I don't want to "make relevant" 3 of the pages my website has. Should I give them a title and a description even if I don't want them to be shown in the SERPs? Is there any penalty from Google if I don't do so? I guess not, but just want to confirm. Thanks!
On-Page Optimization | juanmiguelcr0
How do i block an entire category/directory with robots.txt?
Does anyone have any idea how to block an entire product category, including all the products in that category, using the robots.txt file? I'm using WooCommerce in WordPress and I'd like to prevent bots from crawling every single one of my product URLs for now. The confusing part right now is that I have several different URL structures linking to every single one of my products, for example www.mystore.com/all-products, www.mystore.com/product-category, etc. I'm not really sure how I'd type it into the robots.txt file, or where to place the file. Any help would be appreciated, thanks.
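For reference, a minimal robots.txt sketch for this kind of setup, assuming the paths from the question (/product-category/ and /all-products are illustrative placeholders; the actual WooCommerce slugs depend on the store's permalink settings). The file must live at the site root, e.g. www.mystore.com/robots.txt:

```
# robots.txt at the site root (www.mystore.com/robots.txt)
User-agent: *
# Block the category page and everything beneath it
Disallow: /product-category/
# Block the flat all-products listing as well
Disallow: /all-products
```

A trailing slash on a Disallow rule matches everything under that directory, so each distinct URL structure pointing at the products needs its own rule.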
On-Page Optimization | bricerhodes0
Duplicate Page Titles
I have over 200 duplicate page titles on a site that I am working on. Does putting a date at the end of some of them make it a unique enough title?
On-Page Optimization | SavingSense0