Disallow a spammed sub-page in robots.txt
-
Hi,
I have a sub-page on my website with a lot of spam links pointing to it. I was wondering: will Google ignore those spam links if I hide the page using robots.txt?
Will that get the page off Google's radar, or is it useless?
-
Does it rank for anything worthwhile?
Does it have any legitimate / valuable links pointing to it?
If the answer to both of those questions is no, just delete the page, recreate it at a new URL, and request removal of the old URL from Google's index (and obviously don't 301 redirect it).
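If you do delete the page, the old URL should return a 404 or 410 status rather than a redirect, so the spam links die with it. As a minimal sketch (the path `/old-spammed-page` and the port are hypothetical, not from this thread), a standard-library Python server could handle it like this:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Old URLs that have been deleted and should fall out of the index.
GONE_PATHS = {"/old-spammed-page"}

def status_for(path):
    """410 Gone for deleted spammed URLs, 200 for everything else."""
    return 410 if path in GONE_PATHS else 200

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        status = status_for(self.path)
        self.send_response(status)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        # Deliberately no 301: a redirect would pass the spam links along
        # to the new URL, which is exactly what you want to avoid.
        self.wfile.write(b"Gone" if status == 410 else b"OK")

# To serve: HTTPServer(("localhost", 8080), Handler).serve_forever()
```

A 410 ("Gone") is a slightly stronger signal than a 404 that the removal is permanent, though Google treats both as removal signals over time.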
-
Hi, my personal opinion is that if the links were unintentional or not built by you, then Google will ignore them and not penalise the site (see Rand's Whiteboard Friday video on negative SEO).
However, if it is a page that is not very important to you, then maybe you should consider removing this page from Google's index (use Google Webmaster Tools for this) and then getting Google to index a new page that has no spam links pointing to it.
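If you go the robots.txt route, the disallow rule is a one-liner, and you can sanity-check it with Python's standard-library parser before deploying. The `/spammed-page/` path and example domain below are placeholders, not from this thread. One caveat: robots.txt only blocks crawling; an already-indexed URL can stay in the index, which is why the removal request in Webmaster Tools is still needed.

```python
import urllib.robotparser

# Hypothetical robots.txt blocking just the spammed sub-page.
ROBOTS_TXT = """\
User-agent: *
Disallow: /spammed-page/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# The blocked page is no longer crawlable; the rest of the site is untouched.
print(rp.can_fetch("*", "https://www.example.com/spammed-page/"))  # False
print(rp.can_fetch("*", "https://www.example.com/"))               # True
```

Running a check like this locally is a cheap way to avoid the opposite accident (disallowing more than you intended).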