How to go about removing bad/irrelevant links?
-
We have been made aware of a series of irrelevant links on some rather dodgy sites.
- http://www.designerdogstop.com/level-static/http://www.bestfirepits.net/some-of-the-best-vacations-for-families/2010/05/05/
- http://whatcigarsdoismoke.com/cigar-lighters/cigarette-cigar-2/
- http://dollfuss.org/build-bear-hawaii/
Absolute rubbish, I'm sure you will agree. These links must surely be doing our link profile some damage.
They are WordPress sites with no means of contacting the authors. What options are there for removing these negative pages and links?
Cheers all, any help appreciated.
-
Thanks Daniel,
Lots of additional features & improvements still in the works - updates @rmoov
Sha
-
Nice! Checking it out right now. Good luck with the project! ~^DH
-
Hi Tim,
We just opened a free beta this past weekend for a link removal management service that we developed.
The free trial will allow you to run a campaign including the domains you mentioned in your post.
The tool allows you to drop in a list of URLs, pull contact information from ICANN (the whois data mentioned above) for each of the domains, customize emails, send follow-ups, receive notifications of cleanups from webmasters, etc.
Hope it helps,
Sha
-
There's no quick route in SEO. Work slowly, and do your work properly. I think that ipositions' suggestion is by far the best one: look up the contact details, contact them, and request that they remove the link. Patience helps.
-
I do agree that not spending too much time on this is important, as it can become hugely time-consuming and inefficient. But getting rid of a few is still a help.
The time involved was the main reason for my initial post: to see if there were any quicker options available to webmasters, e.g. sending Google a list of links that you want it to disregard because they weren't generated by you... unlikely I know... :o(
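As it happens, Google later shipped exactly this feature: the Disavow Links tool, which accepts a plain-text file listing URLs (or whole domains, prefixed with `domain:`) that you want Google to ignore when assessing your link profile. Using the domains from the original post, such a file would look like:

```text
# Links we did not create and could not get removed
domain:designerdogstop.com
domain:whatcigarsdoismoke.com
http://dollfuss.org/build-bear-hawaii/
```

Lines starting with `#` are comments; the file is uploaded through Google's Webmaster Tools. Disavowing should still be a last resort after genuine removal attempts.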
Thanks for the response.
-
Have you looked for the email address of the webmaster using whois.net? That's the first thing we do when contact details are not provided on the site.
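For anyone wanting to script that whois step across a list of domains, here's a minimal sketch. It assumes the system `whois` command-line client is installed; the helper names and the email regex are illustrative, not a robust whois parser, and privacy-protected registrations will simply return no address.

```python
import re
import subprocess

# Loose pattern for email addresses appearing in raw whois output.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def extract_emails(whois_text):
    """Return unique email addresses found in raw whois output, in order."""
    seen = []
    for match in EMAIL_RE.findall(whois_text):
        if match not in seen:
            seen.append(match)
    return seen

def whois_emails(domain):
    """Run the system `whois` client for a domain and pull out contact emails."""
    output = subprocess.run(
        ["whois", domain], capture_output=True, text=True
    ).stdout
    return extract_emails(output)
```

Usage would be something like `whois_emails("designerdogstop.com")`; looping that over your bad-link domains gives you a contact list to mail removal requests to.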
I've seen a lot of people here going on about removing bad links, and where possible I agree that it should be done. However, a line must be drawn where getting a link removed becomes too time-consuming to be worth the effort. That time would be better spent creating content, reaching out to popular, relevant blogs, etc., and building enough good links to offset the bad ones.
Hope this helps