Please let me know how to improve this email backlink request
-
Hello,
How can I improve upon this email request:
Your "Links" section contains a lot of good websites, and we would like our site to be added to the list.
Our pagerank 4 website, which carries (Here I said what we carry) You have similar sites located in the "Other" Section on your link page. We would greatly appreciate being added to this list.
Sincerely,
BobW
Webmaster
Our Site Name Here
Email Address Here
Phone Number Here
-
No worries - Glad I could help out.
-
Derek,
I meant to click on "Good Answer" for your answer. You really helped. I apologize, and I will click yours first next time.
-
Try this guide. Some templates here:
http://www.seomoz.org/blog/broken-link-building-guide-from-noob-to-novice
Tailor each email as much as possible. Really look at the target site, follow them on Twitter, learn about them, and so on.
For example, I recently started following a target. I was going to send a broken link email, but soon enough they were ranting in a blog post about their brand usernames being taken by squatters and inactive accounts on Twitter/Facebook. I used that as my reason to reach out and suggest how they could claim their usernames, since I had actually run into the same problem. I didn't even mention or request links, but they are now linking to our homepage and referencing other pages on my site. All I did was sign my emails with my domain name, so they knew who I am and where I'm from.
Essentially, I made a friend by offering value and asking for nothing in exchange. That target would have been tough to get a link from otherwise.
If that hadn't worked out, I WOULD have eventually asked for the link directly, after having made a great first impression.
-
In the past, I have offered a variety of benefits to the sites I am contacting. Here are a few I can think of off the top of my head:
- Relevant and original blog posts or articles
- Sharing their site or an advertisement on my social media profiles
- Writing a testimonial for businesses or individuals I have worked with
- "How-to" articles
- Infographics or visual guides
- Bringing typos or broken links to the webmaster's attention
- A reciprocal link
-
Hi Klarke,
What would the title and first sentence of the email be if I'm doing "broken" link building?
-
Hi Derek,
What could I offer to benefit them in my case?
-
In my experience sending out link request emails, recipients always want to know how it benefits them. Whether you are offering content, infographics, a guest blog post, broken link corrections, a reciprocal link, a social media endorsement, or a testimonial for their business, I have seen the best results when I spell out how it will benefit their website.
Create a compelling subject line that mentions the benefit so you get a higher open rate. Getting them to open the email is half the battle.
Also, include your website URL in the body of the message so it's easy for them to click through and review your site:
"Our pagerank 4 website - http://www.example.com - which carries (Here I said what we carry) You have similar sites located in the "Other" Section on your link page."
Make sure your request is short, clear, and direct. You might also reword your opening sentence to:
"Would you consider adding our site to your "Links" section?"
-
Yes, they work OK. It's best to keep an Excel sheet to track who you have emailed. If you don't get a response after a couple of weeks, resend. If you still don't get a response, move on.
You can pick up some links this way, but make sure you don't spam webmasters and that your requests are relevant.
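If the spreadsheet gets unwieldy, a small script can flag who is due for a resend. This is just a rough sketch in Python: it assumes you export the sheet to a CSV called outreach.csv with email, date_sent and replied columns, and all of those names are made up for the example.

# Sketch: list contacts emailed 14+ days ago with no reply, so you know who to follow up with.
# Assumes outreach.csv with columns: email, date_sent (YYYY-MM-DD), replied (yes/no).
import csv
from datetime import date, datetime

FOLLOW_UP_AFTER_DAYS = 14

with open("outreach.csv", newline="") as f:
    for row in csv.DictReader(f):
        if row["replied"].strip().lower() == "yes":
            continue  # they answered; nothing to do
        sent = datetime.strptime(row["date_sent"], "%Y-%m-%d").date()
        days_waiting = (date.today() - sent).days
        if days_waiting >= FOLLOW_UP_AFTER_DAYS:
            print(f'Resend to {row["email"]} (no reply for {days_waiting} days)')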
-
Try using 'broken' link building.
There are several guides and posts here on SEOmoz; just search.
Basically, you scan through the list of links, and you'll probably find a few broken or 404 pages. Point those out and also recommend the addition of new resources, including your own. It works extremely well, and you don't come across as the typical emailer the webmaster probably encounters every day.
Round off that strategy with regular guest posts, link bait, and other content marketing, and you're solid.
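If you want to script the first pass, here is a rough sketch in Python for finding dead links on a target's links page. The page URL is a placeholder, and it assumes the third-party requests library is installed.

# Rough sketch: find broken/404 links on a target's "Links" page.
import requests
from html.parser import HTMLParser
from urllib.parse import urljoin

LINKS_PAGE = "http://www.example.com/links.html"  # hypothetical target page

class LinkExtractor(HTMLParser):
    """Collects every href found in <a> tags on the page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href and href.startswith(("http://", "https://", "/")):
                self.links.append(urljoin(LINKS_PAGE, href))

page = requests.get(LINKS_PAGE, timeout=10)
extractor = LinkExtractor()
extractor.feed(page.text)

for url in extractor.links:
    try:
        # HEAD keeps it light; some servers only respond properly to GET.
        status = requests.head(url, timeout=10, allow_redirects=True).status_code
    except requests.RequestException:
        status = "unreachable"
    if status == 404 or status == "unreachable":
        print(f"Broken: {url} ({status})")

Anything it flags is something you can point out in your email, along with your suggested replacement resource.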
-
Michael, do you have experience using emails like the one you outlined? How well do they work?
-
Possibly this:
Hi,
Bob here from ** * *, and I wanted to drop you an email to compliment your site. Nice layout, good info, good resources.
I was looking around at a few different sites for product/service information, and I thought yours was one of the best.
That being said, I also noticed you guys have some great content related to product/service. I currently work for a company that maintains a website that offers product/service, www.domainname.com.
We are a nationally recognized, reliable source for product/service on the web, and I was wondering if you'd be interested in exchanging links or advertising between your site and mine?
If not, thanks for the time and keep up the good work!
Thanks,
Bob
-
Yes, I read that. I could not find the name of the person I was writing to. I tried to structure the email according to that article. Do you have any specific suggestions?
-
Hi,
There was actually a great post on this subject a few days ago that's worth taking a look at. Based on it, I think you could improve the structure of your email: http://www.seomoz.org/blog/how-to-write-email-to-get-a-better-response-rate
Related Questions
-
What's the best way to noindex pages but still keep backlinks equity?
Hello everyone, Maybe it is a stupid question, but I'll ask the experts... What's the best way to noindex pages but still keep the backlink equity from those noindexed pages? For example, let's say I have many pages that look similar to a "main" page, which is the only one I want to appear on Google, so I want to noindex every page except that "main" page... but what if I also want to transfer any link equity present on the noindexed pages to the main page? The only solution I have thought of is to add a canonical tag pointing to the main page on those noindexed pages... but will that work, or will it wreak havoc in some way?
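A rough sketch of how the setup described above could be checked: fetch each variant page and confirm it serves both a noindex directive and a canonical pointing at the main page. The URLs are placeholders, and only the Python standard library is used.

# Sketch: verify that variant pages carry noindex plus a canonical to the main page.
import urllib.request
from html.parser import HTMLParser

MAIN_PAGE = "https://www.example.com/main-page"      # placeholder
VARIANT_PAGES = [
    "https://www.example.com/variant-1",             # placeholders
    "https://www.example.com/variant-2",
]

class HeadParser(HTMLParser):
    """Grabs the robots meta content and the canonical href, if present."""
    def __init__(self):
        super().__init__()
        self.robots = None
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            self.robots = attrs.get("content") or ""
        elif tag == "link" and (attrs.get("rel") or "").lower() == "canonical":
            self.canonical = attrs.get("href") or ""

for url in VARIANT_PAGES:
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")
    parser = HeadParser()
    parser.feed(html)
    noindexed = bool(parser.robots and "noindex" in parser.robots.lower())
    canonical_ok = parser.canonical == MAIN_PAGE
    print(f"{url}: noindex={noindexed}, canonical_to_main={canonical_ok}")

Whether Google will actually honor the canonical on a noindexed page is exactly the open question here, so treat this as a way to see what the pages currently serve rather than as a recommendation.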
Intermediate & Advanced SEO | | fablau3 -
Anyone actually getting a noticeable SEO boost from a Bitly or TinyURL backlink?
Hi, I'm looking for an example/use case of someone whose site has been linked to from another site using a Bitly or other generic URL-shortener link. I'm specifically interested in proving/disproving the value of that backlink in terms of a boost in SEO rankings. Ideally you somehow got a juicy backlink from a reputable site, but they accidentally linked to you using a Bitly or something, yet you saw a noticeable increase in your pages' search rankings, thus proving that a Bitly link still passes full SEO value. Or alternatively, you got that juicy backlink, noticed nothing at all (or not much), and are frustrated that they used a Bitly. I'm launching a study on this soon to identify the possible value behind short links as backlinks. Yes, I know that Matt Cutts says all short links are 301 redirects, which pass something like 99.9% of link juice. I'd just like to see some use cases on this. Thanks!
Intermediate & Advanced SEO | | Rebrandly0 -
301 Redirect Showing Up as Thousands Of Backlinks?
Hi Everyone, I'm currently doing quite a large backlink audit on my company's website and there's one thing that's bugging me. Our website used to be split into two domains for separate areas of the business, but since then we have merged them into one domain and 301 redirected the old domain to the main one. But now, both GWT and Majestic are telling me that I've got 12,000 backlinks from that domain? This domain didn't even have 12,000 pages when it was live, and I only did specific 301 redirects (i.e. for specific URLs, not a domain-level 301 redirect) for about 50 of the URLs, with all the rest being redirected to the homepage. Therefore I'm quite confused about why it's showing up as so many backlinks - old redirects I've done don't usually show as backlinks at all. UPDATE: I've got some more info on the specific backlinks. But now my question is - is having this many backlinks/redirects from a single domain going to be viewed negatively in Google's eyes? I'm currently doing a reconsideration request and would like to fix this issue if having so many backlinks from a single domain is against Google's guidelines. Does anybody have any ideas? Probably something very obvious. Thanks! Sam
Intermediate & Advanced SEO | | Sandicliffe0 -
Blog On Subdomain - Do backlinks to the blog posts on Subdomain count as links for main site?
I want to put a blog on my site. The IT department is asking that I use a subdomain (myblog.mysite.com) instead of a subfolder (mysite.com/myblog). I am worried because it was my understanding that any links I get to my blog posts (if on a subdomain) will not count toward the main site (search engines would treat it almost as a separate website). The main purpose of this blog is to attract backlinks. That is why I prefer the subfolder location for the blog. Can anyone tell me if I am thinking about this right? Another solution I am being offered is to use a reverse proxy. Thoughts? Thank you for your time.
Intermediate & Advanced SEO | | ecerbone0 -
Does anyone know of any tools that can help split up xml sitemap to make it more efficient and better for seo?
Hello All, We want to split up our sitemap. Currently it's almost 10K pages in one XML sitemap, but we want to break it into smaller chunks, splitting it by category, location, or both. I read that around 100 URLs per sitemap is the best number to help improve indexation and SEO ranking - any thoughts on this? Does anyone know of any good tools out there which can assist us in doing this? Also, another question: should we put all of our products (1,250) in one sitemap, or should this also be split up, say into products per category, etc.? Thanks, Pete
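The splitting described above can be scripted without a dedicated tool. A rough sketch in Python follows; it assumes the URLs already sit in a plain urls.txt file, and the chunk size, file names and base URL are placeholders.

# Sketch: split a flat list of URLs into small sitemap files plus a sitemap index.
from xml.sax.saxutils import escape

CHUNK_SIZE = 100
BASE = "https://www.example.com"  # placeholder: where the sitemap files will live

with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

chunks = [urls[i:i + CHUNK_SIZE] for i in range(0, len(urls), CHUNK_SIZE)]

sitemap_names = []
for n, chunk in enumerate(chunks, start=1):
    name = f"sitemap-{n}.xml"
    sitemap_names.append(name)
    with open(name, "w") as out:
        out.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        out.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for url in chunk:
            out.write(f"  <url><loc>{escape(url)}</loc></url>\n")
        out.write("</urlset>\n")

# The index file points search engines at every smaller sitemap.
with open("sitemap-index.xml", "w") as out:
    out.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    out.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
    for name in sitemap_names:
        out.write(f"  <sitemap><loc>{BASE}/{name}</loc></sitemap>\n")
    out.write("</sitemapindex>\n")

Splitting by category or location would just mean grouping the URL list before chunking it.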
Intermediate & Advanced SEO | | PeteC120 -
Blog comments - backlinks - question
Hi, I see that many good websites have backlinks from very good blogs/sites which are relevant. What I noticed is that everyone uses their real name or a generic name in comments; they do not use a keyword as the name. So later they get backlinks with their names as the anchor text... So, my question is: is this a good technique? Do I get any benefit from these backlinks for my website? With such a technique, is it enough just to leave your real name, or may I periodically put a keyword as the name? Thank you
Intermediate & Advanced SEO | | Ivek990 -
May I know the meaning of these parameters in .htaccess?
Intermediate & Advanced SEO | | esiow2013
# Begin HackRepair.com Blacklist
RewriteEngine on
# Abuse Agent Blocking
RewriteCond %{HTTP_USER_AGENT} ^BlackWidow [NC,OR]
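# (What these parameters mean: each RewriteCond tests the visitor's User-Agent
#  header against a pattern. [NC] makes the match case-insensitive, and [OR]
#  chains the condition with the next one, i.e. "match this OR the next". The
#  closing RewriteRule answers with 403 Forbidden ([F]) and stops further rule
#  processing ([L]) for any client whose user agent matched, which is how this
#  list blocks known scrapers and download bots.)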
RewriteCond %{HTTP_USER_AGENT} ^Bolt\ 0 [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Bot\ mailto:craftbot@yahoo.com [NC,OR]
RewriteCond %{HTTP_USER_AGENT} CazoodleBot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^ChinaClaw [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Custo [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Default\ Browser\ 0 [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^DIIbot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^DISCo [NC,OR]
RewriteCond %{HTTP_USER_AGENT} discobot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Download\ Demon [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^eCatch [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ecxi [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^EirGrabber [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^EmailCollector [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^EmailSiphon [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^EmailWolf [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Express\ WebPictures [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^ExtractorPro [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^EyeNetIE [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^FlashGet [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^GetRight [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^GetWeb! [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Go!Zilla [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Go-Ahead-Got-It [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^GrabNet [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Grafula [NC,OR]
RewriteCond %{HTTP_USER_AGENT} GT::WWW [NC,OR]
RewriteCond %{HTTP_USER_AGENT} heritrix [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^HMView [NC,OR]
RewriteCond %{HTTP_USER_AGENT} HTTP::Lite [NC,OR]
RewriteCond %{HTTP_USER_AGENT} HTTrack [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ia_archiver [NC,OR]
RewriteCond %{HTTP_USER_AGENT} IDBot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} id-search [NC,OR]
RewriteCond %{HTTP_USER_AGENT} id-search.org [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Image\ Stripper [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Image\ Sucker [NC,OR]
RewriteCond %{HTTP_USER_AGENT} Indy\ Library [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^InterGET [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Internet\ Ninja [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^InternetSeer.com [NC,OR]
RewriteCond %{HTTP_USER_AGENT} IRLbot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ISC\ Systems\ iRc\ Search\ 2.1 [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Java [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^JetCar [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^JOC\ Web\ Spider [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^larbin [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^LeechFTP [NC,OR]
RewriteCond %{HTTP_USER_AGENT} libwww [NC,OR]
RewriteCond %{HTTP_USER_AGENT} libwww-perl [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Link [NC,OR]
RewriteCond %{HTTP_USER_AGENT} LinksManager.com_bot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} linkwalker [NC,OR]
RewriteCond %{HTTP_USER_AGENT} lwp-trivial [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Mass\ Downloader [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Maxthon$ [NC,OR]
RewriteCond %{HTTP_USER_AGENT} MFC_Tear_Sample [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^microsoft.url [NC,OR]
RewriteCond %{HTTP_USER_AGENT} Microsoft\ URL\ Control [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^MIDown\ tool [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Mister\ PiX [NC,OR]
RewriteCond %{HTTP_USER_AGENT} Missigua\ Locator [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Mozilla.*Indy [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Mozilla.NEWT [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^MSFrontPage [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Navroad [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^NearSite [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^NetAnts [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^NetSpider [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Net\ Vampire [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^NetZIP [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Nutch [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Octopus [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Offline\ Explorer [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Offline\ Navigator [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^PageGrabber [NC,OR]
RewriteCond %{HTTP_USER_AGENT} panscient.com [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Papa\ Foto [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^pavuk [NC,OR]
RewriteCond %{HTTP_USER_AGENT} PECL::HTTP [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^PeoplePal [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^pcBrowser [NC,OR]
RewriteCond %{HTTP_USER_AGENT} PHPCrawl [NC,OR]
RewriteCond %{HTTP_USER_AGENT} PleaseCrawl [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^psbot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^RealDownload [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^ReGet [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Rippers\ 0 [NC,OR]
RewriteCond %{HTTP_USER_AGENT} SBIder [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^SeaMonkey$ [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^sitecheck.internetseer.com [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^SiteSnagger [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^SmartDownload [NC,OR]
RewriteCond %{HTTP_USER_AGENT} Snoopy [NC,OR]
RewriteCond %{HTTP_USER_AGENT} Steeler [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^SuperBot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^SuperHTTP [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Surfbot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^tAkeOut [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Teleport\ Pro [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Toata\ dragostea\ mea\ pentru\ diavola [NC,OR]
RewriteCond %{HTTP_USER_AGENT} URI::Fetch [NC,OR]
RewriteCond %{HTTP_USER_AGENT} urllib [NC,OR]
RewriteCond %{HTTP_USER_AGENT} User-Agent [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^VoidEYE [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Web\ Image\ Collector [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Web\ Sucker [NC,OR]
RewriteCond %{HTTP_USER_AGENT} Web\ Sucker [NC,OR]
RewriteCond %{HTTP_USER_AGENT} webalta [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^WebAuto [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^[Ww]eb[Bb]andit [NC,OR]
RewriteCond %{HTTP_USER_AGENT} WebCollage [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^WebCopier [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^WebFetch [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^WebGo\ IS [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^WebLeacher [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^WebReaper [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^WebSauger [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Website\ eXtractor [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Website\ Quester [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^WebStripper [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^WebWhacker [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^WebZIP [NC,OR]
RewriteCond %{HTTP_USER_AGENT} Wells\ Search\ II [NC,OR]
RewriteCond %{HTTP_USER_AGENT} WEP\ Search [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Wget [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Widow [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^WWW-Mechanize [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^WWWOFFLE [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Xaldon\ WebSpider [NC,OR]
RewriteCond %{HTTP_USER_AGENT} zermelo [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Zeus [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^(.)Zeus.Webster [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ZyBorg [NC]
RewriteRule ^. - [F,L]
# Abuse bot blocking rule end
# End HackRepair.com Blacklist
-
50,000 backlinks in webmaster tools from one site???
Hi All, I'm new to evaluating backlinks, but I just saw that I got over 50,000 links from a link that was added on ONE page of this site: http://www.netnewspublisherDOTcom. I presume this is not a good thing, and if I contact them to remove the one link on the one page, it won't solve the other 49,999 links that Google is seeing pointing to us, so what do I do? Should I contact them, ask them to remove it, and disavow if they don't? Or would you just tell Google to disavow the whole site? Thanks!
Intermediate & Advanced SEO | | mlm120