Is it possible to have good SEO without links and with only quality content?
-
Is it possible to have good SEO without links and with only quality content? Do you have any experience with this?
-
Alex, sorry it's taken a bit for us to get this one published -- but I wanted to let you know, this Whiteboard Friday will be published tomorrow morning, 10/24.
-
Possible? Yes. Likely? No. And I'm assuming that by good SEO you mean ranking well in Google.
Links are still the biggest factor for ranking. Matt Cutts repeated this again recently and studies back it up. Don't let the anti-link builders, pro-relationship builders, or whatever they're calling themselves at the moment brainwash you.
-
Hi Chris, Rand, Travis, Zippy, and all the Moz fans,
At our agency we have had very good results on some sites with quality content alone, but only on sites facing easy competition, and because the quality of the content was earning natural links (as it should :)).
My answer to the question "Is it possible to have good SEO without links and with only quality content?" is: yes and no.
You can do good SEO with quality content alone only because that content slowly earns good links. So my answer to the question "Is it possible to have good SEO without link building and with only quality content?" is: YES.
Link building is a dialogue, not a one-off order; it is an alliance of mutual benefit rather than a purchase. -
...great
-
I've managed a few campaigns where the client had zilch domain age, in a competitive space. My team and I squeezed everything we could out of on-page. The results were in line with my expectations. (Local targeting. The clients showed on the first page within a couple of weeks. I have high expectations.)
Granted, we do get a handful of links at the beginning; not doing so would be just crazy talk. Though I realize this is a hypothetical discussion.
What I will say is that I'm getting more traction with fewer links. So either we're getting stupidly lucky with links, or we've become god-like with on-page. Realistically, I think on-page is getting a significant boost and we're doing as well as we've ever done; perhaps a bit better, given experience.
-
Hi Alex,
I have some trepidation about going up against Whiteboard Friday, but my experience is that it is possible for less competitive keywords. I do in-house SEO for a company in an industrial B2B market. To a large extent there are few link-building opportunities, and most of the ones that exist are on directory sites. There are no blogs, and social media is non-existent.
So we target about 100 keywords that have a Moz difficulty of between 17 and 25%. They probably get about 50-200 global exact-match searches a month on Google. A single converting enquiry can lead to $200,000 in sales.
So, given that we, and all our competitors, get little support from link building, the battle is all about on-page optimisation. Of maybe 100 global competitors, about 20 have a web presence that is more than trivial. Of these, three companies (including mine) dominate the search rankings: 98% of the 1-3 positions for the keywords we target are held by one of these three.
Page and domain authorities are in the low thirties, and many product pages have a PA of 1. Life, to a large extent, consists of identifying new, non-obvious keywords for link-bait articles that then drive traffic to product pages, and of taking existing keywords and breaking them apart into more exact matches.
-
Hi Alex - I actually filmed a Whiteboard Friday about this today! In the next few weeks, you should see it go up on the main blog (and I cited you in there - hope that's OK).
-
Alex,
It is possible to have good on-page SEO, meaning that the site is crawlable, the copy aligns with the meta data, internal linking and navigation are worded correctly, and keyword research was done appropriately. However, if the keywords you've chosen to target are also targeted by competitors whose sites/pages have backlinks pointing at them, it can be very difficult, if not impossible, to compete against them without sufficient backlinks of your own. It boils down to the fact that links are an important ranking factor, and most of the time (unless you target super-uncompetitive keywords) you need them to be competitive.
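Several answers above mention "copy aligning with meta data" as an on-page basic. As a minimal sketch of what an automated check for that could look like, the snippet below parses a page and verifies that the title and meta description both target a keyword, using only Python's standard library. The sample HTML, the keyword, and the class name are hypothetical, not taken from any commenter's toolchain.

```python
# Hypothetical on-page check: extract <title> and the meta description
# from a page and confirm both contain the target keyword.
from html.parser import HTMLParser

class OnPageCheck(HTMLParser):
    """Collects the <title> text and meta description of a page."""

    def __init__(self):
        super().__init__()
        self.title = ""
        self.meta_description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name", "").lower() == "description":
                self.meta_description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

# Hypothetical sample page; in practice this would be fetched markup.
SAMPLE_HTML = """
<html><head>
<title>Industrial Widget Couplings | Acme Corp</title>
<meta name="description" content="Precision industrial widget couplings for B2B buyers.">
</head><body><h1>Industrial Widget Couplings</h1></body></html>
"""

keyword = "widget couplings"
checker = OnPageCheck()
checker.feed(SAMPLE_HTML)

print(keyword in checker.title.lower())             # does the title target the keyword?
print(keyword in checker.meta_description.lower())  # does the meta description align?
```

A real audit would run a check like this across every indexable URL on the site, but the idea is the same: on-page alignment is mechanically verifiable, whereas earning links is not.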