How long does a new domain need to reach a specific level of trust?
-
We are a small start-up in Germany in the sports and health sector. We are currently building a network of people in that sector and giving each person a separate WordPress blog. The idea is to create a big network of experts.
My question is: how long does it take for Google to trust a completely new URL?
We set up each project and create content on the page. Each week the owner of the site puts up an expert article that contains keywords, and we build certain links from other blogs, etc.
Also, do you think it is more important for a site to get, say, 20 backlinks from anywhere, or 5 backlinks from very trusted blogs?
-
I would not give the experts blogs on a wordpress.com subdomain. Be sure that when they blog, it is in a subfolder of your own website, so that when links are built they benefit your site directly and not WordPress.
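One way to implement that advice is a WordPress multisite network configured for subfolder installs, so each expert's blog lives at example.com/expert-name/ rather than on a separate subdomain. A sketch of the relevant `wp-config.php` constants (the domain below is a placeholder):

```php
// Multisite network with subfolder installs, so inbound links
// accrue to the main domain rather than a wordpress.com subdomain.
// 'example.com' is a placeholder for your actual domain.
define( 'MULTISITE', true );
define( 'SUBDOMAIN_INSTALL', false );          // subfolders, not subdomains
define( 'DOMAIN_CURRENT_SITE', 'example.com' );
define( 'PATH_CURRENT_SITE', '/' );
define( 'SITE_ID_CURRENT_SITE', 1 );
define( 'BLOG_ID_CURRENT_SITE', 1 );
```

With this setup, each expert blog created in the network is served under a path of the main domain, so links earned by any one blog strengthen the site as a whole.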
-
I would suggest you take a look at this page on MozTrust. As the name indicates, MozTrust is a metric that measures trust factors for a website.
MozTrust and PR are similar metrics. They are both attempts to determine a site's importance and credibility. The largest factor is your site's ability to earn credible external links from other credible sites.
Example 1- you create a natural viagra-like sex supplement using common ingredients from other similar pills. You set up an e-commerce site and sell your product. It will likely take you a long time (i.e. years) to build up trust unless you pour an enormous amount of resources (i.e. money) into the site.
Example 2 - you create a natural viagra-like sex supplement based on credible research from UC Berkeley or another credible institution. You perform authentic studies by doctors and the results are published in the New England Journal of Medicine and other credible medical journals. The doctors and researchers involved in the study all post numerous articles on your site, and respond to questions.
As a result of the above activities, the New York Times, CNN and other credible news sources cover the story and link to your site. Additionally the doctors involved with the study are asked to be interviewed on Oprah and other television shows. All the media hype turns into hundreds of links from highly credibly sites and a lot of social media buzz.
The second example can help a brand new site very quickly earn a lot of trust. Then the product begins selling, authentic testimonials are received, further research is performed, more doctors and patients begin working with the product, leading to even more credibility and trust.
-
Thanks for your quick answer, Ryan.
What I mean by trust is that at a certain point Google starts to trust a website based on the content it has. Google pays more attention to that website, and links from it count for more. It gets a kind of jump in importance.
At least, that is what I have noticed. Do you know if there are key factors that trigger this, or if there is a certain time period that Google needs?
-
How long does it take for Google to trust a completely new URL?
Trust is earned over time with links. Some sites will gain it very quickly, while others will never achieve high levels of trust.
The first question is: how exactly do you define "trust"? You could use PR to measure trust, but everything is relative. If you only consider a PR 10 site trustworthy, it is 99.99% likely that your site will never be trustworthy. As of August 4th, 2011, there are only 14 PR 10 websites (pages) worldwide, yet there are tens of millions of websites.
Even if you establish a certain level as trustworthy, such as PR 7, the next issue is measuring PR. Google only releases toolbar PR updates 3-4 times each year, though the figures are updated daily internally.
If you decided PR 7 was your goal (as an example), then it is possible to reach it quickly if you can pump enough resources (i.e. money) into the site. If you create a well-designed, quality site offering a product, service, or information credible enough to generate sufficient interest, it can certainly be done.
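The way link-based importance accumulates can be sketched with a toy PageRank-style computation. This is an illustrative model only, not Google's actual algorithm: the graph, damping factor, and iteration count are all assumptions. It does show why one focused link from a hub passes more weight than being one of a hundred links on a link farm, since each page divides its vote among its outbound links.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank: links maps each page to the pages it links to."""
    pages = set(links)
    for targets in links.values():
        pages.update(targets)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Every page gets a small base score plus shares of the
        # rank of each page linking to it.
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, targets in links.items():
            if targets:
                share = damping * rank[page] / len(targets)
                for t in targets:
                    new_rank[t] += share
        rank = new_rank
    return rank

# A hub that links only to "new-site" passes its full vote; a link
# farm splits its vote across 100 outbound links, so "new-site-2"
# receives only a sliver. All page names here are hypothetical.
ranks = pagerank({
    "focused-hub": ["new-site"],
    "link-farm": ["new-site-2"] + [f"filler-{i}" for i in range(99)],
})
```

Under this toy model, `ranks["new-site"]` comes out well above `ranks["new-site-2"]`, which is the intuition behind preferring a few trusted links over many indiscriminate ones.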
Do you think it is more important for a site to get, say, 20 backlinks from anywhere, or 5 backlinks from very trusted blogs?
I would prefer one quality in-content link from a trusted blog or other quality source over 100 "backlinks from anywhere".