Do some sites get preference from Google just because? A grandfathering theory
-
So I have a theory that Google "grandfathers" in a handful of old websites in every niche, and that no matter what those sites do, they will always have the authority to rank highly for the niche's relevant keywords.
I have a website in the crafts/cards/printables niche. One of my competitors is http://printable-cards.gotfreecards.com/
This site ranks for everything... http://www.semrush.com/info/gotfreecards.com+(by+organic)
Yet when I visit their site, I notice duplicate content all over the place (extremely thin content, if any at all, on some pages that rank for highly searched keywords), paginated pages that should be noindexed, bad URL structure, and an overall unfriendly user experience. The backlink profile isn't very impressive either, as most of the good links come from their other site, www.got-free-ecards.com.
Can someone tell me why this site ranks for what it does, other than the fact that it's around 5 years old and potentially gets some kind of preference from Google?
-
Hello,
I understand clearly what you mean, and I can say I've been on the other side of it.
I was among the first results in a smaller country for the most popular "blog"-related queries. The page was part of a more general website, was very solid, and solved most of the problems for users searching those particular queries. Most people clicked my search result and were pretty satisfied by what they found there. The website was old school, the layout old school. I had an amazingly attractive title and meta description; I basically nailed it.
Then bigger brands and huge websites launched pages for the same queries, solving far more problems and handling the topic in a new way. But I was still first. Brands with PageRank 5 to 7 were competing against my page, which had zero or close to zero PageRank. I didn't have even a fraction of their links and authority. I even laughed, seeing year after year that I ranked 1st above these big players while they sat in 2nd, 3rd, and so on. Many years passed and I was still first. It was really funny, and I tried to learn from it.
Then I decided to refresh and modify the layout, because it was old school. I had some problems with internal linking, and the domain was down for a while. Then somebody hacked my server and injected some code. I solved most of the problems; it was not easy. But when I came back, I had lost the top spot. A lot had changed, but the URL and the content of that particular page were exactly the same.
So, from personal experience I can tell you that things can change.
I had the following:
-
I was the first to cover that area
-
a lot of users were clicking on my website - the CTR in Webmaster Tools was amazingly high, and the bounce rate was low
And I can tell you that one of those ranking factors talked about a lot on SEOmoz - "User Usage and Traffic/Query Data" - weighs far more than people think, at least in my experience. Anyway, try to ask yourself the following questions:
-> Are the differences between your website and the old one significant? Do users see them in the search listing, and do they consider them important? If not, try to give them a ten times better reason to click on your website, and also give them what they want (sometimes the bounce rate has something to say about this, but not always).
It may look like grandfathering; it's really hard to confirm or dismiss. At first I thought about it the same way you do. However, a good first step is to answer the questions above honestly, from the user's point of view.
Good luck!
-