We are ignored by Google - what should we do?
-
Hi,
We believe that our website - https://en.greatfire.org - is being all but ignored by Google Search. The following two examples illustrate our case.
1. Searching for “China listening in on Skype - Microsoft assumes you approve”. This is the title of a blog post we wrote, which received some 50,000 visits. On Yahoo and Bing, we rank first for this search. On Google, however, we rank 7th. Each of the six pages ranking higher than us quotes and links to our story.
2. Searching for “Online Censorship In China”. This is the title of our front page. Yahoo and Bing both rank us third for this search. On Google, however, we are not even among the first 300 results. Two of the pages among the first 10 results link to us.
Our website has an average of around 1000 visits per day. We are quoted in and linked from virtually all Western mainstream media (see https://en.greatfire.org/press). Yet to this day we are receiving almost no traffic from Google Search.
Our mission is to bring transparency to online censorship in China. If people could find us in Google, it would greatly help to spread awareness of the extent of Internet restrictions here. If you could indicate to us what the cause of our poor rankings could be, we would be very grateful. Thank you for your time and consideration.
-
Hi Matt,
Thanks for your reply. I think the rapid gain and subsequent loss of backlinks was due to our very widely quoted and linked story in December (the Skype story, used as an example in our first post). Many websites put links to us on their front pages; inevitably, these only stay until pushed down and off the page by newer stories.
We have not created fake links anywhere. According to Google Analytics, visitors have entered our site through links on 904 websites since December 1. The top ones are Reddit, YCombinator, Twitter, habrahabr.ru, Facebook, TheNextWeb and Wikipedia. All very legitimate links, as far as I can tell.
What do you think we should do? Why does https prevent using a link profile tool?
-
Great post Matt. You nailed it.
Best regards,
Devanur Rafi.
-
http://dejanseo.com.au/hijacked/
This is a recent test - and one that may apply (though I still maintain it's your link profile).
-
Actually, I'm pretty sure your problem is in your link profile.
http://www.highonseo.com/examples/ahrefs1.jpg
The first image shows your Ahrefs backlink profile. You gained a couple of thousand backlinks almost instantly, then lost a large number just as quickly.
So my next question was "are these legit?"
Now look at image 2.
http://www.highonseo.com/examples/ahrefs2.jpg
Out of 92,293 backlinks, you have over 90,000 dofollow links, including over 80,000 sitewide links, plus 1,600 .gov links - nearly as many as your nofollow links.
My brain can't process a link profile that looks like this. I would love to pull it into a link profile tool to check the DA of your backlinks, but because you're HTTPS, I can't.
Just speculation on my part, but if someone told me they had over 97% dofollow links, nearly as many .gov links as nofollow links, and a huge gain followed by links falling off, I'd quickly believe something was wrong. I always assume Google is two steps ahead of me. So if I think this backlink profile looks wonky, they must think it's worse.
-
I heard they will give the ranking of the content to the more powerful site? Not sure if that's correct. If they thought you had copied it, then perhaps no ranking at all?
-
Yes. But shouldn't Google be good at determining that? First, they all, or almost all, link back to our original story - not the other way around. Second, our story is always published before theirs, and Google should detect that.
If this is the case, it doesn't explain why we have no ranking at all on the title of our front page.
-
Could it be that the big sites quoting some of your text are seen as the original source, as they are very high domain authority websites?
-
No problem, my friend. You are most welcome. If you wanted to go for HTTPS intentionally, then it is OK. However, it seems Google does not treat HTTPS the way it should as of now. This may change at some point - who knows, they may have already rolled out a fix and it is just under way. Bigger changes like this take time to propagate fully. Until then, all we can do is sit tight and keep our fingers crossed.
Best regards,
Devanur Rafi.
-
Thanks Devanur. Very interesting idea. However, we do want to keep our whole website on HTTPS - to make it more difficult to track what our users do on it, and also to encourage other websites to move to HTTPS as well. The more the better. For example, all of GitHub is already HTTPS-only. If HTTPS is indeed the reason, it's quite a scandal that Google can't deal with it properly.
-
Hi there,
Though as per Google it is OK (http://www.youtube.com/watch?v=xeFo4ytOk8M) to serve your entire website over HTTPS, I have personally seen many instances where HTTPS URLs find it very difficult to compete with HTTP URLs in Google.
Normally, I do not see a need to use HTTPS for plain pages that have no reason to be served securely; only pages that require a login to access might be served over HTTPS. I hope our friends here will jump in with their views.
Let me conclude by saying that I would go with HTTP for all the pages I want to rank high in Google - a view based solely on my personal experience.
Hope it helps.
Best regards,
Devanur Rafi.
Related Questions
-
Google slow to index pages
Hi, we've recently had a product launch for one of our clients. Historically speaking, Google has been quick to respond, i.e., when the page for the product goes live, it's indexed and performing for branded terms within 10 minutes (without 'Fetch and Render'). This time, however, we found that it took Google over an hour to index the pages; initially, press coverage ranked until we were indexed. Nothing major had changed in terms of page structure, content, internal linking, etc.; these were brand-new pages with new product content. Has anyone ever experienced Google having an 'off' day or being uncharacteristically slow with indexing? We have a few ideas about what could have caused this, but we were interested to see if anyone else has experienced this sort of change in Google's behaviour, either recently or previously. Thanks.
-
Google Hummingbird Update - Any Changes?
Google has updated with the new algorithm - did you see any effects? And as they are not revealing technically how it works, what's your opinion?
-
Google Indexing Feedburner Links???
I just noticed that for lots of the articles on my website, there are two results in Google's index. For instance: http://www.thewebhostinghero.com/articles/tools-for-creating-wordpress-plugins.html and http://www.thewebhostinghero.com/articles/tools-for-creating-wordpress-plugins.html?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+thewebhostinghero+(TheWebHostingHero.com) Now my Feedburner feed is set to "noindex" and it's always been that way. The canonical tag on the webpage is set to: <link rel='canonical' href='http://www.thewebhostinghero.com/articles/tools-for-creating-wordpress-plugins.html' /> The robots tag is set to: <meta name="robots" content="index,follow,noodp" /> I found out that there are scraper sites linking to my content using the Feedburner link. So should the robots tag be set to "noindex" when the requested URL is different from the canonical URL? If so, is there an easy way to do this in WordPress?
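If you do want a conditional noindex in WordPress, a small hook like the one below could work. This is a minimal sketch, not a verified fix: it assumes the duplicate URLs are identifiable by their utm_source=feedburner query parameter and that the snippet lives in the active theme's functions.php; the wp_head hook itself is standard WordPress.

```php
<?php
// Minimal sketch: emit a noindex robots meta tag for requests that arrive
// via a Feedburner-tagged URL, so only the clean canonical URL is indexed.
// Assumption: the duplicates carry utm_source=feedburner in the query string.
add_action( 'wp_head', function () {
    if ( isset( $_GET['utm_source'] ) && 'feedburner' === $_GET['utm_source'] ) {
        echo '<meta name="robots" content="noindex,follow" />' . "\n";
    }
} );
```

In principle the rel=canonical tag alone should consolidate the two URLs; if you add the noindex, you would also want to suppress the existing index,follow robots tag on those requests so the two directives don't conflict.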
-
Sites banned from Google?
How do you find out which sites are banned from Google? I know how to find out which sites are no longer cached - or is that the same thing once a site is deindexed? As always, I appreciate your advice, everyone.
-
Why does my blog rank poorly on Google?
Hi 🙂 I need help with my blog, http://www.dota2club.com/ - for many keywords it is not in the first 50 results on Google. What am I doing wrong? Can you tell me what errors/mistakes I have made and what I can do to improve my blog? Thank you!
-
Googlebot vs Google mobile bot
Hi everyone 🙂 I seriously hope you can come up with an idea for a solution to the problem below, because I am kinda stuck 😕 Situation: a client of mine has a webshop located on a hosted server. The shop is made in a closed CMS, meaning that I have very limited options for changing the code: limited access to the page head, and within the CMS I can only use JavaScript and HTML. The only place I have access to a server-side language is in the root, where a Default.asp file redirects the visitor to the specific folder where the webshop is located. The webshop has two "languages"/store views: one for normal browsers and Googlebot, and one for mobile browsers and Google's mobile bot. In Default.asp (ASP Classic) I test for the user agent and redirect the user to either the main domain or the mobile subdomain. All good, right? Unfortunately not. Now we arrive at the core of the problem. Since the mobile shop was added at a later date, Google already had most of the pages from the shop in its index, and apparently uses them as entrance pages to crawl the site with the mobile bot. Hence it never sees Default.asp (or outright ignores it), and this causes, as you might have guessed, a huge pile of duplicate content. Normally you would just place some user-agent detection in the page head and either throw Google a 301 or a rel=canonical. But since I only have access to JavaScript and HTML in the page head, this cannot be done. I'm kinda running out of options quickly, so if anyone has an idea as to how the BEEP! I get Google to index the right domains for the right devices, please feel free to comment. 🙂 Any and all ideas are more than welcome.
-
Holiday hijack lowers Google ranking
A client of ours made a mistake that has dropped their Google rankings. They posted a holiday greeting on their homepage for several weeks, and now the search engines are not picking the page up at all. Any thoughts or suggestions on how to repair this?
-
Why do I not receive Google traffic?
Over the last 4-5 months I have published over 3,000 unique articles, which I have paid well over 10,000 USD for, but I still only receive about 20 Google visitors a day for that content. I uploaded the 3,000 articles after I 301-redirected the old site to a new domain (the old site had 1,000 articles and at least 300 visits a day from Google), and all the old content receives its traffic fine (the 301 redirect is working 100 percent now, and PR went from 0 to 3). The articles are also good, ranging from 400-800 words, 90 percent of them are indexed by Google, and most have been bookmarked on Digg, Reddit, etc. The website domain is over 10 years old - alltopics.com. Why doesn't Google send me the traffic I deserve?