Google Bombing For A Specific URL
-
The term "Beruk", which means "ape" or "monkey" in English, brings up this Wikipedia page among the first-page results: http://en.wikipedia.org/wiki/Khairy_Jamaluddin
The page does not contain the word "Beruk".
External links to the page do not contain the anchor text "Beruk".
Given the above scenario, how is the page still ranking on first page for this keyword?
-
Hi Dunamis,
I was wondering about the same thing. If Google sees historical search queries using "Term A" + "Term B", would it associate the two words so strongly that even when only one of the terms is searched, the other's relevance still carries over? It's hard to believe Google could be doing something like this.
Still an open question for now. Let's see if we get any more explanations.
-
Thanks Ryan for the historical wiki dig. But I doubt Google would let something that old still influence today's results, especially since the page was edited long ago to remove all traces of the word "Beruk".
However, this could be one possible explanation.
-
This could also have something to do with how Google determines relevance. If a user types in "black cat", sees their search results, and then immediately goes back and types in "black kitten", Google can infer that "cat" and "kitten" are related. If enough people do it, Google will figure out that when someone types "cat", they could mean "kitten". The actual algorithm is more complicated than that, of course, and Google is always learning.
So, I would guess that Google has figured out that when someone searches for "Beruk", that word is really relevant to the word "monkey". And the Wikipedia page is very relevant to monkeys, especially the type of monkey that people are looking for when they type in "Beruk".
-
If you google phrases with "N.I." in them, Google shows results with "Northern Ireland" in the SERPs (bolded and all). Maybe Google is doing something similar here?
-
Wikipedia is such an incredibly strong site that Google clearly places it on a pedestal. This is purely a case of domain rank. To learn why the term "Beruk" is associated with that page, you need to look at the page's history. In July 2008, about 150 page edits ago, a wiki reader edited the page and used the term "beruk" as an insult. That is how the term became associated with the page. http://en.wikipedia.org/w/index.php?title=Khairy_Jamaluddin&oldid=226010042 This page would be a good example for the Google team to examine and then adjust their metrics.
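Digging through revision history like this can be done programmatically: scan each revision's wikitext for the term and note when it appears. The sketch below works on a hard-coded, invented revision list; in practice you would fetch the real revisions from the MediaWiki API (`action=query&prop=revisions`).

```python
def revisions_containing(revisions, term):
    """Return timestamps of revisions whose wikitext contains `term`
    (case-insensitive)."""
    term = term.lower()
    return [ts for ts, text in revisions if term in text.lower()]

# Toy (timestamp, wikitext) history; real data would come from the
# MediaWiki revisions API for the Khairy_Jamaluddin page.
revisions = [
    ("2008-07-16", "...an editor called him a beruk as an insult..."),
    ("2008-08-01", "...vandalism reverted, insult removed..."),
    ("2011-03-10", "...current article text..."),
]
print(revisions_containing(revisions, "Beruk"))
# ['2008-07-16']
```

This makes it easy to pin down exactly which edits introduced and removed the term, rather than clicking through 150 diffs by hand.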
-
Apparently, in 2007 Jamaluddin was involved in some kind of controversy concerning an HIV-positive monkey (http://ms.wikipedia.org/wiki/Khairy_Jamaluddin#Isu_beruk; I used Google Translate, but it's not very clear).
Possibly a lot of pages just link to his wiki article using the word "Beruk" as part of the anchor text, or maybe even just as words surrounding the anchor text.
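That hypothesis is easy to test against backlink data: tally the terms appearing in the anchor text (and the text immediately around it) of links pointing at the URL. A minimal sketch, using an invented backlink list rather than real crawl data:

```python
from collections import Counter
import re

def anchor_term_counts(backlinks, target_url):
    """Tally lowercase words from anchor text and surrounding context
    of links pointing at `target_url`."""
    counts = Counter()
    for link in backlinks:
        if link["href"] == target_url:
            text = f"{link['anchor']} {link.get('context', '')}"
            counts.update(re.findall(r"[a-z]+", text.lower()))
    return counts

# Hypothetical backlink records (href, anchor, optional surrounding text).
backlinks = [
    {"href": "http://en.wikipedia.org/wiki/Khairy_Jamaluddin",
     "anchor": "Khairy Jamaluddin", "context": "dipanggil beruk"},
    {"href": "http://en.wikipedia.org/wiki/Khairy_Jamaluddin",
     "anchor": "beruk"},
    {"href": "http://example.com/other", "anchor": "beruk"},
]
counts = anchor_term_counts(
    backlinks, "http://en.wikipedia.org/wiki/Khairy_Jamaluddin")
print(counts["beruk"])
# 2
```

If "beruk" dominates the tally for that URL, anchor text alone could explain the ranking, with no on-page mention needed.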