City and state link stuffing in footer
-
A competitor has links to every state in the U.S., every county in our state and nearby states, and every city in those nearby states, all with corresponding link text and titles that lead to pages with thin, duplicate content. They consistently rank high in the SERPs and have for years. What gives? I mean, isn't this something that should get you penalized?
-
Thanks for your response, Will. It's a small business (maybe 10 or 12 employees) at a single location. While they don't really impact me directly, it's particularly bothersome because they are in the advertising and marketing business. We tell clients not to do these things, yet all around us there are agencies that succeed using these tactics.
-
Hi There!
Unfortunately, as both Ben and Pau are mentioning, this absurd practice is still hanging around the web. While it's very unlikely the stuffed footer is actually helping this competitor to achieve high rankings, it is aggravating to think it isn't preventing them, either.
Your post doesn't mention whether this is a business with physical local offices or a fully virtual one, but what I have seen in cases like these is that big brands tend to get away with a great deal I would never recommend to a smaller brand. It raises the question: how can we explain this phenomenon?
In the past, I've seen folks asserting that Google is soft on big brands. There could be some truth in this, but we've all seen Google take a massive whack at big brand practices with various updates, so that really makes this an unsatisfying assertion.
Another guess is that big brands have built enough supporting authority to make them appear immune to the consequences of bad practices. In other words, they've achieved a level of power in the SERPs (via thousands of links, mentions, reviews, reams of content, etc.) that enables them to overcome minor penalties from bad practices. This could be closer to the truth, but again, isn't fully satisfactory.
And, finally, there's the idea that Google is somewhat asleep at the wheel when it comes to enforcing its guidelines and standards, and the question of whether that's excusable given the size of the Internet. They can't catch everything. I can see it in that light, but at the same time, I don't consider Google to have taken a proactive stance on accepting public reporting of bad practices. Rather, they take the approach of releasing periodic updates which are supposed to algorithmically detect foul play and penalize or filter it. Google is very tied to the ideas of big data and machine intelligence. So far, it's been an interesting journey with Google on this, but it's what has led to cases exactly like the one you're seeing: something egregiously unhelpful to human users being allowed to sit, apparently unpunished, on a website that outranks you, even when you are trying to play a fairer game by the rules.
In cases like this, your only real option is to hang onto the hope that your competitor will be the subject of an update, at some point in the future, that will lessen the rewards they are receiving despite their bad practices. Until then, it's heads down, working hard on what you can do, with a rigorous focus on what you can control.
-
I've seen a lot of websites that do similar things and rank high in the SERPs...
Sometimes this can be explained, at least in part, by a good backlink profile, an old domain/website, a large amount of content (if the content is relatively original and varied), or because the niche is more receptive to this type of content (when it's something relatively common in your niche)... and other times it simply makes no sense why things like this keep working in Google for years without being penalized, either algorithmically or manually.
I've seen sites with keyword stuffing so heavy that a keyword is repeated about 500 times on the homepage, and they still rank at the top of Google for that keyword, with nothing else internal or external to the site that could explain such a ranking. It's frustrating to know that this is supposed to be penalized by Google, yet some of your competitors do it with impunity while you can't, or at least shouldn't...
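If you want to put a number on that kind of stuffing, here's a rough sketch of how you might count keyword occurrences in a page's visible text. It assumes Python with the requests and beautifulsoup4 libraries, and the URL and keyword in the usage comment are placeholders, not examples from this thread:

```python
# Rough sketch only - assumes Python 3 with requests and beautifulsoup4 installed;
# the URL and keyword in the usage comment are placeholders.
import re

import requests
from bs4 import BeautifulSoup

def keyword_count(url: str, keyword: str) -> int:
    """Count case-insensitive occurrences of a keyword in a page's visible text."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    # Strip out elements whose text a visitor never actually sees.
    for tag in soup(["script", "style", "noscript"]):
        tag.decompose()
    visible_text = soup.get_text(separator=" ")
    return len(re.findall(rf"\b{re.escape(keyword)}\b", visible_text, re.IGNORECASE))

# Hypothetical usage:
# print(keyword_count("https://example.com/", "dallas plumber"))
```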
-
Hi!
Yes, this absolutely should get them penalized. Unfortunately, I have also seen this work very well for different competitors in various niches. Regardless of what Google says, some old black-hat tactics still work wonders, and these sites often fly under the radar. For how long is the question, though. It still carries a heavy risk: if they are discovered, they can get a serious penalty slapped on them, or at the very least get pushed pretty far down the SERPs. It's really just risk vs. reward. If you are like me, working for a company with a ton of revenue at stake, you might think of it the way I do.
It is much easier for me to explain to them why these thin, low-quality sites are ranking because of a loophole than it would be for me to explain why I got our #1 lead generating channel penalized and blasted into purgatory.
Usually, these sites that use exact-match anchors on local terms look like garbage. So even if they are driving traffic, I often wonder how much of it is actually converting, since the majority of their site looks like a collection of crappy doorway pages. It is still very frustrating to watch them succeed in the SERPs, though. I have the same issue.
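If you want to see exactly which anchors they have crammed into that footer, here's a rough sketch of one way to pull them out and tally them. It assumes Python with the requests and beautifulsoup4 libraries, and the competitor URL in the usage comment is a placeholder:

```python
# Rough sketch only - assumes Python 3 with requests and beautifulsoup4 installed;
# the competitor URL in the usage comment is a placeholder.
from collections import Counter

import requests
from bs4 import BeautifulSoup

def footer_anchor_counts(url: str) -> Counter:
    """Tally the anchor texts found in a page's footer area."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    # Prefer a real <footer> element; fall back to ids/classes containing "footer".
    footer_blocks = soup.find_all("footer") or soup.select('[id*="footer"], [class*="footer"]')
    anchors = [
        a.get_text(strip=True)
        for block in footer_blocks
        for a in block.find_all("a", href=True)
    ]
    return Counter(text for text in anchors if text)

# Hypothetical usage: show the 25 most repeated footer anchors.
# for anchor_text, count in footer_anchor_counts("https://competitor.example.com/").most_common(25):
#     print(count, anchor_text)
```

If the most repeated anchors are all exact-match "city + service" phrases pointing at thin pages, that gives you a pretty clear picture of what you're up against.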
You could always "try" to report them to Google directly. I do not know if this really works or if anchor-text spam would fall under one of their official categories to file it under, but you could try submitting a spam report here: https://www.google.com/webmasters/tools/spamreport.
I have no idea if this works or not, though. Also, as a side note, I would run their site through a tool like Majestic SEO or Ahrefs and really dig into their backlink profile. I have seen a couple of instances where spammy sites pulled off some nice links, so their success could be attributed to those as well.
Hopefully this helps. I know your pain.
-Ben
Related Questions
-
Footer Section
We have a service-oriented website, so which links should we add to our footer section?
On-Page Optimization | Obbserv
-
Best practice for footer in ecommerce - Shall I add Top Category links?
What would you recommend regarding links to "Top Products" and "Top Categories" in the footer? Would you add them to give extra link juice to top categories? Would you avoid category links in the footer that are already in the header navigation or in the main content area, to avoid linking twice from all pages? Would you vary these top category links in the footer according to the main category?
On-Page Optimization | lcourse
-
Duplicate links from a forum - what to do?
After a crawl, it found over 5k errors and over 5k warnings: duplicate page content, duplicate page titles, overly-dynamic URLs, missing meta descriptions, and title elements that are too long. All of those come from domain.com/forum/. I don't need SEO on the forum, so what should I do? What could be an easy solution to this? Noindex? Nofollow? Please help.
On-Page Optimization | OVJ
-
Unable to see internal link numbers in Open Site Explorer - need help
I'm Anuj, a regular user of SEOmoz, and I need some guidance from the SEO experts here. I'm trying to optimize a webstore for a few keywords and am facing some SEO issues. I was using https all over the webstore and was advised by community members not to have https throughout the site (for various reasons). The internal links were not showing up in Open Site Explorer or Google Webmaster Tools while the site was on https (they showed just 1 or 2). After changing the pages from https to http, I'm now able to see all of my website's internal links in GWT. Unfortunately, the internal link count in Open Site Explorer shows only a small fraction of the internal links shown in GWT. The last link update from Open Site Explorer was on 27 FEB 2013, and I made the https-to-http change (for all pages) somewhere between 17-24 JANUARY 2013. Have I missed something that keeps those numbers from appearing in Open Site Explorer, or will it just take time for Open Site Explorer to show the internal link numbers?
On-Page Optimization | Pepperjet
-
Are pages with lots of pictures with outbound links bad for SEO?
"Inspiration" blog posts are a good example. The one below has 80 pictures as part of a logo inspiration page, but each picture has an outbound URL. Bad, or does it not matter? http://www.hongkiat.com/blog/80-creative-and-well-designed-logos/
On-Page Optimization | seo_f2012
-
How do I Avoid Excessive Internal Links on an eCommerce site?
I think I'm getting dinged for this on Term Target because the page is full of products, which have links to their product page, but I'm not sure.
On-Page Optimization | PageLogic
-
Max # of recommended links per page?
I've heard it said that Google may choose to stop following links after the first 100 on a page. The landing/category pages for my site's product catalog have earned quite a respectable PR and positioning in search results, and I'm currently paginating their product listings (about 200 products in a category) so that only a couple dozen products are shown on the first page, with links to "next page" and "previous page" being accomplished via query string (i.e. "?page=3"). An alternative option I have is to link to 100% of the contained products within the category's landing page (which would increase my on-page link count to ~300) and use CSS/Javascript to allow the user to simulate browsing between pages on the client side. My goal is to see as many of my product pages indexed as possible. Is this done better using my current scheme (where Googlebot would have to navigate to, say, Landing Page -> Page 6 -> Deeply Buried Product Page) or in the alternative method above, where all the links are in a single page? Since my landing pages are currently treated pretty well by search engines, would that "trust" cause them to follow more links than might normally be done? Thank you!
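For what it's worth, a minimal sketch of counting the links a crawler would actually see in a page's raw HTML (assuming Python with the requests and beautifulsoup4 libraries; the category URL pattern in the usage comment is a placeholder) looks something like this:

```python
# Rough sketch only - assumes Python 3 with requests and beautifulsoup4 installed;
# the category URL pattern in the usage comment is a placeholder.
import requests
from bs4 import BeautifulSoup

def count_links(url: str) -> int:
    """Count every <a href> in the raw HTML - roughly what a crawler sees on first pass
    (links hidden behind client-side CSS/JS pagination are still present in the source)."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    return len(soup.find_all("a", href=True))

# Hypothetical usage: check each paginated category URL against the ~100-link guideline.
# for page in range(1, 9):
#     url = f"https://example.com/widgets?page={page}"
#     print(url, count_links(url))
```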
On-Page Optimization | cadenzajon
-
Should I put a nofollow on each link in a JavaScript dropdown menu?
I have a JavaScript dropdown menu on every page of my site. It lists all the wineries I write about and sell - about 300 links. I've been told that Google doesn't like so many links on a page, but that it doesn't spider JavaScript. Then I hear that it does. Am I being penalized for all the links? Or does the spider really not see them? I don't want to give up my JavaScript menus unless I have to. Should I put a nofollow on each link inside the code? And on the other hand, am I losing Google juice by not letting it see all the pages on my site that I link to in the JavaScript menu? Thanks in advance for your help!
On-Page Optimization | JeanYates