Looking for a recent bad SEO / black-hat example, such as the J.C. Penney example from 2011
-
I am giving a presentation in a few weeks and am looking for a "what not to do" example of a larger brand that made poor SEO choices and tried to game Google with black-hat tactics. Any examples you can point me to?
-
Here's an article we wrote a while ago with a few examples. It's my site, but it answers your question:
http://buildforsearch.com/googles-worst-penalties-and-what-happened/
-
Overstock offering discounts for .edu links: http://searchenginewatch.com/article/2067445/The-Difference-Between-Black-Hat-SEO-and-Bad-SEO.
**From the article:** "Just this week, Overstock was penalized for extending discounts of up to 10 percent to universities/students/faculty who wrote posts that 'happen to contain keywords in the anchor text of links.' Not only is this against Google's guidelines, but it's also against the Federal Trade Commission's guidelines; this link will take you to an article regarding a company which is helping with FTC disclosure issues."
-
You might like to take a look at this: http://www.wired.co.uk/news/archive/2013-02/27/interflora-disappears-from-google / http://searchengineland.com/interflora-gets-google-rankings-back-150366
-
It's not one company, but there's always the payday loan update, where Google targeted a whole industry: http://searchengineland.com/google-pay-day-loan-algorithm-google-search-algorithm-update-to-target-spammy-queries-162941
Related Questions
-
New flurry of thousands of bad links from 3 spammy websites. Disavow?
I also discovered that the website www.prlog.ru put 32 links to my website. It is a Russian site, and it has a 32% spam score. Is that high? I think I need to disavow it. Another spammy website linking to us has a spam score of 16%, with several thousand links. I added one link to the site medexplorer.com 6 years ago and it was fine; now it has thousands of links. Should I disavow all three?
White Hat / Black Hat SEO | Boodreaux
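For reference, the disavow file Google Search Console accepts is a plain-text list, one directive per line; a `domain:` entry drops every link from that host, and lines starting with `#` are ignored. A minimal sketch for the question above (only two of the three domains are named, so the third is left as a comment):

```text
# disavow.txt - upload via Google's disavow links tool
# "domain:" disavows all links from the host, not just one URL
domain:prlog.ru
domain:medexplorer.com
# Add the third spammy domain here once identified
```

Disavowing the whole domain is usually safer than listing individual URLs when a site has thousands of links pointing at you.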
Which is more important for SEO purposes: backlinks or internal links?
White Hat / Black Hat SEO | BBT-Digital
Infinite Scrolling on Publisher Sites - is VentureBeat's implementation really SEO-friendly?
I've just begun a new project auditing the site of a news publisher. In order to increase pageviews, and thus advertising revenue, at some point in the past they implemented something so that as many as 5 different articles load per article page. All articles are loaded at the same time, and from looking at Google's cache and the errors flagged up in Search Console, Google treats it as one big mass of content, not separate pages. Another thing to note is that when a user scrolls down, the URL does in fact change when you get to the next article.

My initial thought was to remove this functionality and just load one article per page. However, I happened to notice that VentureBeat.com uses something similar. They use infinite scrolling so that the other articles on the page (in a 'feed' style) only load when a user scrolls to the bottom of the first article. I checked Google's cached versions of the pages, and it seems that Google also only reads the first article, which seems like an ideal solution. This has the additional benefit of speeding up page load time.

My question is: is VentureBeat's implementation actually that SEO-friendly or not? VentureBeat have 'sort of' followed Google's guidelines on implementing infinite scrolling (https://webmasters.googleblog.com/2014/02/infinite-scroll-search-friendly.html) by using prev and next tags for pagination (https://support.google.com/webmasters/answer/1663744?hl=en). However, isn't the point of pagination to list multiple pages in a series (i.e. page 2, page 3, page 4, etc.) rather than just other related articles?

Here's an example: http://venturebeat.com/2016/11/11/facebooks-cto-explains-social-networks-10-year-mission-global-connectivity-ai-vr/

Would be interesting to know if someone has dealt with this first-hand or just has an opinion. Thanks in advance! Daniel
White Hat / Black Hat SEO | Daniel_Morgan
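For context, the pagination markup Google recommended in the guideline linked above is a pair of `<link>` tags in each page's head pointing at the previous and next pages in the series. A minimal sketch for page 2 of a series (the URLs are hypothetical placeholders, not VentureBeat's actual structure):

```html
<!-- Page 2 of a paginated series; each page declares its neighbors -->
<head>
  <link rel="prev" href="https://example.com/feed/page/1">
  <link rel="next" href="https://example.com/feed/page/3">
</head>
```

The questioner's point stands: this markup was designed for a sequence of pages in one series, so using it to chain unrelated articles together is a stretch of its intent.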
Looking for a Way to Standardize Content for Thousands of Pages w/o Getting Duplicate Content Penalties
Hi All, I'll premise this by saying that we like to engage in as much white-hat SEO as possible. I'm certainly not asking for any shady advice, but we have a lot of local pages to optimize :).

So, we are an IT and management training course provider. We have 34 locations across the US, and each of our 34 locations offers the same courses. Each of our locations has its own page on our website. However, in order to really hone the local SEO game by course topic area and city, we are creating dynamic custom pages that list our course offerings/dates for each individual topic and city. Right now, our pages are dynamic and being crawled and ranking well within Google. We conducted a very small-scale test of this in our Washington, DC and New York areas with our SharePoint course offerings, and it was a great success: we are ranking well on "sharepoint training in new york/dc" etc. for two custom pages.

So, with 34 locations across the states and 21 course topic areas, that's well over 700 pages of content to maintain - a LOT more than just the two we tested. Our engineers have offered to create a standard title tag, meta description, h1, h2, etc., but with some varying components. This is from our engineer specifically:

"Regarding pages with the specific topic areas, do you have a specific format for the Meta Description and the Custom Paragraph? Since these are dynamic pages, it would work better and be a lot easier to maintain if we could standardize a format that all the pages would use for the Meta and Paragraph. For example, if we made the Paragraph: 'Our [Topic Area] training is easy to find in the [City, State] area.' As a note, other content such as directions and course dates will always vary from city to city, so content won't be the same everywhere, just slightly the same. It works better this way because HTFU is actually a single page, and we are just passing the venue code to the page to dynamically build the page based on that venue code. So they aren't technically individual pages, although they seem like that on the web. If we don't standardize the text, then someone will have to maintain custom text for all active venue codes for all cities for all topics. So you could be talking about over a thousand records to maintain, depending on what you want customized. Another option is to have several standardized paragraphs, such as: 'Our [Topic Area] training is easy to find in the [City, State] area,' followed by other content specific to the location, or 'Find your [Topic Area] training course in [City, State] with ease,' followed by other content specific to the location. Then we could randomize what is displayed. The key is to have a standardized format so additional work doesn't have to be done to maintain custom formats/text for individual pages."

So, Mozzers, my question to you all is: can we standardize with slight variations specific to that location and topic area w/o getting dinged for spam or duplicate content? Often times I ask myself, "If Matt Cutts was standing here, would he approve?" For this, I am leaning towards "yes," but I always need a gut check. Sorry for the long message. Hopefully someone can help. Thank you! Pedram
White Hat / Black Hat SEO | CSawatzky
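The randomized-template approach the engineer describes can be sketched in a few lines of Python. This is illustrative only - the function and template names are hypothetical, not the actual implementation - but it shows the key design choice: seeding the random pick (e.g. with the venue code) so each page always renders the same variant instead of flickering between them on every load.

```python
import random

# Hypothetical standardized paragraph templates, per the engineer's suggestion.
TEMPLATES = [
    "Our {topic} training is easy to find in the {city}, {state} area.",
    "Find your {topic} training course in {city}, {state} with ease.",
]

def intro_paragraph(topic, city, state, venue_code=None):
    """Pick one standardized template and fill in the venue-specific
    values. Seeding with the venue code makes the choice stable, so a
    given page shows the same paragraph on every render."""
    rng = random.Random(venue_code)
    return rng.choice(TEMPLATES).format(topic=topic, city=city, state=state)

print(intro_paragraph("SharePoint", "New York", "NY", venue_code="NYC01"))
```

Because the choice is deterministic per venue code, crawlers see consistent content on repeat visits, while the variation across cities and topics still comes from the filled-in values and the location-specific content that follows.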
Press Release SEO: more than 90 press releases on PRWeb. How good for SEO?
A competitor site known to us has published 90 press releases covering all upcoming car models with PRWeb (PR 7, DA 97) in the last 4 months, all with different anchor-based keywords and links. The page authority is very high in some of them, around 70. Though they might have spent a sum on this, does it really help from an SEO perspective? If so, would 10 press releases with PR Newswire (which has PR 8 and DA 95) be good, if we do this with all-unique anchor text and links?
White Hat / Black Hat SEO | Modi
Has anyone used this? www.linkdetox.com/
Has anyone used this? www.linkdetox.com/ Any opinions about it?
White Hat / Black Hat SEO | Llanero
Tools to check Google local SEO, with suggestions
Is there any tool to check a website's position on Google Maps? Also, is there a way to check which local directories a website is and isn't listed on, and to get suggestions for improvement? In short, I need tools to check Google local SEO that also offer suggestions.
White Hat / Black Hat SEO | mnkpso
Question about local SEO when you serve many more cities than you have brick and mortar locations
My URL is http://www.mollysmusic.org, for the record. I run a music school that serves in-home lessons to a whole slew of cities. Since I only have 3 brick-and-mortar locations, I can't make Google local profiles for all the cities served, but I want to get seen by people searching in their own cities. Right now, our biggest competitor, takelessons.com, is top-ranked for every single city you can think of, because they have individual web pages for every city served. Their content is repetitive and scrapey, and to me that says "doorway page," which supposedly can get you de-indexed. I'm reluctant to do the same because I'm afraid I'll get banned, but I have to compete. I also want a strategy that can scale when we move into new areas. Is there something that makes TakeLessons's content NOT a doorway page? What's the best practice for getting ranked in multiple individual cities if you run a service? Thanks in advance.
White Hat / Black Hat SEO | mollysmusic