Google and private networks?
-
I have one or two competitors (in the UK) in my field who buy expired 1–8 year old domains on random subjects (SEO, travel, health, you name it). They are in the printing business, and they stick one or two articles (unrelated to what was on the domain before) on each of these, and that's it.
I think they stick with PA and DA above 30, and most of the domains have 10–100 links, so these are well-used expired domains. They are hosted in the USA, and most have different IPs, although they now have so many (over 70% of their backlink profile) that some share the same IP.
On further investigation, none of the blogs have any contact details, but they have been a little smart here: they have added content to the About Us page (along the lines of "I used to run xxx but now do xxx"), and they have one or two tabs with article-length content on the same subject the site used to cover, with matching titles.
So basically they are finding expired 1–10 year old domains that have only been expired (from what I can see) six months at most, putting one or two print-related articles on the home page (maybe adding a third on the subject the blog used to cover), adding one to three articles via tabs at the top on subjects the sites used to cover, registering the details via xbybssgcf@whoisprivacyprotect.com, and that's it.
They have been ranking via this method for the last couple of years (through all the Google updates) and still do extremely well.
Does Google not have any way to combat link networks other than the obvious stuff such as public link networks? It just seems that if you know what you are doing you get away with it, and if you're big enough you get away with it, but the middle-of-the-road (mum and pop) sites get f***ed over by spam pointing at their sites that no spammer would dream of building anyway.
-
I wouldn't be so sure that those expired domains are what's helping them rank. As their competitor, I'd be more likely to snicker at them behind their back for spending the money on buying and hosting those domains in the belief that it's helping their main site in the search results. Here's an old Danny Sullivan article on the topic.
Related Questions
-
Google Displaying Wrong Homepage URL
Hi Everyone, Google is displaying www.domain.com instead of domain.com. We have our preferred URL set up as domain.com, and even redirect www.domain.com to domain.com, but in the search results it is showing www.domain.com. Problem is we are seeing referral data from www.domain.com and in Google it says "No information is available for this page." Anyone seen a way to resolve this?
Intermediate & Advanced SEO | vetofunk
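One way to sanity-check the referral data mentioned in the question is to normalize every www/non-www variant onto the preferred host before comparing reports. A minimal Python sketch, assuming the `domain.com` preference from the question (the helper name is mine, not anything Moz or Google provides):

```python
from urllib.parse import urlsplit, urlunsplit

def to_preferred_host(url, preferred="domain.com"):
    """Map www.<preferred> URLs onto the preferred bare-domain form."""
    parts = urlsplit(url)
    if parts.netloc.lower() == "www." + preferred:
        parts = parts._replace(netloc=preferred)
    return urlunsplit(parts)

print(to_preferred_host("http://www.domain.com/page?a=1"))
# http://domain.com/page?a=1
```

This only cleans up reporting, of course; the actual fix remains a site-wide 301 plus a consistent preferred-domain setting.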
Mass Removal Request from Google Index
Hi, I am trying to cleanse a news website. When this website was first made, the people who set it up copied in all kinds of articles they had as a newspaper, including tests, internal communication, and drafts. The site has lots of junk, but all of it dates from the initial backup, i.e. before 1st June 2012. So, by removing all mixed content prior to that date, we can have pure articles starting 1st June 2012! Therefore:

My dynamic sitemap now contains only articles with a release date between 1st June 2012 and now. Any article with a release date prior to 1st June 2012 returns a custom 404 page with a "noindex" meta tag instead of the actual content of the article.

The question is how I can remove all this junk from the Google index as fast as possible, given that it is no longer on the site but still appears in Google results. I know that for individual URLs I can request removal at https://www.google.com/webmasters/tools/removals, but the problem is doing this in bulk, as there are tens of thousands of URLs I want to remove.

Should I put the articles back in the sitemap so the search engines crawl it and see all the 404s? I believe this is very wrong; as far as I know it will cause problems, because search engines will try to access non-existent content that the sitemap declares as existent, and will report errors in Webmaster Tools.

Should I submit a deleted-items sitemap using the <expires> tag (https://developers.google.com/custom-search/docs/indexing#on-demand-indexing)? I think this is for custom search engines only, not for the generic Google search engine.

The site unfortunately doesn't use any kind of "folder" hierarchy in its URLs, relying instead on ugly GET params, so a folder-based pattern is impossible, since all articles (removed junk and actual articles alike) are of the form http://www.example.com/docid=123456. So, how can I bulk remove all the junk from the Google index, relatively fast?
Intermediate & Advanced SEO | ioannisa
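The date-filtered sitemap described above can be sketched in a few lines. This assumes articles arrive as (url, release_date) pairs; the function name and data shape are mine, not the poster's actual code:

```python
from datetime import date
from xml.etree import ElementTree as ET

CUTOFF = date(2012, 6, 1)
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(articles):
    """Emit sitemap XML listing only articles released on or after the cutoff;
    anything earlier is omitted and served by the app as a noindex 404."""
    urlset = ET.Element("urlset", xmlns=NS)
    for url, released in articles:
        if released < CUTOFF:
            continue  # pre-cutoff junk never enters the sitemap
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
        ET.SubElement(entry, "lastmod").text = released.isoformat()
    return ET.tostring(urlset, encoding="unicode")
```

The sitemap only tells crawlers what exists; the bulk de-indexing itself still depends on Google recrawling the junk URLs and seeing the noindex 404s.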
Google Places Drop
Hi everyone! I have a client that was ranking very nicely for a number of keywords. In the 5 pack for most of the keywords we were targeting. His account went under review for some unknown reason about 2 months ago. It disappeared from the listing... Then a few weeks ago it became approved again. He is now no longer ranking for any of those keywords. He is ranking for some obscure ones but the money words are gone. Do you think this was due to the review? Some sort of GP update over the last 60 days? All of my other clients are still ranking strong in Google Places. Any ideas?
Intermediate & Advanced SEO | SeattleJoe
Google Ranking Wrong Page
The company I work for started with a website targeting one city. Soon after I started SEO for them, they expanded to two cities. Optimization was challenging, but we managed to rank highly in both cities for our keywords. A year or so later, the company expanded to two new locations, so now four in total. At the time, we realized it was going to be tough to rank any one page for four different cities, so our new SEO strategy was to break the website into five sections, or mini-sites: four city-targeted sites, plus our original site, which would now be branded as more of a national website. Our URL structure now looks something like this:
www.company.com
www.company.com/city-1
www.company.com/city-2
www.company.com/city-3
www.company.com/city-4
Now, in the present, all is going well except for our original targeted city. The problem is that Google keeps ranking our original site (which is now national) instead of the new city-specific site we created. I realize that this is probably due to all of the past SEO we did optimizing for that city. My thought is that Google is confused as to which page to rank for this city's keyword terms, and I was wondering if canonical tags would be a possible solution here, since the pages are about 95% identical. Anyone have any insight? I'd really appreciate it!
Intermediate & Advanced SEO | cpapciak
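If canonical tags are tried, each near-duplicate page declares the preferred URL in its head. A tiny sketch with URLs borrowed from the question (the helper name is hypothetical); note that a canonical pointing away from a page asks Google to drop that page from results, so it should point at whichever page you actually want to rank:

```python
def canonical_link(preferred_url):
    """Return the <link rel="canonical"> element for a page's <head>."""
    return f'<link rel="canonical" href="{preferred_url}">'

print(canonical_link("http://www.company.com/city-1"))
# <link rel="canonical" href="http://www.company.com/city-1">
```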
Google Places Multiple Location
Hi everyone, I have a client with multiple locations in the same city. I would like to have their Google Places listings show up under the main website listing. Currently, one of the Google Places listings is being pulled in directly below the main website, but not the other. The Zagat rating is being pulled in as well. I would like both locations to show up when you type in the name of the business. Any ideas how to do this?
Intermediate & Advanced SEO | SixTwoInteractive
Does Google read text when display is none?
Hi, on the category pages of our e-commerce site we have pagination (i.e. Toshiba laptops page 1, page 2, etc.), implemented with rel="next" and rel="prev". On the first page of each category we display a header with lots of text. This header is hidden on the following pages using display: none. I wondered whether, since this is only a CSS display trick, Google might still read it and consider it duplicate content. Thanks
Intermediate & Advanced SEO | BeytzNet
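The rel="next"/rel="prev" setup mentioned above amounts to emitting a pair of head links per page. A sketch of that generation; the `?page=N` URL pattern is illustrative, not the poster's actual markup:

```python
def pagination_links(base_url, page, last_page):
    """Head links announcing a paginated series to crawlers."""
    links = []
    if page > 1:
        links.append(f'<link rel="prev" href="{base_url}?page={page - 1}">')
    if page < last_page:
        links.append(f'<link rel="next" href="{base_url}?page={page + 1}">')
    return links

print(pagination_links("/toshiba-laptops", 2, 5))
```

The first and last pages each get only one link, which is how crawlers recognize the boundaries of the series.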
Setting Up Google Analytics for domains with 301
I have a client with a Google Analytics account that is a mess:
domaina.com
domainb.com 302s to domaina.com
domainc.com 302s to domaina.com
domaind.com 302s to domaina.com
I thought the client was doing 301s on all these domains to the primary domain. I have logged into their Analytics account and found that data is being tracked on the other domains, i.e. domainb.com, domainc.com, domaind.com, etc. How is it possible that Google Analytics is tracking data on these domains when no Analytics code has been created and the URLs redirect to domaina.com? Also, there are no sites on these domains, so for Webmaster Tools should I enable domain verification through a CNAME on the DNS? And how can I best set up a way to track traffic coming from, say, domainb.com? What's the best step-by-step guide for setting this up?
Intermediate & Advanced SEO | JohnW-UK
Is there any delay between Google crawling a page and the ratings being displayed in rich snippets in the search results?
Intermediate & Advanced SEO | NEWCRAFT