Penalty for using expired domain?
-
I was wondering if anyone has any experience using dropped/expired domains with old "clean" backlinks for new sites.
-
Is there a penalty for doing this (with good intent)?
-
Worth a reconsideration request?
-
-
Hi! I'm following up on some older questions. Did you buy any of these older domains? Do you have any information you could share with us?
-
It should be fine unless that site was banned in the past for some reason, though that's pretty easy to check. If your new website changes substantially in topic, Google will likely recognise the change and not factor in the domain's history. This is based on what Google said at one point; I'm not sure if anything has changed since then.
-
Thanks for your answer. What about using the expired domain for a new site, not for linking to other sites or a 301 redirect? Could there be some kind of penalty? I guess I can just test it myself.
-
If this is your only way of link building (especially if you only 301), it will start looking suspicious and your site may get penalised on manual review. I'm not sure whether Google has an algorithmic trigger for this.
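For the "pretty easy to check" step mentioned above, the Wayback Machine is a quick way to see what a dropped domain used to host. A minimal Python sketch using its public availability endpoint; the domain name is just a placeholder:

```python
import json
import urllib.parse
import urllib.request

WAYBACK_API = "https://archive.org/wayback/available"

def history_check_url(domain: str, timestamp: str = "2015") -> str:
    """Build a Wayback Machine availability query for a domain's past snapshots."""
    query = urllib.parse.urlencode({"url": domain, "timestamp": timestamp})
    return f"{WAYBACK_API}?{query}"

def closest_snapshot(domain: str) -> dict:
    """Fetch the closest archived snapshot for the domain (needs network access)."""
    with urllib.request.urlopen(history_check_url(domain), timeout=10) as resp:
        return json.load(resp).get("archived_snapshots", {})

# Inspect what the domain used to host before building on it, e.g.:
# closest_snapshot("example.com")
```

A manual `site:` search in Google for the domain is also worth doing: if the domain still has pages indexed, it is less likely to have been de-indexed for past spam.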
Related Questions
-
Domain authority a better metric than referring domain count?
Hi Guys, When reviewing competitors, what would be the better metric: referring domain count or domain authority? From my understanding, DA is an indication of the quality of the link profile, so if a site has a high DA this is a better metric for comparison than referring domain count. What are your thoughts on this? Cheers
Intermediate & Advanced SEO | cathywix
Handling alternate domains
Hi guys, We're noticing a few alternate hostnames for a website rearing their ugly heads in search results and I was wondering how everyone else handles them. For example, we've seen:
- alt-www.(domain).com
- test.(domain).com
- uat.(domain).com
We're looking to ensure that these versions all canonical to their live-page equivalents, and we're adding meta robots noindex nofollow to all pages as an initial measure. Would you recommend a robots.txt crawler exclusion for these too? All feedback welcome! Cheers, Sean
Intermediate & Advanced SEO | seanginnaw
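One way to handle this at the server level is to key the SEO headers off the request hostname, so every alternate host both canonicals to the live page and carries a noindex. A Python sketch with hypothetical hostnames:

```python
LIVE_HOST = "www.example.com"  # hypothetical live hostname
ALT_PREFIXES = ("alt-www.", "test.", "uat.")

def seo_headers(host: str, path: str) -> dict:
    """SEO-related response headers for a request to host + path."""
    # Every copy of a page points at the single live URL.
    headers = {"Link": f'<https://{LIVE_HOST}{path}>; rel="canonical"'}
    if host.startswith(ALT_PREFIXES):
        # Keep staging/test hostnames out of the index entirely.
        headers["X-Robots-Tag"] = "noindex, nofollow"
    return headers
```

One caveat on the robots.txt idea: if you Disallow the alternate hosts, crawlers can never fetch those pages and so never see the noindex or the canonical, which can leave already-indexed URLs stuck in the index. Letting them be crawled while noindexed is usually the faster way to get them dropped.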
Strategies for best use of a competitor's expired domain
I recently bought an old competitor's expired domain that was ranking around page 2 or 3 on Google for most keywords that I target. Curious as to the best strategy for utilizing this domain:
1. Set up some content with backlinks to my own domain
2. Set up redirects from all of the competitor's old domain URLs to corresponding sections on my website
3. Something else?
Intermediate & Advanced SEO | IsaCleanse
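If option 2 (redirects) is chosen, the usual advice is page-level 301s to the closest matching section rather than a blanket redirect to the homepage. A sketch with hypothetical URLs:

```python
# Hypothetical mapping from the competitor's old URLs to matching sections.
REDIRECT_MAP = {
    "/widgets/blue-widgets.html": "https://www.mysite.com/blue-widgets/",
    "/widgets/red-widgets.html": "https://www.mysite.com/red-widgets/",
    "/about-us.html": "https://www.mysite.com/about/",
}
FALLBACK = "https://www.mysite.com/"

def redirect_301_target(old_path: str) -> str:
    """Where an old competitor URL should 301 to on the new site."""
    return REDIRECT_MAP.get(old_path, FALLBACK)
```

Page-level targets preserve topical relevance for each old URL; mass-redirecting everything to the homepage is the pattern most likely to look manipulative.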
Content From One Domain Mysteriously Indexing Under a Different Domain's URL
I've pulled out all the stops and so far this seems like a very technical issue with either Googlebot or our servers. I highly encourage and appreciate responses from those with knowledge of technical SEO/website problems.

First, some background info: three websites, http://www.americanmuscle.com, m.americanmuscle.com and http://www.extremeterrain.com, as well as all of their sub-domains, could potentially be involved. AmericanMuscle sells Mustang parts; ExtremeTerrain is Jeep-only.

Sometime recently, Google has been crawling our americanmuscle.com pages and serving them in the SERPs under an extremeterrain sub-domain, services.extremeterrain.com. You can see for yourself below. Total # of services.extremeterrain.com pages in Google's index: http://screencast.com/t/Dvqhk1TqBtoK

When you click the cached version of their supposed pages, you see an americanmuscle page (some desktop, some mobile, none of which exist on extremeterrain.com): http://screencast.com/t/FkUgz8NGfFe

All of these links give you a 404 when clicked. Many of the pages I've checked have been cached multiple times while still being 404 links; Googlebot has apparently re-crawled many times, so this is not a one-time fluke.

The services. sub-domain serves both AM and XT and lives on the same server as our m.americanmuscle website, but answers on different ports. services.extremeterrain is never used to feed AM data, so why Google is associating the two is a mystery to me. The mobile americanmuscle website is set to respond only on a different port than services. and only responds to AM mobile sub-domains, not Googlebot or any other user-agent.

Any ideas? As one could imagine, this is not an ideal scenario for either website.
Intermediate & Advanced SEO | andrewv
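A common cause of one domain's content being indexed under another's hostname is a catch-all virtual host that answers for any Host header it receives. A defensive sketch, assuming a hypothetical set of hostnames the backend actually owns:

```python
# Hypothetical: the only hostnames this backend is supposed to serve.
HOSTS_SERVED = {"services.extremeterrain.com"}

def status_for_host(host: str) -> int:
    """Refuse to serve content for hostnames this backend does not own."""
    if host.lower() in HOSTS_SERVED:
        return 200
    # 421 Misdirected Request: the request was routed to the wrong origin.
    return 421
```

Serving content for a Host header the backend doesn't own is exactly what lets Googlebot index one site's pages under another's hostname; returning an error (or a 301 to the correct host) for unexpected Host values closes that hole.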
Two pages on same domain - Is this a proper use of the canonical tag?
I have a domain with two pages in question: one is an article with 2,000 words and the other is a FAQ with 300 words. The 300-word FAQ is copied word-for-word and pasted inside the 2,000-word article. Would it be a proper use of the canonical tag to point the smaller, 300-word FAQ at the 2,000-word article? Since the 300-word FAQ is identical to a portion of the 2,000-word article, will Google see this as duplicate content? Thanks in advance for any helpful insight.
Intermediate & Advanced SEO | andrewv
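That is the textbook use of rel=canonical: the verbatim subset page points at the page that contains it, and Google consolidates the two rather than treating them as duplicates. A small sketch with hypothetical paths:

```python
# Hypothetical: each duplicate (subset) page maps to its canonical superset.
CANONICAL_MAP = {"/faq": "/full-article"}

def canonical_link_tag(path: str, origin: str = "https://www.example.com") -> str:
    """The <link rel="canonical"> tag to emit in a page's <head>."""
    # Pages without a duplicate self-canonicalise.
    target = CANONICAL_MAP.get(path, path)
    return f'<link rel="canonical" href="{origin}{target}" />'
```

The tag is a hint rather than a directive, so Google may still choose its own canonical, but for a word-for-word subset it will usually be honoured.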
Using the right Schema.org markup, and is there a penalty for using the wrong one?
Hi We have a set of reviewed products (in this case restaurants) that total an average rating of 4.0/5.0 from 800-odd reviews. We know to use schema/restaurant for the individual restaurants we promote, but what about for a list by city, say restaurants in Boston, for example? For the product page containing all the Boston restaurants, should we use schema.org/restaurant (but it's not one physical restaurant) or schema.org product + aggregate review score? What do you do for your product listing pages? If we get it wrong, is there a penalty? Or is this simply up to us?
Intermediate & Advanced SEO | xoffie
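For a city listing page, schema.org/ItemList is generally a safer fit than Restaurant (which describes one physical place); that's a reading of the vocabulary, not official Google guidance. A sketch generating the JSON-LD for such a page:

```python
import json

def restaurant_list_jsonld(city: str, restaurant_urls: list) -> str:
    """JSON-LD for a page listing several restaurants in one city."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "ItemList",
        "name": f"Restaurants in {city}",
        "itemListElement": [
            {"@type": "ListItem", "position": i + 1, "url": url}
            for i, url in enumerate(restaurant_urls)
        ],
    })
```

On the penalty question: markup that flatly misdescribes the page (e.g. one restaurant's rating applied to a whole city list) is the kind of thing Google can treat as spammy structured data, which risks losing rich snippets rather than incurring a ranking penalty.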
Can I use a "noindex, follow" command in a robots.txt file for a certain parameter on a domain?
I have a site that produces thousands of pages via file uploads. These pages are then linked to by users for others to download what they have uploaded. Naturally, the client has blocked the parameter which precedes these pages in an attempt to keep them from being indexed. What they did not consider was that these pages are attracting hundreds of thousands of links that are not passing any authority to the main domain because they're being blocked in robots.txt. Can I allow Google to follow, but NOT index, these pages via a robots.txt file, or would this have to be done on a page-by-page basis?
Intermediate & Advanced SEO | PapaRelevance
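Short answer: robots.txt has no noindex directive, and the Disallow makes things worse here, because Google never fetches the blocked pages and so can't see anything on them. The usual fix is to remove the Disallow and serve an X-Robots-Tag: noindex header for URLs carrying the parameter (following links is the default), which can be done once in server config rather than page by page. A Python sketch with a hypothetical parameter name:

```python
from urllib.parse import parse_qs, urlparse

UPLOAD_PARAM = "file_id"  # hypothetical query parameter marking upload pages

def extra_headers(url: str) -> dict:
    """Headers to add: noindex the upload pages while leaving them crawlable."""
    params = parse_qs(urlparse(url).query)
    if UPLOAD_PARAM in params:
        # "noindex" alone keeps the page out of the index while still
        # letting Googlebot crawl it and follow its links.
        return {"X-Robots-Tag": "noindex"}
    return {}
```

Note that a noindexed page is a weak place for link equity to accumulate long-term, so internal links from those pages to the main domain matter more than the noindex itself.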
The use of subdomains to improve SEO?
A client's website provides a number of trade services, with a page for each service they provide, for example carpentry, electrician, plumbing, etc. Currently these pages are found at domain.co.uk/bathrooms/bathrooms.html. I am trying to optimise each page better as they are competing with other sites which, for example, sell bathrooms rather than being bathroom installers or plumbers. As part of the on-page optimisation I plan to change the page names and directory structure. I had an idea to split the website down into subdomains for the various services:
1. Create a sub-domain such as http://plumber.domain.co.uk
2. Upload the relevant content (in this example the plumbing page) to the sub-domain location
3. Correct all the links to absolute URLs for each sub-domain
Will this help target better use of keywords in the URL in terms of SEO? Hope it makes sense. Thanks, Darren
Intermediate & Advanced SEO | Bristolweb
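Worth noting that many SEOs treat subdomains as largely separate sites, so keyword-rich subfolders (domain.co.uk/plumber/) are the more commonly recommended way to get the keyword into the URL while keeping authority consolidated. Whichever structure is chosen, the old URLs need 301s to the new ones; a sketch with hypothetical paths:

```python
# Hypothetical old URL -> new keyword-subfolder URL mapping for the 301s.
REWRITES = {
    "/bathrooms/bathrooms.html": "/bathroom-installers/",
    "/services/plumbing.html": "/plumber/",
}

def new_location(old_path: str):
    """Return the 301 target for a moved page, or None if it hasn't moved."""
    return REWRITES.get(old_path)
```

Either way, the URL keyword itself is a minor signal; the on-page content of each service page will do most of the work.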