Strategies for the best use of a competitor's expired domain
-
I recently bought an old competitor's expired domain that was ranking around page 2 or 3 on Google for most of the keywords I target.
I'm curious about the best strategy for utilizing this domain:
1. Set up some content with backlinks to my own domain
2. Set up 301 redirects from all of the competitor's old URLs to the corresponding sections of my website
3. Something else?
-
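For option 2, a one-to-one redirect map on the old domain could look like the following Apache sketch (assuming mod_rewrite is available; the old and new paths here are hypothetical placeholders):

```apache
# .htaccess on the expired competitor's domain
RewriteEngine On

# Map old URLs to the matching sections on your own site (301 = permanent)
RewriteRule ^services/widgets/?$ https://www.yoursite.com/widgets [R=301,L]
RewriteRule ^blog/(.*)$ https://www.yoursite.com/blog/$1 [R=301,L]

# Fallback: anything without an explicit mapping goes to your homepage
RewriteRule ^ https://www.yoursite.com/ [R=301,L]
```

The explicit rules fire first; the bare `RewriteRule ^` at the end catches everything else, so no old URL returns a 404.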
Thank you both, finally getting around to doing this now!
-
Hi Sandi,
Make one page with a "press release" announcing that the company has been taken over by yourcompany.com, and redirect all the other URLs to this page. On this page you can link to the most important pages on your own website.
This way the authority of the old competitor's domain will be forwarded to yours, and after six months to a year you could redirect the whole domain to your site.
As Thomas mentions below, it is a good idea to check which inbound links could be of use (https://moz.com/researchtools/ose/) and contact the most important ones to ask them to point their links at your domain instead.
Hope it helps! Regards, Tymen
-
Hi Tymen! Thanks for your feedback, it sounds like a good idea! Could you please elaborate a bit on what you mean, as if you were explaining it to a novice? For example: "I would make a one-pager with a catch-all for URLs. This way all the old URLs of the site will go to this page and you get no 404s. On this page you explain that there is nothing there anymore and point visitors to your site. I would not put too many links on the page."
-
Expanding on what Tymen has said,
This could be a good strategy, but why not just 301 redirect the whole site to a page on your own site (explaining that the old company no longer exists)? That way I see your site getting more value: one hop through the redirect, instead of one hop through the redirect plus a link.
Also, it may be worth checking whether they have any high-value links, seeing where those come from, explaining that the old company no longer exists, and trying to win those links for yourself.
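The straight 301 suggested above would be a single rule on the old domain (Apache sketch; the destination URL is a hypothetical placeholder):

```apache
# .htaccess on the old competitor's domain:
# send every URL to one explanatory page on your own site
RewriteEngine On
RewriteRule ^ https://www.yoursite.com/about-old-competitor [R=301,L]
```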
-
Hi Sandi,
I would make a one-pager with a catch-all redirect for all URLs. This way all the old URLs of the site will go to this page and you get no 404s. On this page you explain that the old site is gone and point visitors to your site. I would not put too many links on the page.
Eventually the authority of the competitor's site will fade, but if you have everything in place you will capture it first. You could also log in to the competitor's Search Console account to see which pages have good content, and move that content to your site before taking the old site offline.
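The catch-all one-pager described above could be implemented with something like this (Apache sketch on the old domain; the `/acquired` page path is a hypothetical placeholder):

```apache
# .htaccess on the old competitor's domain
RewriteEngine On

# Serve the announcement page itself normally (no redirect)
RewriteRule ^acquired/?$ - [L]

# Send every other old URL to the single announcement page
RewriteRule ^ /acquired [R=301,L]
```

The first rule stops processing for the announcement page itself so it does not redirect to itself; everything else lands on that page with a 301.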
Good luck!
Tymen