Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
How to make a second site in the same niche and do white hat SEO
-
Hello,
As much as we'd like it to, there's a possibility that our site will never recover from its Google penalties.
Our team has decided to launch a new site in the same niche.
What do we need to do so that Google will not mind us having 2 sites in the same niche? (Menu differences, coding differences, content differences, etc.)
We won't have duplicate content, but it's hard to make the sites not similar.
Thanks
-
I'm sorry to hear that. I would recommend asking the people behind your existing site's highest-quality, most powerful links to update their backlinks to point to your new site.
The advantage of dealing with legitimate site owners is that they are much easier to find and will actually help you with these kinds of requests. It's not the nightmare of trying to get hold of a black hat webmaster.
Beyond that, focus on creating a 100% legitimate website in a slightly different niche, supported by content marketing, inbound marketing, or whatever buzzword you prefer, for the (hopefully short) time it takes you to get your most powerful white hat links pointing at your new website.
Removeem.com is a wonderful tool for finding the names and contact info of webmasters. You can use it to make a polite request explaining that you have a new domain and would appreciate them updating the link to point at it.
Once you have moved the best backlinks away from your existing site, I would make the switch to your new site.
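To keep track of that outreach, a rough sketch like the one below could classify whether a referring page still links to the old domain or has already been updated. Everything here is hypothetical (the function name, the placeholder domains, the sample HTML), and a real workflow would fetch each referring page from your backlink export first:

```python
import re

def link_status(page_html, old_domain, new_domain):
    """Classify a referring page: does it still link to the old
    domain, already link to the new one, or link to neither?"""
    hrefs = re.findall(r'href=["\']([^"\']+)["\']', page_html, flags=re.I)
    has_old = any(old_domain in h for h in hrefs)
    has_new = any(new_domain in h for h in hrefs)
    if has_new and not has_old:
        return "updated"
    if has_old:
        return "needs outreach"
    return "link removed"

sample = '<a href="https://old-example.com/page">review</a>'
print(link_status(sample, "old-example.com", "new-example.com"))  # → needs outreach
```

Running this over each page in a backlink report gives you a simple to-do list of webmasters still worth contacting.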
I would also be upfront about the move: place text in a conspicuous location on your site saying that you are changing domain names.
If you feel that your livelihood is being jeopardized by this, I can definitely understand. I would put 110% into creating top-notch content and a user-friendly, mobile-ready design for your new brand. When you go live, you want to have something better than what you had before.
I'm sorry I don't know any methods that would be instant, but I would consider using pay-per-click to soften the blow.
I hope this is of help,
Thomas
-
Tom,
I appreciate the responses and they make sense, but I don't see a solution. I don't see our current site ever pulling out of the penalty no matter what I do, and we've got an income off of it.
Any ideas?
-
This is older, but:
http://googlewebmastercentral.blogspot.com/2010/11/best-practices-for-running-multiple.html
https://www.webmasterworld.com/google/4557285.htm
And here is a discussion of tactics that are now considered black hat:
http://www.nichepursuits.com/should-you-host-all-your-niche-sites-on-the-same-hosting-account/
It's not OK in Google AdWords either.
sorry for all the posts,
Tom
-
With all that said, I think if you go after a slightly different niche or offer things from a different angle, you're obviously doing twice the work.
Are you concerned that a 301 redirect would bring the penalty over to the new site?
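For context, a domain-wide 301 (the scenario that worry applies to, not a recommendation here) might look roughly like this in an Apache .htaccess file; the domain names are placeholders:

```apache
# Redirect every URL on the old domain to the same path on the new one.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-example\.com$ [NC]
RewriteRule ^(.*)$ https://www.new-example.com/$1 [R=301,L]
```

The `R=301` flag marks the redirect as permanent, which is exactly why people fear it carries link signals, good and bad, to the destination.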
sincerely,
Tom
-
I'm talking about taking the new site and building it with the white hat tactics you implemented after the penalty, the penalty the original site has yet to recover from. I know that creating sites that are essentially the same but contain unique content, just to get better rankings, is against the rules.
Once you have built the second site (I should say domain, because that's what it comes down to, right?) using the white hat methods currently employed on the existing one, it would be in your best interest to take the first site down when the second website goes live.
I know this is not the ideal situation because you probably have some good backlinks on the original, but two sites owned by the same person/company competing for the same place in the SERPs for the same niche would, I believe, be considered a method of rigging the system.
If you have one site, that is completely fine; if the second one goes after a different niche, that is completely fine too.
I am basing this on an e-commerce client of mine whose competitor was selling the exact same product, with unique content, across three domains.
The client reported this to Google, and either the spam team acted or there was an incredible coincidence, because two months later the reported sites could not be found in Google's index.
I hope that is of help,
Tom