Still a good idea to have a niche store?
-
I have had a niche store since 2008. It has a higher conversion rate than my main site, and does fairly well.
The site template is an exact copy of my main site, and all the product URLs are the same. The only difference is, it only has items from a certain category.
My question is: are niche stores still a good idea?
Would I be better off doing a page-to-page 301 to my main site and just focusing on a single site?
Am I better off, as far as SEO goes, with one main site?
If anyone has experience with this, I would like to hear about it. Any thoughts are appreciated.
-
If your niche site is working better for you right now there seems little point in 301-ing it to your main site. Not only do you risk losing rankings (as your main site may not rank for the terms your niche site used to), you've already said your main site doesn't convert as well - so even if you get the main site ranking in place of the niche site, you'll still likely lose sales.
If I were you I'd keep the two sites for now. As Gary says, don't interlink them, and try to make sure the content isn't exactly the same.
However, I would encourage you to consider what you might want to do in the future.
Where do you want to invest your time and focus your attention - on improving your main site, or your niche site? Continuing to invest in both (i.e. dividing your resources) may leave you with two sites that are just 'ok' rather than one really great site, which might not be the smartest move long term.
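If you do eventually consolidate, a page-to-page 301 (rather than a blanket redirect to the homepage) preserves the most link equity. A minimal sketch in Apache .htaccess, with purely hypothetical domain names - and since you said the product URLs are identical on both sites, the whole mapping may collapse into a single pattern:

```apache
# Hypothetical domains - adjust to your own sites.
# Because the niche site mirrors the main site's URL structure,
# every path can be redirected 1:1 to the same path on the main domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?niche-store\.example$ [NC]
RewriteRule ^(.*)$ https://www.main-store.example/$1 [R=301,L]
```

Any niche-site page with no equivalent on the main site should instead be redirected to its closest category page, so visitors and link equity aren't dumped on a 404.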
-
Without knowing a lot more....
The current state of Google would suggest you keep both sites. Do you interlink them? If so, DON'T.
It is easier to rank for niche results with a niche site; however, you will have a more powerful site overall with one main site that you can focus all your time on.
Is it possible to rewrite the product descriptions for one of the sites? That would make each site stronger and more independent of the other.
Anyone at Google would say combine them, but you're one small mistake away from a penalty leaving you with no business overnight. Keep them both (and rewrite the content).
The fact that you have higher conversion rates on the niche site shows that it is easier to rank and easier to show your customers you are an expert in the field. I would not mess with that.
Related Questions
-
Google Custom Search Engine: Good Idea?
I created a Google Custom Search Engine for our site, but I'm not sure implementing it is a good idea. When I tested it with the public URL, I noticed that ads show up on the search engine that could potentially move visitors away from our site to our competitors. Has anyone had success with implementing a Google Custom Search Engine? Do the pros outweigh the cons? Thanks, Ruben
-
Our Journey back to Good Rankings.
17-year-old support site on the topic of hair loss. The home page (and pretty much all internal pages) enjoyed Page 1, Position 1 ranking out of 64 million search results for 12 of those years for our main search phrase: hair loss. Other internal pages ranked #1 for other search phrases. I believe we were blessed by Google because we did everything the best we could: genuine, manually constructed, unique, relevant content created from the heart. Other generalized health sites linked to our site for more information on hair loss, and we had a couple thousand backlinks that we never had to pay for.
For the last 7 years or so, core content and the news center went stagnant, but user-driven content (discussion forums) continued chugging along. Very old CMS systems had created duplicate content (print pages, PDF pages, share pages) and the site was not mobile-friendly at all. By the end of 2013, our home page had been bumped to the middle of Page 2 for "hair loss" as Google began pushing us down, replacing our 700-page site dedicated to the topic of hair loss with random news articles and dermatology organization sites that had little more than a paragraph of content on the topic. Traffic and income dropped by over 75% with this change, and by 2015 we were looking at a 9-year-old site design that wasn't mobile-friendly and had no updated content outside of the forums for about as long.
Mid-2015 we began a frantic renovation. The store was converted to a mobile-friendly design and tossed into HTTPS, and our developer screwed up, forgetting to put canonicals in place. Soon after, our store rankings dropped to almost zero. By the end of 2015 this was fixed, and we were spending tens of thousands to convert a very large, very old site into WordPress with a responsive, mobile-friendly, lightning-fast page-load design. We had no Google Analytics data prior to this, either.
Actions taken from Jan 1, 2016 to May 2016:
- Static homepage + core content (80 pages) moved into WordPress, with proper 301s.
- News section running a 10-year-old "PostNuke" CMS (300 pages) moved into WordPress, with 301s.
- Forums running a 5-year-old vBulletin (160,000 pages) moved into XenForo, with 301s.
- Profiles section running a 10-year-old "SocialEngine" CMS (10,000 pages) moved into the new SocialEngine.
- Site moved from HTTP to HTTPS, with proper 301s. (The store CMS was already finished months prior, but sales dropped by 90% - almost zero.)
- The old forum CMS had created countless duplicate URLs; all of these were 410'd.
- The old forum CMS had 65,000 pointless member profile pages indexed; all 410'd.
- The old news CMS created 4+ duplicate pages for every article (print, etc.); all 301'd to the new article URL.
- Our HTACCESS file is thousands of lines long, trying to clean everything up and redirect everything back to one accurate, proper URL for each piece of content. It was a lot of work!
- After 17 years, we obviously had spammy sites linking to us. I quickly deleted the content the worst offenders were linking to, then hired an SEO person to run a disavow audit on the other 20,000 sites linking to us. He settled on around 300 URLs needing disavow, but commented that he didn't see any evidence we'd been penalized by Panda. He finished Friday and we will submit the disavow Monday.
- Ran a Screaming Frog audit on the site.
- Cleaned up Google Search Console fully, created properties, and submitted new sitemaps there. Monitored each property for the last 3 months and addressed 100% of issues raised.
- Revived Facebook, Twitter, Google+, Pinterest, and Instagram accounts.
- Began publishing new content in our /news/ section and cross-posting to social media.
- Began improving our title tags in the forums, as they were often pointless: "Hi! Need help!?"
Despite this, nothing has helped. Nothing has budged. Our traffic hasn't moved an inch since January. Sales have dropped 90% and site income has almost dried up.
I have taken out a $25,000 personal loan just to cover my mortgage and pay my bills while I attempt to identify what's going wrong and how to fix it. It bought me about 3 months, and that 3 months is almost up. I hired 2 or 3 different SEO experts with varying levels of experience. With no Google Analytics data to draw on, none of them could come up with a specific explanation for our drop in ranking over the last 4 years. That's why I took the approach of just "doing everything" to fix all the problems identified, and then crossing my fingers. It hasn't worked. As of today our home page is not even found in Google for our main search phrase: hair loss. It's simply not there. At all. The only thing that is ranking is our forums, at position 67, which is horrible. I don't understand why a site that was doing so well for over a decade has now been completely dropped from Google, without a single notice in Search Console or otherwise explaining any problems. I realize this is a massive undertaking, and an equally massive post, but any time you can spend helping me will be forever appreciated.
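For anyone facing a similar cleanup, the 410/301 rules described above can often be collapsed from thousands of one-off lines into a handful of patterns. A hedged sketch in Apache .htaccess, with made-up URL shapes standing in for the old CMSs' duplicate pages:

```apache
# Illustrative patterns only - the real URL shapes depend on the old CMSs.
RewriteEngine On
# Old print/PDF duplicates of articles -> 301 to the canonical article URL.
RewriteRule ^news/print/(\d+)$ /news/article-$1/ [R=301,L]
# Obsolete member profile pages -> 410 Gone, telling Google to drop them.
RewriteRule ^forum/member\.php$ - [G]
```

The [G] flag returns 410 Gone; pattern rules like these keep the file maintainable compared with one redirect line per URL.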
-
Need Advice - Google Still Not Ranking
Hi Team - I really need some expert-level advice on an issue I'm seeing with our site in Google. Here's the current status.
We launched our website and app in the last week of November 2014 (soft launch): http://goo.gl/Wnrqrq. When we launched we were not showing up for any targeted keywords, long-tailed included - even the title of our site in quotes. We ranked for our name only, and even that wasn't #1. Over time we built up some rankings, although they were very low (120 - 140). Yesterday, we're back to not ranking for any keywords.
Here's the history: while developing our app, and before I took over the site, the developer used a thin affiliate site to gather data and run a beta app over the course of 1 - 2 years. Upon taking on the site and moving to launch the new website/app, I discovered what had been run under the domain. Since then the old site has been completely removed and rebuilt, with all associated URLs (.uk, .net, etc.) and subdomains shut down. I've allowed all the old spammy pages (thousands of them) to 404. We've disavowed the old domains (.net, .uk, which were sending a ton of links to this one), along with some links pointing to our domain that seemed a little spammy. There are no manual actions or messages in Google Webmaster Tools.
The new website uses HTTPS (SSL) for the entire site, scores 98/100 for mobile usability (we beat our competitors on Google's PageSpeed tool), has been moved to a business-level hosting service, has 301s correctly set up, has added terms and conditions, has all our social profiles linked, has WMT/Analytics/YouTube linked, has some AdWords running, uses rel="canonical" - all the SEO 101 stuff and more. When I run the page through the Moz tool for a specific keyword we score an A. When I did a crawl test everything came back looking good. We also pass using other tools. Google WMT shows no HTML issues. We rank well on Bing, Yahoo, and DuckDuckGo.
However, for some reason Google will not rank the site, and since there is no manual action I have no way to submit a reconsideration request. From an advanced stance, should we bail on this domain and move to the .co domain (which we own, but which hasn't been used before)? If we 301 this domain over, since all our marketing points to the .com, will this issue follow us? I see a lot of conflicting information on whether algorithmic issues follow domains. Some say they do, some say they don't, and some say they do because a lot of the time people don't actually fix the issue. However, this is a brand-new site, and we're following all of Google's rules. I suspect there is an algorithmic penalty (action) against the domain because of the old thin affiliate site that was used for the beta and data-gathering app. Are we stuck till Google does an update? What's the deal with moving us up, then removing us again? Thoughts, suggestions??? I purposely used a short URL to leave out the company name - please respect that, since I don't want our issues to pop up in a web search. 🙂
-
Penguin Strategy Idea
Hello! While I sit and wait for Penguin to hopefully ease up on my site, would it be smart to create a new site and then forward the old Penguin-hit site to it after the dust clears and any issues are resolved? I have heard some people say you never recover from an algorithmic penalty, while others say to wait it out. So I'm just curious what you think. Would you ditch a site that is not super followed or popular, in an attempt to start fresh with a site that has no bad backlinks? Thank you for your thoughts.
-
Does anyone have an idea of the benefits of Google Analytics Premium?
We've been having a discussion about the GA Premium service here in our office, trying to weigh up the pros and cons... For the most part, all you seem to gain access to is more support from Google. We're trying to find out if that is the case, or if you gain extra information, such as an insight into the search terms that must not be named. Of course I'm talking about the (Not Set) data... This section of data is ever increasing; yes, I know we can access certain terms through Webmaster Tools, but it was so much easier (in the good ol' days) when all the data was under one roof! Any thoughts, opinions, or even more questions would be greatly appreciated. I look forward to your responses. Anthony
-
Have I been Hit by a Penguin? No Warning in Webmaster / Some Pages still Rank
Hi all, I have recently signed up to Moz as I have seen a large drop in the turnover of a site I work with, as well as a slump in visitors. I know part of this slump is the transition of Google Product Search from free to paid, which is chewing through our AdWords budget quicker. The other part, though, seems a little more tricky. I have always been under the impression from reading online that an algorithm update would see a site destroyed for most terms, with a notification generated in Webmaster Tools; however, the site still seems to rank for some terms, while for others it has fallen off the face of the earth. As you can see in the attachment, Webmaster Tools is showing much decreased visibility, and Moz agrees with this. Key terms that have lost rank have done so by around 4-10 positions. The content on the site has all been hand-written by myself, but some of the pages are a little "stale", so I am currently re-writing every product page on the site (1,000 products or so); all my product pages grade a minimum B, with 99% A, on the Moz page grader. I am keeping my fingers crossed that fresh content should help get Google interested again. My real question, though, is: is this Penguin, or is this just stale content?
-
ECommerce site being "filtered" by last Panda update, ideas and discussion
Hello fellow internet-goers! Just as a disclaimer, I have been following a number of discussions, articles, posts, etc. trying to find a solution to this problem, but have yet to find anything conclusive, so I am reaching out to the community for help.
Before I get into the questions I would like to provide some background: I help a team manage and improve a number of medium-to-large eCommerce websites. Traffic ranges anywhere from 2K to 12K+ per day, depending on the site. Back in March one of our larger sites was "filtered" from Google's search results. I say "filtered" because we didn't receive any warnings and our domain was/is still listed in the first search position. About 2-3 weeks later another site was "filtered", and then 1-2 weeks after that, a third site. We have around ten niche sites in total; about seven of them share an identical code base (about an 80% match). This isn't that uncommon, since we use a CMS platform to manage all of our sites that holds hundreds of thousands of category and product pages. Needless to say, April was definitely a frantic month for us.
Many meetings later, we attributed the "filter" to duplicate content stemming from our product database and written content (shared across all of our sites). We decided to use rel="canonical" to address the problem. Exactly 30 days after being filtered, our first site bounced back (like it was never "filtered"); however, the other two sites remain under Google's thumb.
Now for some questions: Why would only 3 of our sites be affected by this "filter"/Panda if many of them share the same content? Is it a coincidence that it was an exact 30-day "filter"? Why has only one site recovered?
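For readers hitting the same cross-site duplicate-content problem, the rel="canonical" fix described above goes in the head of each duplicate page and points at the one copy you want indexed. A sketch with hypothetical domains and paths - and note that a cross-domain canonical is a strong hint to Google, not a guaranteed directive:

```html
<!-- On the duplicate page, e.g. niche-site.example/widgets/blue-widget -->
<!-- Hypothetical URLs; the href must point at the preferred, indexable copy. -->
<head>
  <link rel="canonical" href="https://www.main-site.example/widgets/blue-widget" />
</head>
```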
-
Large site with faceted navigation using rel=canonical, but Google still has issues
First off, I just wanted to mention that I did post this on one other forum, so I hope that is not completely against the rules here or anything - just trying to get an idea from some of the pros at both sources. Hope this is received well. Now for the question...
"Googlebot found an extremely high number of URLs on your site." Gotta love these messages in GWT. Anyway, I wanted to get some other opinions here, so if anyone has experienced something similar or has any recommendations, I would love to hear them.
The site is very large and utilizes faceted navigation to help visitors sift through results. I have had rel="canonical" implemented for many months now, so that each page URL created by the faceted-nav filters points back to the main category page. However, I still get these damn messages from Google every month or so saying that they found too many pages on the site. My main concern, obviously, is wasting crawler time on all these pages where I am already doing what they ask and telling them to find the content on page X. So at this point I am thinking about using the robots.txt file to handle these, but wanted to see what others around here thought before I dive into this arduous task. Plus, I am a little ticked off that Google is not following a standard they helped bring to the table. Thanks in advance to those who take the time to respond.
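If the canonicals alone aren't taming crawl volume, the robots.txt option weighed above might look like the sketch below, assuming (hypothetically) that the faceted filters are applied via query parameters. One important caveat: a page blocked by robots.txt can't be crawled at all, so Google will never see the rel="canonical" on it - blocking and canonicalizing the same URLs work against each other.

```text
# robots.txt - hypothetical parameter names standing in for the real filters.
User-agent: *
Disallow: /*?color=
Disallow: /*?size=
Disallow: /*&sort=
```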