Failed microsites that negatively affect main site: should I just redirect them all?
-
While they are great domain names, I suspect my 7 microsites are considered spammy and have triggered a filter on my main e-commerce site that keeps our important head keywords from showing up in search. Should I consider them a sunk cost and redirect them all to my main e-commerce site, or is there any reason that would make things worse? I've fixed just about everything I can think of in response to Panda and Penguin, before which we were on the first page for everything. That includes adding hundreds of pages of unique and relevant content, in the form of buyers guides and copy on e-commerce category pages, to resolve thin-content issues. Then I hid URL parameters in Ajax, sped up the site significantly, started generating new links... nothing. I have tons of new keywords for other categories, but I still clearly have that filter on those few important head keywords. The anchor text on the microsite links pointing to the main site is typically not exact match, so I don't think that's the issue. It has to be that the sites themselves are considered spammy. My bosses are not going to like the idea because they paid for those awesome domains, but would the best move be to redirect them to the e-commerce site?
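For anyone weighing the mechanics of the redirect option: a sitewide 301 from a microsite to the main site is typically done in the microsite's `.htaccess`. This is only a sketch, assuming Apache with mod_rewrite enabled; the domain names are placeholders, not the actual sites:

```apache
# .htaccess on the microsite (placeholder domains)
RewriteEngine On
# Match any request to this microsite, www or not
RewriteCond %{HTTP_HOST} ^(www\.)?microsite-example\.com$ [NC]
# 301 everything to the main e-commerce site, preserving the path
RewriteRule ^(.*)$ https://www.mainstore-example.com/$1 [R=301,L]
```

Whether to pass the path through as above or point everything at a single relevant category page depends on how closely the microsite's URLs map to pages on the main site.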
-
Thanks for the insight everyone!
I was also thinking about the possibility of canonicalizing them to pages on the e-commerce site, but I'm afraid that might have the same detrimental effect as redirecting them.
@Nakul and Todd: The content is not horrible, but the sites are basically set up as blog rolls. I don't think anyone particularly "likes" the content, nor do they share it. For the most part, it's product reviews and announcements from our manufacturers, and I have stopped adding content to the sites because it seemed like a waste when I could be generating new and better content for the actual e-commerce site and our real blog. That said, there is a tremendous amount of content on these sites from the last 4 years. It was apparently working very well for the company, but not after Panda and Penguin. Some of the domains are exact matches for our head keywords (the ones we lost rankings for); others are exact matches for product titles or model numbers. I don't think there was ever an unnatural links warning from Google, but I've seen sites not get a message in GWT and still clearly be penalized.
@Moosa: Both. Failed in terms of not generating conversions (or possibly generating a few here and there), but relative to the likely negative effect, I'm just not sure how to handle this. They are our own sites, so disavowing them wouldn't do much good. I could go through and manually remove links and canonicalize them, but I'm wondering if it's better to just take the sites down...
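For reference, the cross-domain canonical option mentioned above would mean adding a tag like this to the head of each microsite page, pointing at the closest-matching page on the main site (the URL here is a placeholder):

```html
<!-- In the <head> of a microsite page; target URL is a placeholder -->
<link rel="canonical" href="https://www.mainstore-example.com/category/widgets" />
```

One caveat worth knowing: Google treats a cross-domain rel=canonical as a hint rather than a directive, so unlike a 301 it may simply be ignored if the two pages aren't seen as near-duplicates.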
-
Now, when you say failed microsites, do you mean failed in terms of content, or in converting users into customers, or failed because they have spammy links in their profiles and took a hit from Panda or Penguin?
If they took a hit from Panda or Penguin, then redirecting the microsites to the main site is the worst idea, as they will pass their (penalized) link juice to the main site, which makes the situation worse!
If they failed in terms of gaining SERP rankings and converting users into customers, then you can redirect them to the main site...
If redirection is the only option you have, first try to clean up their link profiles by sending link removal emails and using the link disavow tool, and then move on to the redirection.
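For what it's worth, the disavow file mentioned above is just a plain-text file uploaded through Google's disavow links tool: one full URL or one `domain:` entry per line, with `#` for comments. A minimal sketch (the domains are made-up examples, not real sites):

```text
# Example disavow.txt (placeholder domains)
# Disavow every link from an entire domain:
domain:spammy-directory-example.com
# Or disavow a single linking page:
http://spammy-blog-example.com/widget-roundup.html
```

Do the manual removal outreach first; the disavow file is meant for the links you can't get taken down.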
Hope this helps!
-
I agree with Nakul - if they do not have great content and great inbound natural links, they can potentially do harm to your main website.
It is worth extensively evaluating the situation before proceeding.
-
Do they have great content? Do people like the content? Do actual users share or interact with your content? Is it ideation around your e-commerce products, or is it more like information about your products? Does it add value to your customers? How much content do you have on your microsites? Are they on the same/similar topics?
It varies case by case, but based on the answers above, you could indeed consider doing what you are thinking. Just make sure it adds value.
Also, have you ever received a penalty or unnatural links warning?
Related Questions
-
Duplicate content management across a subdirectory-based multisite where subsites are projects of the main site and naturally adopt some ideas and goals from it
Hi, I have the following problem and would like to know the best solution for it: I have a site, codex21.gal, that is actually part of a subdirectory-based multisite (galike.net). It has a domain mapping setup, but it is hosted in a folder of the galike.net multisite (galike.net/codex21). My main site (galike.net) works as a frame brand for a series of projects aimed at promoting the cultural & natural heritage of a region in NW Spain through creative projects focused on the entertainment, tourism and educational areas. The projects themselves are a concretion (putting into practice) of the general vision of the brand, which acts more like a company brand. CodeX21 is one of those projects; it has its own logo, etc., and is actually like a child brand, yet more focused on a particular theme. I don't want to hide that it is part of the GALIKE brand (in fact, I am planning to add the Galike logo to it, and a link to the main site in the menu). I will be making other projects, each with their own brand, hosted in subsites (subfolders) of the galike.net multisite. Not all of them will have their own TLD mapped; some could simply be www.galike.net/projectname. The codex21.gal subsite might become galike.net/codex21 if that would be better for SEO. Now, the problem is that my subsite codex21.gal restates some principles, concepts and goals that have been defined (in other words) on the main site. Thus, there are some ideas (such as my particular vision of the possibilities of sustainable exploitation of that heritage, and concepts I have developed myself such as "narrative tourism" and "the geographical map as a non-linear story") that need to be present here and there on the subsite, since they are also the philosophy of the project.
BUT it seems that Google can penalise overlapping content in subdirectory-based multisites, since they can look like a collection of doorways to access the same product.(*) I have considered substituting those overlapping ideas with links to the main page of the site, though it seems unnatural from the user's point of view to be taken off the page to read a piece of info that is actually part of the project description (every other child project of Galike might have the same problem). I have also considered taking the codex21 subsite out of the network and hosting it as a single site on another server, but the problem of duplicated content might persist, and anyway, I should link it to my brand Galike somewhere, because that's kind of the "production house" behind it. So which would be the best (white hat) strategy, from an SEO point of view, to handle this brand-project philosophy overlap? (*) “All the same IP address — that’s really not a problem for us. It’s really common for sites to be on the same IP address. That’s kind of the way the internet works. A lot of CDNs (content delivery networks) use the same IP address as well for different sites, and that’s also perfectly fine. I think the bigger issue that he might be running into is that all these sites are very similar. So, from our point of view, our algorithms might look at that and say “this is kind of a collection of doorway sites” — in that essentially they’re being funnelled toward the same product. The content on the sites is probably very similar. Then, from our point of view, what might happen is we will say we’ll pick one of these pages and index that and show that in the search results. That might be one variation that we could look at. In practice that wouldn’t be so problematic because one of these sites would be showing up in the search results.
On the other hand, our algorithm might also be looking at this and saying this is clearly someone trying to overdo things with a collection of doorway sites and we’ll demote all of them. So what I recommend doing here is really trying to take a step back and focus on fewer sites and making those really strong, and really good and unique. So that they have unique content, unique products that they’re selling. So then you don’t have this collection of a lot of different sites that are essentially doing the same thing.” (John Mueller, Senior Webmaster Trend Analyst at Google. https://www.youtube.com/watch?time_continue=1&v=kQIyk-2-wRg&feature=emb_logo)
White Hat / Black Hat SEO | | PabloCulebras0 -
Do you see sites with unfixable Penguin penalties?
Hello, We have a site with 2 Penguin update penalties (drops in traffic) and one quality penalty (another drop in traffic), all from years ago; both were just drops in rankings, with no messages in Google Search Console. Now that Penguin is hard-coded, do you find that some sites never recover even with a beautiful disavow and cleanup? We've added content and still have some quality errors, though I thought they were minor. This client used to have doorway sites and paid links, but is now squeaky clean, with a disavow done a month ago, though most of the cleanup was done 9 months ago by deleting the doorways and paid links. Is this a quality problem, or is our site permanently gone? Let me know what information you need. Looking for people with a lot of experience with other sites and Penguin. Thanks.
White Hat / Black Hat SEO | | BobGW2 -
Does type of hosting affect SEO rankings?
Hello, I was wondering whether hosting on shared, versus VPS, versus dedicated servers matters at all for the rankings of websites, given that all other factors are exactly equal. I know this is a big question with many variables, but mainly I am wondering if, for example, the real risk is resource usage taking a site down under heavy traffic, making it un-crawlable if that happens at the moment a bot is trying to index the site (factoring out the UX of a downed site). Any and all comments are greatly appreciated! Best regards, Mark
White Hat / Black Hat SEO | | uworlds0 -
How to make second site in same niche and do white hat SEO
Hello, As much as we would like, there's a possibility that our site will never recover from its Google penalties. Our team has decided to launch a new site in the same niche. What do we need to do so that Google will not mind us having 2 sites in the same niche? (Menu differences, coding differences, content differences, etc.) We won't have duplicate content, but it's hard to make the sites not similar. Thanks
White Hat / Black Hat SEO | | BobGW0 -
Google admits it can take up to a year to refresh/recover your site after a Penguin penalty is revoked!
I found myself in an impossible situation where I was getting information from various people who seemed to be "know-it-alls", but everything in my heart was telling me they were wrong when it came to the issues my site was having. I have been on a few Google Webmaster Hangouts and found many answers to questions I thought had caused my Penguin penalty. After taking much of the advice, I submitted my Reconsideration Request for the 9th time (might have been more) and finally got the "revoke" I was waiting for on the 28th of May. What was frustrating was that on May 22nd there was a Penguin refresh. This, as far as I knew, was what was needed to get your site back up in the organic SERPs. My disavow had been submitted in February and only had a handful of links missing between then and the time we received the revoke. We patiently waited for the next Penguin refresh with assurance from John Mueller at Google that we were heading in the right direction (btw, John is a great guy and really tries to help where he can). The next update came on October 4th and our rankings actually got worse! I spoke with John and he was a little surprised but did not go into any detail. At this point you have to start to wonder WHAT exactly is wrong with the website. Is this where I should rank? Is there a much deeper Panda issue? We were on the verge of removing almost all content from the site or even changing domains, despite the fact that it was our brand name. I then created a tool that checked the last cached date of every link we had in our disavow file. The thought process was that Google had not re-crawled all the links, so they were not factored into the last refresh. This proved to be incorrect: all the links had been re-cached in August and September, nothing earlier than that, which would have indicated a problem with them not being cached in time. I spoke to many so-called experts who all said the issue was that we had very few good links left, content issues, etc.
Blah blah blah, heard it all before, and I've been in this game since the late '90s; the site could not rank this badly unless there was an actual penalty, as spam sites ranked above us for most of our keywords. So just as we were about to demolish the site, I asked John Mueller one more time if he could take a look at it, and this time he actually took the time to investigate, which was very kind of him. He came back to me in a Google Hangout in late December, and what he said was both disturbing and a relief at the same time: the site STILL had a Penguin penalty, despite the disavow file being submitted in February, over 10 months earlier, and the revoke in May. I wrote this to give everyone here who has an authoritative site, or just an old one, hope that not all is lost just yet if you are still waiting to recover in Google. My site is 10 years old and is one of the leaders in its industry. Sites that are only a few years old and have had unnatural link building penalties have recovered much faster in this industry, which I find ridiculous, as most of the time the older authoritative sites are the big trustworthy brands. This explains why Google SERPs have been so poor for the last year: the big sites take much longer to recover from penalties, letting the smaller, less trustworthy sites prevail. I hope to see my site recover in the next Penguin refresh, with the comfort of knowing that it is currently still being held back by the Penguin refresh situation. Please feel free to comment below on anything you think is relevant.
White Hat / Black Hat SEO | | gazzerman10 -
Do some sites get preference over others by Google just because? Grandfathered theory
So I have a theory that Google "grandfathers" in a handful of old websites from every niche and that no matter what the site does, it will always get the authority to rank high for the relevant keywords in the niche. I have a website in the crafts/cards/printables niche. One of my competitors is http://printable-cards.gotfreecards.com/ This site ranks for everything... http://www.semrush.com/info/gotfreecards.com+(by+organic) Yet, when I go to visit their site, I notice duplicate content all over the place (extremely thin content, if anything at all for some pages that rank for highly searched keywords), I see paginated pages that should be getting noindexed, bad URL structure and I see an overall unfriendly user experience. Also, the backlink profile isn't very impressive, as most of the good links are coming from their other site, www.got-free-ecards.com. Can someone tell me why this site is ranking for what it is other than the fact that it's around 5 years old and potentially has some type of preference from Google?
White Hat / Black Hat SEO | | WebServiceConsulting.com0 -
Can a "Trusted Retailer" badge scheme affect us in the SERPs?
Hi Guys, In the last week our website saw a drop for some of our biggest and best converting keywords, and we think it might be down to us rolling out a “Trusted Retailer” badge scheme. We sell our products directly to consumers via our website, but we also sell our products to other online resellers. We think badges are a good way to show the consumer that we trust a site. On the 17th September we sent out badges to about 39 of our best retailers, two of whom have already put them on their sites. Instead of sending them a flat jpeg, we sent them HTML files containing code that pulled in the image from our servers. We wanted to host the image to make sure that we always had some leverage. So if a company stopped selling our products, or the quality of their site went down, we could just remove the badge. Whilst at it, we stuck a link in there pointing to an FAQ on our website all about trusted retailers and what people need to look out for. We chose the anchor text “(brand name) Trusted Retailer”, because that seemed to be the most relevant. The code looks like this: (our brand) Trusted Retailer You might notice that there is a div just before the link. This is there to stop the user from clicking on the top 65% of the badge (because this contains the shop name and ID number), and we also used a negative text-indent to move the anchor text out of the way. But right underneath this is our Logo, so it’s almost a hidden link, but you can still click it. So far the badge has been put in on two sites, one of which isn’t so great and maybe looks a tiny bit spammy. (They sell mostly through ebay as opposed to on their main site). Also, these sites seem to have put it on most of their pages! So my questions are: Is this seen as black or grey hat? Is it the fact we put in anchor text with our brand? Or is it the fact the url is transparent in the coding? Or is it the fact the sites are using sitewide links? In any case, would Google react so quickly as to penalise us in two days?
If this is the issue, do you think there’s anything we can do to stop getting penalised? (Other than having to e-mail 39 retailers back and getting them to take the badges down). Thoughts much appreciated – we do our SEO in-house and are still learning every day… Thank you James
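The actual snippet didn't survive formatting above, but based on the description (hosted badge image, a click-blocking div over the top 65%, and brand anchor text pushed out of view with a negative text-indent), the embed presumably looked something like this hypothetical reconstruction; every name and URL here is a placeholder:

```html
<!-- Hypothetical reconstruction of the badge embed; all URLs/names are placeholders -->
<div style="position: relative; width: 150px; height: 200px;">
  <!-- Transparent overlay blocking clicks on the top 65% (shop name and ID area) -->
  <div style="position: absolute; top: 0; left: 0; width: 100%; height: 65%; z-index: 2;"></div>
  <!-- Badge image as background; anchor text pushed off-screen with text-indent -->
  <a href="https://www.brand-example.com/trusted-retailer-faq"
     style="display: block; width: 150px; height: 200px; text-indent: -9999px;
            background: url('https://www.brand-example.com/badges/retailer-123.png') no-repeat;">
    Brand Example Trusted Retailer
  </a>
</div>
```

If that is roughly the shape of it, the combination of off-screen anchor text and a sitewide keyword-rich link is the part most likely to be read as manipulative, regardless of intent.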
White Hat / Black Hat SEO | | OptiBacUK0 -
Should this site be punished?
Every summer for the past 4 years one of our customer's competitors suddenly has a big jump in Google's (.co.uk) rankings for some of the main industry phrases, particularly "air conditioning". We were always under the impression that they bought links before the busy summer season, as they have these strange massive jumps in the rankings. (For the rest of the year they often drop back down.) I recently checked out some of the back-links going to their site and noticed something I'd not seen before. Of the (approx) 480 links that showed up, around 80% of the SourceURLs ended with "?Action=Webring" (see 1st attached image). To me it doesn't look natural at all, and I'm surprised that Google hasn't picked up on it. Their site is www.aircon247.com. It had been mentioned to me that this may be to do with link-sharing sites (which I assume is black hat), but I'm not 100% sure that they are doing this. They also have an identical long spammy-looking footer at the bottom of every page which is clearly only for search engines to see. We reported it to Google a year ago but no action was taken. Do you think that it is acceptable to have it on every page? (see 2nd attached image) I would be interested to know your thoughts on both of these, and whether this would be a dangerous tactic to try and emulate? Gc5MU.png iXGA9.png
White Hat / Black Hat SEO | | trickshotric0