Call for Help: Hit Badly by "Medic," and Another 30% Loss with the Sept 28th Update
-
Hi Everyone,
I am not sure how this is all happening. We have been online for about 15 years, and now we are at our lowest amount of traffic in about 10 years. Our sites are www.bestpricenutrition.com and www.mysupplementstore.com. We sell commodity items, but I have focused on unique product descriptions, tons of UGC, blog posts and guides for awhile now and it has always done us well. Until as of late.
This is what I feel led up to this, but I am hoping there is something I missed.
May 1st, 2018: Migrated www.bestpricenutrition.com and www.mysupplementstore.com from Shopify. The sites are similar, but almost all of the content is unique. We purchased www.mysupplementstore.com about 8 years ago. It had a ton of traffic and sales, which is why we didn't just redirect it.
Around May 25th: www.mysupplementstore.com took a big hit and lost almost 40% of its traffic. Nothing happened to www.bestpricenutrition.com; its traffic actually increased.
Aug 1st Update: www.mysupplementstore.com lost another 25% of its traffic. www.bestpricenutrition.com lost about 40% of its traffic.
Sept 28th: Nothing happened to www.mysupplementstore.com, but www.bestpricenutrition.com lost another 30% of its traffic.
So I have been trying to figure out if there is anything technically wrong, but nothing seems to be. These are the issues we discovered in August.
- During the migration, the reviews from each site were syndicated to both websites. There were thousands of them. This was resolved in mid-August.
- During the migration, the company doing the migration pushed our blog posts to both websites, duplicating hundreds of posts across both sites. This was resolved in mid-August.
- We found that a disgruntled employee, instead of writing unique content for our product pages, had been copying descriptions from one site to the other. This affected about 100 product pages, which we have since resolved.
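As a sanity check that the copied product pages are truly cleaned up, you can diff descriptions between the two catalogs programmatically. This is just a minimal sketch, not anything from the post itself: the catalog dicts, SKU keys, and the 0.9 threshold are all assumptions, and it uses Python's standard-library `difflib` for a rough similarity score.

```python
from difflib import SequenceMatcher

def description_similarity(a: str, b: str) -> float:
    """Rough 0-1 similarity ratio between two product descriptions."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def find_near_duplicates(catalog_a: dict, catalog_b: dict, threshold: float = 0.9):
    """Flag SKUs whose descriptions on the two sites are near-identical.

    catalog_a / catalog_b map SKU -> description text (hypothetical shape);
    threshold is an assumed cutoff, tune to taste.
    """
    flagged = []
    for sku, desc_a in catalog_a.items():
        desc_b = catalog_b.get(sku)
        if desc_b is None:
            continue  # product only exists on one site, nothing to compare
        ratio = description_similarity(desc_a, desc_b)
        if ratio >= threshold:
            flagged.append((sku, round(ratio, 2)))
    return flagged
```

Running this against exports from both stores would give a quick list of any pages that still share copy after the cleanup.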
What's Left
- I noticed that www.bestpricenutrition.com has hundreds of blog posts that get hardly any traffic. I already trimmed this low-traffic content from www.mysupplementstore.com and am still working through www.bestpricenutrition.com.
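For the trimming step above, a simple script over an analytics export can surface prune candidates. This is a hypothetical sketch only: the CSV column names (`url`, `sessions`), the `/blog/` path prefix, and the 10-session cutoff are assumptions, not details from the post.

```python
import csv
from io import StringIO

def prune_candidates(csv_text: str, path_prefix: str = "/blog/",
                     max_sessions: int = 10) -> list:
    """Return blog URLs whose session count falls at or below the cutoff.

    csv_text is an analytics export with 'url' and 'sessions' columns
    (assumed format).
    """
    reader = csv.DictReader(StringIO(csv_text))
    return [row["url"] for row in reader
            if row["url"].startswith(path_prefix)
            and int(row["sessions"]) <= max_sessions]
```

Each URL on the resulting list is then a candidate to improve, consolidate, or remove, rather than an automatic deletion.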
I have been in this industry since 2003 and survived 2012, but I have exhausted everything I know trying to figure this out. I know it's another sob story, but I'm trying to keep everyone's jobs here, and it doesn't look like that's going to happen. Any help would be greatly appreciated.
-
Hi Jeff,
This is a tough one. Very sorry to hear about your business losses.
As I'm sure you know, several of the recent "core algorithm" updates from Google have focused on site quality. Via their Quality Rater program, they ask human reviewers, working from specific guidelines, to dig not only into the content of a site but also into the background of its owners and writers. They test substantial new algorithm changes with this group before updating results for all users, with a specific focus on "Your Money or Your Life" (YMYL) content: pages that deal with serious, potentially life-altering topics.
A site offering health information and supplement advice, and also selling those supplements, is in the crosshairs of this kind of review.
This is just my take on it, but I expect your losses are largely due to perceived brand trustworthiness. I'd consider toning down your on-site product promo imagery and ensuring trust-building elements (badges, ratings, testimonials, any accreditations you have) are clearly visible above the fold. I'd also recommend building a cleaner, more clinical layout and typographical treatment for your advice content (blog posts, articles, etc.). You might also consider limiting the array of supplements you promote and sell, staying away from the controversial and potentially dangerous.
I also, unfortunately, would not expect immediate results from this. These core algorithm updates come only several times a year. I worked with an auto parts retailer who lost 30% of their organic traffic and revenue overnight in the "Phantom III" update (which seemed to be a general "quality" update similar to recent core algorithm updates); they had some UX issues, content that seemed to exist just for SEO, and so on. About a year after their big drop, they made a big push to improve UX and quality and add trust-building elements to their pages, and six months after this design/UX overhaul, they regained all of their traffic in the "Phantom V" update.
I suspect there is nothing technically broken with your site, and that duplicate content and similar issues are not holding you back much; rather, quality raters preferred other sites for the keywords you've been ranking for.
First impressions of the brand, quality and trustworthiness of content, etc. have a big impact here, but these reviewers are also instructed to verify that the owners/publishers of the site are accredited and trustworthy according to other online sources:
"Many websites are eager to tell users how great they are. Some webmasters have read these rating guidelines and write 'reviews' on various review websites. But for Page Quality rating, you must also look for outside, independent reputation information about the website. When the website says one thing about itself, but reputable external sources disagree with what the website says, trust the external sources."
Not to suggest you have scam accusations or the like showing up online, but it's something additional I'd want to look into.
Best of Luck,
Mike