Main menu duplication
-
I am working on a site that went through a migration to Shopify at the same time as a Google update in October. So, problems from day 1.
Over the past six weeks, all main menu categories have fallen off a cliff. All aspects of the site (technical, link profile and on-page) have been reviewed, and the site is in better shape than several competitors that outrank it.
One issue I'd like some feedback on is the main menu, which appears four times in the source:
- desktop
- desktop (sticky)
- mobile
- mobile (sticky; it appears as a second desktop sticky, but I assume it's for mobile)
Only the top-level menu items are "duplicated"; the nested sub-menu items are included just once, within the last (mobile) menu.
So:
- the desktop menu in the source doesn't include any of the sub-menu items; the mobile version carries all of these
- there are four versions of the top-level main menu items in the source
Should I be concerned? Given that we already have significant ranking issues, should this be cleaned up?
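For anyone wanting to quantify this kind of duplication themselves, here's a minimal sketch that counts how often each link href appears in the raw page source. The HTML below is a hypothetical, heavily trimmed stand-in for the four menu copies described above (the collection URLs and class names are invented for illustration); in practice you'd feed in the real page source:

```python
from html.parser import HTMLParser

# Hypothetical trimmed source mimicking the four menu copies:
# desktop, desktop-sticky, mobile and mobile-sticky.
SAMPLE_SOURCE = """
<nav class="desktop"><a href="/collections/shirts">Shirts</a></nav>
<nav class="desktop-sticky"><a href="/collections/shirts">Shirts</a></nav>
<nav class="mobile"><a href="/collections/shirts">Shirts</a>
  <a href="/collections/shirts/slim-fit">Slim Fit</a></nav>
<nav class="mobile-sticky"><a href="/collections/shirts">Shirts</a></nav>
"""

class LinkCounter(HTMLParser):
    """Counts how many times each href appears in an <a> tag."""
    def __init__(self):
        super().__init__()
        self.counts = {}

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.counts[href] = self.counts.get(href, 0) + 1

counter = LinkCounter()
counter.feed(SAMPLE_SOURCE)
for href, n in sorted(counter.counts.items()):
    print(f"{href}: {n}")
# The top-level link shows up 4 times; the sub-menu link only once.
```

A count of 4 for every top-level item, against 1 for each sub-menu item, would match exactly the pattern described in the question.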
-
A couple of other issues were also uncovered with browser rendering on certain collections. I've cleaned up the menu duplication and those rendering issues, and am monitoring the results.
-
You are right to be concerned, and many in the SEO community don't feel that Shopify has 'nailed' SEO yet. It started as a slightly nicer version of Wix where you could build your own site pretty easily, but it also handles a lot of the eCommerce side, which makes it very attractive to business owners. Sadly, it's not great for SEO.
The community is expanding and the number of plugins and add-ons for Shopify is broadening. The problem is that many developers working on the Shopify platform don't have much SEO experience (at least, that has been my experience of the Shopify community).
If you are finding that certain items are missing from the 'base' (non-modified) source code, that is a concern. Google can technically crawl generated content and links (which are rendered client-side), but that requires headless browsers and client-side rendering, which on average takes around 10x longer than basic source-scraping. Google's mission is to 'index the web', so although they have this newer technology and functionality, they wouldn't arbitrarily take a 10x efficiency hit across all indexation (that would be nutty and would go against their prime directive).
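A quick way to audit this is to check which links exist in the raw, unrendered HTML (what a basic, non-rendering crawl sees) versus links you expect to be there. This sketch uses an invented inline snippet in place of a real fetch, with a hypothetical sub-menu URL standing in for a link that only a JavaScript render would reveal:

```python
import re

# Hypothetical raw HTML as returned by a plain GET (no JavaScript executed);
# the sub-menu here is injected client-side, so it is absent from the source.
RAW_HTML = """
<nav><a href="/collections/shirts">Shirts</a></nav>
<script>/* menu.js builds the sub-menu links in the browser */</script>
"""

# Links we'd want a basic (non-rendering) crawl to be able to discover.
expected = ["/collections/shirts", "/collections/shirts/slim-fit"]

# Collect every href present in the unrendered source.
hrefs = set(re.findall(r'href="([^"]+)"', RAW_HTML))

for link in expected:
    status = "in base source" if link in hrefs else "MISSING (render-only?)"
    print(f"{link}: {status}")
```

Any "MISSING" result for an important link is exactly the kind of thing the paragraph above warns about: you'd be relying on Google's slower, less frequent rendered crawl to discover it.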
Google deploys rendered crawling for popular web pages, and even when it is used, it doesn't run with the same frequency as basic crawling. Not everyone gets that special treatment!
If you're not Santander or Coca-Cola, you should be thinking about how you can help Google, rather than assuming Google will "certainly use their latest technologies to help me, a small-to-medium business owner, at any expense!" It just won't happen (sorry!)
The Shopify community is commerce- and design-led. One thing they are really bad at is latching onto one-off, isolated comments from Google (such as "we can crawl JavaScript now!") and applying them to everything without testing first. The fact is, sites that do more server-side rendering still perform better than sites that rely too heavily on client-side rendering (especially as the latter drastically impacts page-load speed and burdens the end user).
If I were finding lots of critical content that didn't appear in the base (non-modified) source code, and my site wasn't a household name, I'd be really, really concerned!
I am sure that the right Shopify designers and developers could sort it out for you, but it may be costly, especially as devs in that community won't believe it's necessary and will fire loads of posts at you (quoting Google) stating that what they have already done is fine. Comments from the horse's mouth are useful, but not without greater context.