Stuck on SEO duplicate content issues in Shopify
-
Hey there,
We've been working on some of our webshops and recently started with analytics/Moz, but we've basically hit a brick wall with www.krawattenwelt.de: we now have 5k high-priority issues (duplicate content) and 20k medium-priority issues. I've tried a large number of solutions for the duplicate content, but none of them worked, so for now we've reverted everything back. I have the feeling I'm really running out of options. Does anyone have an idea how to solve this?
The duplicate content issues are as follows.
Example: http://krawattenwelt.de/collections/budget-9-15
has issues with: http://krawattenwelt.de/collections/budget-9-15/modell_normal
and with: http://krawattenwelt.de/collections/budget-9-15/modell_normal?page=1
-
Let us know how it goes, Rashad.
-
So right now I have changed our canonical code to something like this (the branches output the base collection URL for filtered/paginated pages, and the standard canonical otherwise):
{% if template contains 'collection' and current_tags or canonical_url contains 'page' %}
  <link rel="canonical" href="{{ shop.url }}{{ collection.url }}" />
{% else %}
  <link rel="canonical" href="{{ canonical_url }}" />
{% endif %}
Doing a recrawl at the moment, hopefully it works.
-
"Pointing to themselves won't produce a duplicate"
True, but it won't get rid of the duplicates either.
-
Your canonicals look to be in order. Pointing to themselves won't produce a duplicate, what this helps with is retaining strength from modified URLs like referrer IDs. For example, if example.com/product has a canonical pointing to itself and I link to example.com/product?refid=1234, the strength from my link is going to be correctly passed to the product page rather than a non-existent refid path.
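That refid example, sketched as plain HTML (example.com, /product, and the refid parameter are just the hypothetical illustration from above, not real URLs):

```html
<!-- Served identically at example.com/product AND example.com/product?refid=1234 -->
<head>
  <link rel="canonical" href="https://example.com/product" />
</head>
<!-- Any link pointing at the ?refid=1234 variant passes its strength to /product -->
```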
Moz has a pretty comprehensive post on the topic that might clear some things up if you're unsure.
While the canonicalisation will clear up things like ?page=1, I think a better approach here is going to be trying to remove the duplication rather than mask it.
Each of the above pages is an exact duplicate, which means they're not actually helping the user at all. Simplifying this structure and offering fewer filtering options could significantly decrease the number of pages you have to manage, clear up a lot of the duplicate content problems, and optimise your crawl budget as well.
As an example, rather than having a page for ties within a certain budget range, why not remove this option entirely? As a helpful alternative, show them all ties in a category and allow them to sort by price from low to high instead.
Hopefully this all makes sense!
-
Hi there.
OK, time to brush up on canonicals. A canonical link is a link to another(!) page, used when you want that page to "substitute" for the original page in terms of ranking and juice flow in the eyes of search engines.
So, if you set the canonical link to the page itself (and that's how it is on the pages you listed above), and those pages look exactly the same (and they do), of course they are going to be flagged as duplicates.
What you need to do is set this -> http://krawattenwelt.de/collections/budget-9-15 as the canonical for this -> http://krawattenwelt.de/collections/budget-9-15/modell_normal and use Google's parameter tool (https://www.google.com/webmasters/tools/crawl-url-parameters) to exclude this -> http://krawattenwelt.de/collections/budget-9-15/modell_normal?page=1 from indexing.
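In a Shopify theme, the first part of that advice could look roughly like this in the layout file — a sketch, assuming the modell_normal variants render through the collection template, so `collection.url` resolves to the base /collections/budget-9-15 path:

```liquid
{% comment %}
  Sketch: filtered collection variants (e.g. /modell_normal) canonicalise
  to the base collection URL instead of themselves.
{% endcomment %}
{% if template contains 'collection' and collection %}
  <link rel="canonical" href="{{ shop.url }}{{ collection.url }}" />
{% else %}
  <link rel="canonical" href="{{ canonical_url }}" />
{% endif %}
```

The ?page=N variants would still need the parameter tool (or pagination handling) on top of this.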
Cheers