URL Parameter for Limiting Results
-
We have a category page that lists products. By default the page displays 9 products, and a URL parameter lets the user view 15 or 30 products on the same page instead; the parameter is ?limit=9 or ?limit=15 and so on. Google's HTML Suggestions report is flagging these URLs for duplicate meta tags and meta descriptions. I have a couple of questions.
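To make the setup concrete, here's a small sketch of how the limit variants all reduce to a single canonical URL (using Python's urllib; the helper name and the idea of dropping the limit parameter are my own illustration, not an official recommendation):

```python
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

def canonical_url(url, drop_params=("limit",)):
    """Strip display-only query parameters (here just ?limit=) from a URL."""
    parts = urlsplit(url)
    query = [(k, v) for k, v in parse_qsl(parts.query) if k not in drop_params]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(query), parts.fragment))

base = "https://www.stickylife.com/custom/vinyl-decals"
variants = [base, base + "?limit=9", base + "?limit=15", base + "?limit=30"]

# All four variants collapse to the same URL:
print({canonical_url(u) for u in variants})
# prints {'https://www.stickylife.com/custom/vinyl-decals'}
```

If every limit variant carries a rel=canonical pointing at that one URL, the duplicate warnings should resolve regardless of which parameter setting Googlebot uses.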
1. What should be my goal? Is my goal to have Google crawl the page with 9 items or crawl the page with all items in the category?
In Search Console, the first step of setting up a URL parameter asks "Does this parameter change page content seen by the user?". I think the answer is Yes.
Then, when I select how the parameter affects page content, I assume I'd choose Narrows, since the parameter narrows or expands the number of items displayed on the page.
2. When setting up my URL parameters in Search Console, do I want to select Every URL or just let Googlebot decide? I'm torn, because the description of Every URL says the setting could result in Googlebot unnecessarily crawling duplicate content on your site (which it's already doing). Reading further, I begin to second-guess the Narrows option. Now I'm at a loss on what to do.
Any advice or suggestions will be helpful! Thanks.
-
Thanks for your help, David. I apologize for my delayed response.
-
Hi Dustin,
Looks like the problem is that you have two canonical tags on your parameter pages.
e.g. on lines 24 and 25 of the source code for this page https://www.stickylife.com/custom/vinyl-decals?limit=30 you'll see two separate canonical tags.
When a page carries more than one canonical tag, Google ignores all of them, which is why you are getting duplicate issues.
You'll need to remove the second canonical tag to resolve this.
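For reference, here's a rough way to scan a page for this problem using Python's standard html.parser (the sample markup below is illustrative, not the page's actual tags):

```python
from html.parser import HTMLParser

class CanonicalCollector(HTMLParser):
    """Record the href of every <link rel="canonical"> encountered."""
    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonicals.append(a.get("href"))

def find_canonicals(html):
    parser = CanonicalCollector()
    parser.feed(html)
    return parser.canonicals

# Illustrative markup with the conflict described above: two canonical tags.
page = """<html><head>
<link rel="canonical" href="https://www.stickylife.com/custom/vinyl-decals" />
<link rel="canonical" href="https://www.stickylife.com/custom/vinyl-decals?limit=30" />
</head><body></body></html>"""

tags = find_canonicals(page)
if len(tags) > 1:
    print("Conflicting canonical tags:", tags)
```

Anything more than one entry in that list means Google will likely discard the canonical hint entirely.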
Cheers,
David
Related Questions
-
Google Fetch and Render - Partial result (resources temporarily unavailable)
Over the past few weeks, my website pages have been showing as Partial in Google Search Console. Many resources (js, css, images) are 'temporarily unreachable'. The website files haven't had any structural changes for about two years (it has historically always shown as 'Completed' and rendered absolutely fine in Search Console). I have checked, and the robots.txt is fine, as is the sitemap. My host hasn't been very helpful, but has confirmed there are no server issues. My website rankings have now dropped, which I think is due to these resource issues, and I need to clear this up asap. Can anyone here offer any assistance? It would be hugely appreciated. Thanks, Dan
SERP Trends | dan_550
-
Search results vary in Chrome vs. other browsers, even in incognito mode: Google's stand?
Hi all, We use incognito mode or private browsing to check actual results that are not influenced by previous history, location (sometimes), etc. Even when we browse this way, we see different search results. Why would this happen? What is Google's stand on this? What is the right way to browse to get unbiased results for certain search queries? I have noticed that Chrome ranks our own websites a bit higher compared to other browsers, even in incognito mode. Thanks
SERP Trends | vtmoz1
-
Which is the best and quickest way to remove URL(s) from the Google and Bing search engines?
"Remove URL", "Set expiry in meta tag", "noindex, nofollow", or something else?
SERP Trends | ankit.rahevar0
-
External URLs - ones you can't reach out to
My fellow Mozzers, I have been reviewing our Google Webmaster error reports and noticed a high number of URL errors. These URLs are generated from international sites, mainly China. Upon further inspection, they look to be links to dynamic URLs (search pages) that are no longer active on our site. China is linking to old URLs that now simply return a 'Bad Request' page. The problems I face are that: I can't contact these Chinese sites to remove/edit the URLs. I could work with my developers to identify the URLs and redirect them all to the homepage, but is that good? The URLs are still present. Some of these look like pages that haven't been updated in a while, so now I have links from sites that are archived, or "dead". Have you tackled anything like this before? Thoughts are welcome. Thanks
SERP Trends | Bio-RadAbs0
-
Appearing in Universal Results drops us from Organic Results
Hi all, Has anyone noticed that achieving an appearance in the Universal Results (7 box) forced their previous organic ranking to drop out completely for that keyword? I thought Google would still show us in Universal AND Organic. Is this typical? Here's what happened: Last Week: Ranked no. 6 in standard organic results for specific keyword (but 7 box universal results appear ahead of us between position 3 and 4 and we're not listed) This Week: We added ourselves to Google Places a few weeks ago and this week we suddenly appear in the desirable 7 box Universal result, which is much higher and better ranking - great! But interestingly we notice at the same time our normal organic ranking at no.6 has dropped out completely (-50 in moztool). Is it an either/or for Organic vs. Universal or can you ever keep ranking in both?
SERP Trends | emerald0
-
Should URL Follow Navigation Of A Site?
I attended an SEOMoz webinar the other day where the presenter made a case for eliminating folders in URLs, as these could confuse a bot when crawling a site. I love the idea. However, there are still a lot of best-practice guidelines out there suggesting there should be a logic to the URL, just as there is to your navigation. My question is whether there is any value for a bot in crawling a URL that mirrors the navigation by "stuffing" the URL with folders identical to the navigation present on the site, and even a secondary navigation present on all pages. Example: the navigation of a site goes [Domain > Folder > Sub-folder > Sub-sub-folder > Product]. What is the benefit of a URL such as [www.domain.com/folder/sub-folder/sub-sub-folder/product] vs. [www.domain.com/product]? Thank you guys for your insights! PS: this is a WP site we are talking about.
SERP Trends | Discountvc0
-
What are realistic goals for local search results?
I have a brand-new domain, two months old, and am just finishing the design. What are realistic goals for the first year in local search results? My competition appears to be the local directories: Manta, yellowpages, angieslist, BBB, and so on. Is it realistic to expect to eventually outrank them for local search terms (City Service Keyword)? If so, what is a realistic time frame for someone putting in 10 to 15 hours a week on SEO, including content creation? This was the humbling report I just ran. Looks like my choices are to beat them or pay them excessive amounts of money. http://pro.seomoz.org/tools/keyword-difficulty/results.html?commit=Run+Report&date=2013-02-19&engine_id=96&keywords=Lakeville+snow+plowing&utf8=✓
SERP Trends | dwallner0