Big problem with my new crawl report
-
I am the owner of a small OpenCart online store. I installed http://www.opencart.com/index.php?route=extension/extension/info&extension_id=6182&filter_search=seo. Today my new crawl report is awful. Errors are up to 520 (from 30 before), warnings are up to 1,000 (from 120 before), and notices are up to 8,000 (from 1,000 before). I noticed that the problem is with search: there is a lot of duplicate content in the search results only. What should I do?
-
Thank you again Alan.
Typo fixed.
-
I use the Bing Search API.
By the way, you want to change from GET to POST, not the other way around.
-
Alan,
Thank you for the great advice. If one has enough control over the eCommerce system, or the internal site search product, to change from GET to POST so these pages act more like real dynamically generated "search pages" than an infinite amount of "landing pages" I think that is a fantastic solution. It would keep merchandisers and others from linking to those pages - because we all know that they will continue to do it even if the SEO pleads on hands and knees for them to stop.
However, I have found it to be the case that most eCommerce businesses (from small mom-n-pop shops to Fortune 500 companies) do not have the ability to do this, because the internal site search functionality they use is out of their hands. Site search vendors like Endeca and Celebros, serving enterprise eCommerce businesses, don't typically hand over the keys to the client.
If you know any site search vendors or solutions that allow one to do this it would make a great contribution to this thread if you could share a few of them. I'd definitely look into recommending them in the future!
Thanks again!
-
The problem with PR leaks is that they scale: if you are losing 10%, then when you get some quality links, 10% of their value will be wasted, and every effort you make in the future will be discounted by 10%.
There are ways to fix all of these problems. For example, I would make the search use POST instead of GET, so that links to search pages cannot be made and therefore search pages will not get indexed.
We work so hard to get good links; why waste them once we have them?
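To make the POST-only idea concrete, here is a minimal stdlib-only WSGI sketch (hypothetical handler and route, not OpenCart code): because the search endpoint rejects GET, there is no crawlable results URL for merchandisers to link to or for spiders to index.

```python
# Minimal WSGI sketch: search accepts only POST, so search results
# have no linkable GET URL for crawlers to discover and index.
def search_app(environ, start_response):
    if environ.get("PATH_INFO") == "/search":
        if environ.get("REQUEST_METHOD") != "POST":
            # Refuse GET requests to search outright: a link in an
            # email or footer can never land on a results page.
            start_response("405 Method Not Allowed", [("Allow", "POST")])
            return [b"Search is POST-only."]
        start_response("200 OK", [("Content-Type", "text/plain")])
        return [b"search results here"]
    start_response("404 Not Found", [("Content-Type", "text/plain")])
    return [b"not found"]
```

The trade-off, as discussed above, is that you lose the ability to use search result pages as ad-hoc landing pages at all, which is exactly the point.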
-
I have tried different methods to fix this. First-hand experience tells me that oftentimes it is better to just block the paths from being crawled via robots.txt (assuming there is better navigation on the site) than to use a noindex,follow tag to save the PageRank you're sending via internal links. It is very easy for Google to get bogged down crawling around in the internal search results area.
Unless there are lots of links to search pages from top pages on the site, or a big list of search page links on every page (a sitewide footer, for example), I really don't think the waste of internal PageRank is noticeable in the rankings, or worth salvaging if it risks sending spiders into a maze or a trap.
Yes, best practice is not to link to pages that you are blocking. In the real world, though, search pages can be very useful to visitors, and merchandisers who don't have the ability to create more targeted sub-sub-sub-categories will often use them, and link to them on the site, as landing pages for promotional purposes (emails, PPC, sales...).
Everyone has their own strategies, and all we can do is make recommendations based on our own experience and knowledge. Thanks for helping out with this question Alan. Feel free to elaborate so Anastas has more input to help guide his decision.
-
As long as no one is linking to the search pages, including via internal links.
-
Hello Anastas,
I agree that you should block the search folder from being indexed. I'm going to assume that nobody is linking to your search pages and that you have other paths (e.g. SEO-friendly navigation, sitemaps...) for search engines to use to access your products.
I don't understand why you have formatted the disallow statement that way, however. Unless I'm missing something (and I could be, since I don't know what your site is), you only need to do this:
Disallow: /product/search*
And of course, after doing this you should test it in GWT to make sure that A: you are blocking the pages you want to block, such as search pages with lots of parameters, and B: you are NOT blocking pages you don't want to block, such as product pages. Here is more info on where to find the testing tool in GWT if you don't know: http://productforums.google.com/forum/#!topic/webmasters/tbikAxJiIZ4
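If you want to sanity-check rules offline before opening GWT, here is a rough, hand-rolled approximation of Google-style robots.txt matching (my own sketch, not an official implementation: `*` matches any run of characters, `$` anchors the end, and a rule must match from the start of the path). It shows why the simple prefix rule catches the rewritten search URLs while the existing `?route=` rule does not:

```python
import re

def blocked(disallow_rule: str, url_path: str) -> bool:
    """Approximate Google-style robots.txt matching:
    '*' = any run of characters, '$' = end anchor,
    and the rule must match from the start of the path."""
    pattern = "".join(
        ".*" if ch == "*" else "$" if ch == "$" else re.escape(ch)
        for ch in disallow_rule
    )
    return re.match(pattern, url_path) is not None

# The suggested prefix rule catches the SEO-rewritten search URLs:
print(blocked("/product/search", "/product/search&filter_tag=abc"))  # True

# The original rule only matches URLs that still contain '?route=',
# which the rewritten URLs no longer do:
print(blocked("/*?route=product/search", "/product/search&filter_tag=abc"))  # False

# The proposed rule with a trailing slash would only block URLs whose
# tag value itself starts with '/':
print(blocked("/*?route=product/search&filter_tag=/",
              "/index.php?route=product/search&filter_tag=abc"))  # False
```

This is only a quick local check; the GWT tester remains the authority on what Googlebot will actually do with your robots.txt.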
Let us know how it goes. Good luck.
-
Please I need help
-
I am using OpenCart, and I don't know what to do. Before, I had 50 errors; now there are more than 500 after installing this plugin. The plugin removed the previous errors, but now there are many different ones. I have 2 options:
1. Remove the plugin
2. Do something about the new errors. The new errors are only because of search: I have duplicate page content, because when you type PRODUCT NAME in the search box, it shows the same content as www.mydomain.com/category1/PRODUCT NAME
Maybe this plugin removed the canonical URLs from the search pages; I don't know.
In robots.txt there is row:
Disallow: /*?route=product/search
The duplicate content is at mydomain.com/product/search&filter_tag=XXXXXX
In place of XXXXXX there are many different values.
I decided to add another row in robots.txt:
Disallow: /*?route=product/search&filter_tag=/
Do you think this is correct, or should I remove the plugin?
I hope you understand what the problem is.
-
When you noindex a page, any links pointing to it pour link juice away from your indexed pages. You should never noindex pages, IMO.
I assume you are using a CMS or some sort of plugin; this is a common cost when you do. CMSs create very untidy code, which is not good for SEO.
-
The URLs are: /product/search&filter_tag=%D0%B1%D0%B8%D0%B6%D1%83%D1%82%D0%B0
After the = there are a lot of combinations. Is it correct to put this in robots.txt:
Disallow: /*?route=product/search&filter_tag=/
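For what it's worth, those `%D0%B1...` sequences are just percent-encoded UTF-8 search terms, which you can confirm with a quick decode (a small Python sketch, assuming UTF-8 encoding). Since every tag value is encoded this way, a rule that blocks the path prefix covers all of the combinations at once:

```python
from urllib.parse import unquote

# The filter_tag value from the URL above: percent-encoded UTF-8 text.
tag = unquote("%D0%B1%D0%B8%D0%B6%D1%83%D1%82%D0%B0")
print(tag)  # бижута
```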
-
Should I disallow search (in robots.txt)?