Big problem with my new crawl report
-
I am the owner of a small OpenCart online store. I installed http://www.opencart.com/index.php?route=extension/extension/info&extension_id=6182&filter_search=seo. Today my new crawl report is awful. Errors are up to 520 (30 before), warnings are up to 1000 (120 before), and notices are up to 8000 (1000 before). I noticed that the problem is with search: there is a lot of duplicate content, in search only. What should I do?
-
Thank you again Alan.
Typo fixed.
-
I use the Bing search API.
By the way, you want to change from GET to POST, not the other way around.
-
Alan,
Thank you for the great advice. If one has enough control over the eCommerce system, or the internal site search product, to change from GET to POST so these pages act more like real dynamically generated "search pages" than an infinite number of "landing pages", I think that is a fantastic solution. It would keep merchandisers and others from linking to those pages, because we all know that they will continue to do it even if the SEO pleads on hands and knees for them to stop.
However, I have found it to be the case that most eCommerce businesses (from small mom-n-pop shops to Fortune 500 companies) do not have the ability to do this, because the internal site search functionality they use is out of their hands. Site search vendors like Endeca and Celebros serving enterprise eCommerce businesses don't typically hand over the keys to the client.
If you know any site search vendors or solutions that allow one to do this it would make a great contribution to this thread if you could share a few of them. I'd definitely look into recommending them in the future!
Thanks again!
-
The problem with PR leaks is that they scale. If you are losing 10%, then when you get some quality links, 10% of them will be wasted; every effort you make in the future will be discounted by 10%.
There are ways to fix all these problems. For example, I would make search use POST instead of GET, so that links to search pages cannot be made and therefore search pages will not get indexed.
We work so hard to get good links; why waste them once you get them?
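To illustrate the GET/POST point with a sketch (the example.com URL and parameter names are hypothetical, modeled on OpenCart-style query strings): with a GET search form, every query a visitor types produces its own distinct URL, which is exactly what makes these pages linkable and indexable.

```python
from urllib.parse import urlencode

# With a GET search form, every query string becomes a distinct,
# bookmarkable URL that people can link to and crawlers can discover.
def get_search_url(base, query):
    return base + "?" + urlencode({"route": "product/search", "filter_tag": query})

url = get_search_url("http://www.example.com/index.php", "бижута")
print(url)
# With POST, the same parameters travel in the request body instead,
# so no unique URL exists for anyone to link to or index.
```

Since there are as many such URLs as there are possible search terms, the number of "pages" is effectively unbounded.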
-
I have tried different methods to fix this. First-hand experience tells me that oftentimes it is better to just block the paths from being crawled using robots.txt (assuming there is better navigation on the site) than to use a noindex,follow tag in order to save the PageRank you're sending via internal links. It is very easy for Google to get bogged down crawling around in the internal search results area.
Unless there are lots of links to search pages from top pages on the site, or a big list of search-page links on every page (a sitewide footer, for example), I really don't think the waste of internal PageRank is noticeable in the rankings, or worth salvaging if it risks sending spiders into a maze or a trap.
Yes, best practice is not to link to pages that you are blocking. In the real world, though, search pages can be very useful to visitors, and merchandisers who don't have the ability to create more targeted sub-sub-sub-categories will often use them, and link to them on the site, as landing pages for promotional purposes (emails, PPC, sales...).
Everyone has their own strategies, and all we can do is make recommendations based on our own experience and knowledge. Thanks for helping out with this question, Alan. Feel free to elaborate so Anastas has more input to help guide his decision.
-
As long as no one is linking to the search pages, including internal links.
-
Hello Anastas,
I agree that you should block the search folder from being indexed. I'm going to assume that nobody is linking to your search pages and that you have other paths (e.g. SEO-friendly navigation, sitemaps...) for search engines to use to access your products.
I don't understand why you have formatted the disallow statement that way, however. Unless I'm missing something (and I could be, since I don't know what your site is), you only need to do this:
Disallow: /product/search*
And of course after doing this you should test it in GWT to make sure that A: You are blocking the pages you want to block, such as search pages with lots of parameters, and B: You are NOT blocking other pages you don't want to block, such as product pages. Here is more info on where to find the testing tool in GWT if you don't know: http://productforums.google.com/forum/#!topic/webmasters/tbikAxJiIZ4
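For what it's worth, robots.txt Disallow rules are prefix matches, so the trailing * is harmless but redundant. Before going to GWT, you can also do a quick local sanity check with Python's standard-library parser (a sketch; the paths below are hypothetical, and this parser implements the classic spec without wildcards, so still verify wildcard rules in GWT):

```python
from urllib.robotparser import RobotFileParser

# robots.txt rules are prefix matches, so "Disallow: /product/search"
# already covers every URL whose path starts with that string.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /product/search",
])

# Search URLs with parameters are blocked...
print(rp.can_fetch("*", "/product/search&filter_tag=foo"))  # False
# ...while product pages on other paths remain crawlable.
print(rp.can_fetch("*", "/product/some-product"))           # True
```

This covers exactly the two checks above: the search pages are blocked and other pages are not.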
Let us know how it goes. Good luck.
-
Please, I need help.
-
I am using OpenCart and I don't know what to do. Before, I had 50 errors; now there are more than 500 after installing this plug-in. The plug-in removed the previous errors, but now there are many different ones. I have 2 options:
1. Remove the plug-in
2. Do something about the new errors. The new errors are only because of search: I have duplicate page content, because when you type a PRODUCT NAME in the search box, you get the same content as www.mydomain.com/category1/PRODUCT NAME
Maybe this plug-in removed the canonical URLs in search, I don't know.
In robots.txt there is a line:
Disallow: /*?route=product/search
The duplicate content is at mydomain.com/product/search&filter_tag=XXXXXX
In place of XXXXXX there are many different values.
I decided to add another line to robots.txt:
Disallow: /*?route=product/search&filter_tag=/
Do you think this is correct, or should I remove the plug-in?
I hope you understand what the problem is.
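One thing worth checking before adding more rules: the existing rule requires a literal "?route=" in the URL, but the duplicate URLs quoted above are the rewritten /product/search&filter_tag=... form, which contains no "?route=" at all. Here is a rough sketch of how Google-style wildcard matching would treat them (a simplified matcher of my own for illustration, not Google's actual code):

```python
import re

def google_style_match(pattern, path):
    # Translate Google's robots.txt wildcard syntax into a regex:
    # '*' matches any run of characters, '$' anchors the end of the URL,
    # and everything else is a literal prefix match.
    regex = re.escape(pattern).replace(r"\*", ".*").replace(r"\$", "$")
    return re.match(regex, path) is not None

# The rule already in robots.txt:
rule = "/*?route=product/search"
# ...but the duplicate URLs reported by the crawler look like this:
dup = "/product/search&filter_tag=%D0%B1%D0%B8%D0%B6%D1%83%D1%82%D0%B0"

print(google_style_match(rule, dup))          # False: dup has no "?route="
print(google_style_match("/product/search", dup))  # True: plain prefix matches
```

If that reflects your real URLs, a rule matching the rewritten path may be what's needed; either way, verify the final rules against real URLs in GWT's robots.txt tester.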
-
When you noindex a page, any links pointing to those pages pour link juice away from your indexed pages. You should never noindex pages, IMO.
I assume you are using a CMS or some sort of plug-in; this is a common cost when you do. CMSs create very untidy code, which is not good for SEO.
-
The URLs are: /product/search&filter_tag=%D0%B1%D0%B8%D0%B6%D1%83%D1%82%D0%B0
After the = there are a lot of combinations. Is it correct to put this in robots.txt:
Disallow: /*?route=product/search&filter_tag=/
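As an aside, those encoded values are nothing exotic: each filter_tag is just a percent-encoded, UTF-8 search term that a visitor typed, which is why there are so many combinations. A quick check with Python's standard library:

```python
from urllib.parse import unquote

# Each filter_tag value decodes to an ordinary UTF-8 search term,
# one per query visitors have typed, so the set of possible URLs
# is effectively unbounded.
tag = "%D0%B1%D0%B8%D0%B6%D1%83%D1%82%D0%B0"
print(unquote(tag))  # бижута
```

Since you cannot enumerate every possible term, blocking by the path prefix rather than by specific values is the practical approach.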
-
Should I disallow search (in robots.txt)?