The www settings and URL parameters should not be an issue, and the robots file is OK (although I'm waiting on the developer to change my_account and view_cart to my-account and view-cart).
On dev changes: this is a new site, and we have been struggling with duplicate content generated by the ecommerce platform. We implemented a number of fixes for the duplication around the same time this all started in Google Webmaster Tools: rel=next/prev plus canonicals on the category pages to strip the session variables/referral text, and canonicals on the product pages to strip the same. Additionally, the developer had a noindex tag on the product pages that we had them remove at the same time. Finally, we changed the category pages from a list with a grid-view option to list view only, and nofollowed the secure account links like the shopping cart, login, etc.
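To make the canonical logic concrete, here's a rough Python sketch of what "strip the session variables/referral text" amounts to. The parameter names (sessionid, ref, etc.) are hypothetical; the real ecommerce platform's parameters may be named differently.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical tracking/session parameter names; adjust to whatever
# the ecommerce platform actually appends.
TRACKING_PARAMS = {"sessionid", "sid", "ref", "referral"}

def canonical_url(url: str) -> str:
    """Drop session/referral query parameters, keep everything else."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k.lower() not in TRACKING_PARAMS]
    # Rebuild the URL with only the surviving query parameters.
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), parts.fragment))

print(canonical_url("https://example.com/widgets?page=2&sessionid=abc123&ref=email"))
# Keeps page=2, drops sessionid and ref
```

The value this produces is what would go in the page's `<link rel="canonical">` tag, so pagination parameters survive while session noise is removed.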
I have also submitted a number of sitemap fixes to the developer, although to my knowledge it has not changed since day one. Changefreq is all messed up: it's assigned randomly, with no logic behind it, and 611 URLs have // between parameters instead of /. Could this be causing it? Follow my logic here: the sitemap has all these pages with a duplicate // in them, Google hits the page, and the canonical we implemented says "hey, that's not it, it's /", so Google then ignores those pages in the sitemap. Is this it, or am I barking up the wrong tree? Any other thoughts?
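For anyone who wants to check their own sitemap for the same problem, here's a minimal Python sketch that flags URLs with a doubled slash in the path (ignoring the legitimate `://` in the scheme) and shows what the normalized version would look like. The example URLs are made up for illustration.

```python
import re

def has_double_slash(url: str) -> bool:
    """True if the URL contains // anywhere past the scheme's ://."""
    rest = url.split("://", 1)[-1]
    return "//" in rest

def normalize(url: str) -> str:
    """Collapse runs of slashes in the path, leaving the scheme intact."""
    scheme, _, rest = url.partition("://")
    return scheme + "://" + re.sub(r"/{2,}", "/", rest)

# Hypothetical sitemap entries, for illustration only.
urls = [
    "https://example.com/category//widget-1",
    "https://example.com/category/widget-2",
]
bad = [u for u in urls if has_double_slash(u)]
for u in bad:
    print(u, "->", normalize(u))
```

If the sitemap URL and the canonical URL disagree only by that doubled slash, this kind of diff makes it easy to hand the developer the exact list of 611 entries to fix.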