Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies. More details here.
Can anyone recommend a tool that will identify unused and duplicate CSS across an entire site?
-
Hi all,
So far I have found this one: http://unused-css.com/ It looks like it identifies unused CSS, but perhaps not duplicates? It also has a 5,000-page limit and our site is 8,000+ pages, so we really need something that can handle a site larger than that.
I do have Screaming Frog. Is there a way to use Screaming Frog to locate unused and duplicate CSS?
Any recommendations and/or tips would be great. I am also aware of the Firefox extensions, but to my knowledge they only do one page at a time?
Thanks!
-
I read your post, Mstoic Hemant, and noticed your comment about Firefox 10. Since I couldn't get Dust-Me Spider to work in my current version of Firefox, I tried downloading and installing the older version 10 as you suggested. When I did, I got a message that Dust-Me Spider was not compatible with this version of Firefox, and it was disabled.
We are considering purchasing the paid version of Unused CSS (http://unused-css.com/pricing) - Do you have any experience using the upgraded version? Does it deliver what it promises?
Thanks!
-
Hi Hemant,
I tried using Dust-Me in Firefox, but for some reason it won't work on this sitemap: http://www.ccisolutions.com/rssfeeds/CCISolutions.xml
Could it be that this sitemap is too large? I even tried setting up a local folder to store the data, but every time I run the spider I get the message "The sitemap has no links."
I am using Firefox 27.0.1
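As a sanity check on the sitemap itself, a short Python sketch like this (my own illustration, not part of Dust-Me, standard library only) will list whatever <loc> entries a spider would actually see; if it prints an empty list, the problem is the feed, not the extension:

```python
import xml.etree.ElementTree as ET

def sitemap_urls(xml_text):
    """Extract <loc> URLs from sitemap XML, ignoring the namespace prefix."""
    root = ET.fromstring(xml_text)
    return [el.text.strip() for el in root.iter()
            if el.tag.endswith("loc") and el.text]

# Inline sample standing in for a fetched sitemap file:
sample = (
    '<?xml version="1.0"?>'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
    '<url><loc>http://www.example.com/page-1</loc></url>'
    '</urlset>'
)
print(sitemap_urls(sample))  # ['http://www.example.com/page-1']
```

Note the URL here (http://www.ccisolutions.com/rssfeeds/CCISolutions.xml) is an RSS-style feed name; if it is actually RSS rather than the sitemap protocol's urlset/loc format, a sitemap spider may well report "no links."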
-
Hi Dana, did either of these responses help? What did you end up settling on? We'd love an update! Thanks.
Christy
-
I have an article on that here. A Firefox extension called Dust-Me Selectors can help you identify unused CSS across multiple pages. It tracks every page you visit on a site and records the classes and IDs that are never used. You can also give it a sitemap, and it will work out which CSS is never used.
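The core idea is simple to sketch. Roughly, in Python (a naive illustration of the technique, not Dust-Me's actual code; real tools use a proper CSS parser rather than regexes):

```python
import re

def stylesheet_names(css_text):
    """Collect class and id names from a stylesheet (naive regex scan).

    Caveat: hex colours like #fff would also match the id pattern here;
    a real tool parses the CSS properly instead.
    """
    css_text = re.sub(r"/\*.*?\*/", "", css_text, flags=re.DOTALL)  # drop comments
    classes = set(re.findall(r"\.([A-Za-z_][\w-]*)", css_text))
    ids = set(re.findall(r"#([A-Za-z_][\w-]*)", css_text))
    return classes, ids

def unused_after_crawl(css_text, html_pages):
    """Return the class/id names never seen in any crawled page."""
    classes, ids = stylesheet_names(css_text)
    for html in html_pages:
        used = set()
        for attr in re.findall(r'class="([^"]*)"', html):
            used.update(attr.split())
        classes -= used
        ids -= set(re.findall(r'id="([^"]*)"', html))
    return classes, ids

css = ".hero { color: red } .old-banner { top: 0 } #nav { left: 0 }"
pages = ['<div class="hero wrap"><ul id="nav"></ul></div>']
print(unused_after_crawl(css, pages))  # ({'old-banner'}, set())
```

Whatever survives the crawl of every page (or every sitemap URL) is your candidate dead CSS.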
-
This sounds like it might just do the trick. You'll need Ruby installed for it to work. If you have a Mac, it's already on there. If you're on Windows, you'll need this. It's pretty easy; I installed Ruby on my Windows gaming rig. If you're running a Linux flavor, try this.
Just take your URLs from the site crawl and make a txt file. You can compare that with your CSS file. I've never tried it on a large site, let me know how it goes for you.
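For the duplicate-CSS half of the original question, the same compare-against-your-stylesheet idea extends to grouping rules that share identical declarations. A rough Python sketch (my own, naive regex parsing, assumes flat CSS with no nested @media blocks):

```python
import re
from collections import defaultdict

def duplicate_rule_groups(css_text):
    """Group selectors whose declaration blocks are identical."""
    css_text = re.sub(r"/\*.*?\*/", "", css_text, flags=re.DOTALL)  # drop comments
    groups = defaultdict(list)
    for selector, body in re.findall(r"([^{}]+)\{([^{}]*)\}", css_text):
        # Sort declarations so property order doesn't hide a duplicate.
        decls = tuple(sorted(d.strip() for d in body.split(";") if d.strip()))
        groups[decls].append(selector.strip())
    return [sels for sels in groups.values() if len(sels) > 1]

css = ".btn { color: red; padding: 4px } .cta { padding: 4px; color: red } .nav { color: blue }"
print(duplicate_rule_groups(css))  # [['.btn', '.cta']]
```

Each group it returns is a set of selectors you could merge into one rule.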