Will SASS ruin my SEO?
-
Hello, I am thinking about using SASS for my website: stripping out the current CSS stylesheets and translating everything to SASS. Will this hurt my SEO?
-
I agree with this. One thing you have to watch out for when using SASS is alienating older browsers, particularly IE. With the control SASS gives you come problems too: lately I have seen a lot of cases where people use SASS and end up generating more selectors than IE can handle (IE9 and earlier silently ignore everything past 4,095 selectors per stylesheet). Depending on the size of the site and how the CSS is generated, this can be a real issue.
Also, I would precompile rather than compile at run time. Compiling on each request adds processing overhead.
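As a sketch of what "precompile" means in practice (the file paths here are invented for the example), a one-off build step with the Sass command-line compiler looks something like this:

```shell
# Compile once at build/deploy time so the server only ever serves static CSS.
# src/main.scss and public/css/main.css are hypothetical paths.
sass src/main.scss public/css/main.css --style=compressed
```

Run as part of your deploy script, this means no Sass processing ever happens while a visitor is waiting on a page.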
-
No, that's not true.
SASS and SCSS are processed server-side (or at build time), and the client receives a normal CSS stylesheet.
SASS/SCSS can help you manage your CSS files and classes, but you need to pre-process them to generate the final CSS that is sent to the client. If the final product doesn't change, your SEO won't change either.
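To illustrate (the variable and selectors are made up for this example), here is a small SCSS source and the plain CSS it compiles down to. The browser, and therefore Google, only ever sees the ordinary CSS at the bottom:

```scss
// main.scss — hypothetical example source
$brand: #0077cc;

nav {
  ul { list-style: none; }
  a  { color: $brand; }
}

// After preprocessing, the client receives ordinary CSS:
// nav ul { list-style: none; }
// nav a  { color: #0077cc; }
```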
Related Questions
-
Will Google Judge Duplicate Content on Responsive Pages to be Keyword Spamming?
I have a website for my small business, and hope to improve the search results position for 5 landing pages. I recently modified my website to make it responsive (mobile friendly). I was not able to use Bootstrap; the layout of the pages is a bit unusual and doesn't lend itself to the options Bootstrap provides. Each landing page has 3 main divs: one for desktop, one for tablet, one for phone.
The text content displayed in each div is the same. Only one of the 3 divs is visible; the user's screen width determines which div is visible. When I wrote the HTML for the page, I didn't want each div to have identical text. I worried that when Google indexed the page it would see the same text 3 times, and would conclude that keyword spamming was occurring. So I put the text in just one div, and when the page loads jQuery copies the text from the first div to the other two divs.
But now I've learned that when Google indexes a page it looks at both the page that is served AND the page that is rendered. And in my case the page that is rendered, after it loads and the jQuery code is executed, contains duplicate text content in three divs. So perhaps my approach of having the served page contain just one div with text content fails to help, because Google examines the rendered page, which has duplicate text content in three divs.
Here is the layout of one landing page, as served by the server: the div id="desktop" holds 1000 words of text; the tablet div has no text (jQuery will copy the text from div id="desktop" into it); and the phone div has no text (jQuery will copy the text from div id="desktop" into it).
My question is: Will Google conclude that keyword spamming is occurring because of the duplicate content the rendered page contains, or will it realize that only one of the divs is visible at a time, and the duplicate content is there only to achieve a responsive design? Thank you!
Web Design | CurtisB
-
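As a hedged aside on the responsive question above: an alternative that sidesteps duplicate text entirely is to keep one div and restyle it per breakpoint with media queries, rather than maintaining three copies. The class name and breakpoints below are invented for illustration:

```css
/* One copy of the text; only the presentation changes per screen width. */
.landing-copy { font-size: 1.1rem; padding: 2rem; }

@media (max-width: 1024px) {   /* tablet */
  .landing-copy { padding: 1.25rem; }
}

@media (max-width: 600px) {    /* phone */
  .landing-copy { font-size: 1rem; padding: 0.75rem; }
}
```

With this approach both the served and the rendered page contain the text exactly once, so the keyword-spamming question never arises.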
Parallax, SEO, and Duplicate Content
We are working on a project that uses parallax to provide a great experience to the end user, and we are also trying to create a best-case scenario for SEO. We have multiple keywords we are trying to optimize, and multiple pages with the parallax function built in. Basically, each member of the primary navigation is its own page, with all subpages built below it using the parallax function. Our navigation currently uses the hashbang method to provide custom URLs for each subpage, and the user is appropriately directed to the right section based on that hashbang.
www.example.com/About < This is its own page
www.example.com/about/#/history < This is a subpage that you scroll to on the About page
We are trying to decide what the best method will be for optimizing each subpage, but my current concern is that because each subpage is really a part of the primary page, will all those URLs be seen as duplicate content? Currently the site can also serve each subpage as its own page, without the parallax function. Should I include those as part of the sitemap? There's no way to navigate to them unless I include them in the sitemap, but I don't want Google to think I'm being disingenuous in providing links that don't exist solely for the purpose of SEO; truthfully, all of the content exists and is available to the user. I know that a lot of people are asking these questions, and there really are no right answers yet, but I'm curious about everyone else's experience so far.
Web Design | PaulRonin
-
On Page Local SEO
What do you believe is the best approach when it comes to Local SEO for businesses in 2013?
Web Design | BlueRockDigital
-
Will keyword optimization for a landing page impact SEO for subsequent pages?
For example, if I optimize the keyword “pleurx” really well on our landing page, I'd like to know if subsequent pages linking back to that landing page will rank higher than before for “pleurx”, even if “pleurx” wasn't optimized on the subsequent pages. Thanks! -Andrew
Web Design | Todd_Kendrick
-
SEO for design webinar?
I've got no problem using Google web fonts on their own, but what about using them over an image, specifically a clickable image? It's easiest to place text over an image if the image is a background image, but then the image isn't easy to make clickable. Am I missing something? This shouldn't be hard, right? Thanks!
Web Design | saultie
-
Ecommerce SEO - product sort order
Hi, I've been trying to find the answer to this in Google but having no luck. In the current era, is it damaging to have products ordered randomly in an ecommerce website? Also, how long would you suggest is a good length of time to establish your natural rank?
I've launched and still work on several successful ecommerce sites, but have recently launched a completely new venture: brand new URL, brand new site, and it has been live for around 5 weeks now. Although it is being found in search, it isn't doing as well as I'd like. Using the Moz Pro tools I've picked up some issues, and in the last few days I have tweaked page titles, added 'nofollow' to all my filters, added content, etc., so I feel as though I've reset the clock. The site (it's an adult site, by the way) is www.lovesauce.co.uk. I would appreciate some feedback from the pros.
Web Design | tom.dollar
-
Will I get penalised for display:none ?
I have initial content that is displayed for 10 seconds and then collapsed and replaced by a div that was hidden (display:none). Will the hidden text be used by Google, or will they consider it page stuffing? If so, are there any recommendations on how to handle this? The goal was to maximize screen real estate for the human visitor without adding clutter.
Web Design | oznappies
-
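As a sketch of one way to implement the timed collapse described in the question above using CSS alone (the `.intro` class, heights, and timings are all hypothetical):

```css
/* Collapse the intro block roughly 10 seconds after the page loads.   */
/* A max-height transition is used because height: auto can't animate. */
@keyframes collapse-intro {
  to { max-height: 0; opacity: 0; padding: 0; }
}

.intro {
  max-height: 40rem;   /* roomy enough for the intro content */
  overflow: hidden;
  animation: collapse-intro 0.5s ease 10s forwards;
}
```

Keeping the text in the markup from the start, and only collapsing it visually, at least means the content is present in both the served and rendered page.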
Local SEO - Title Tag?
For www.bluedotlandscaping.com/fencing.htm, where we mostly only care about Greenville and Spartanburg counties in SC: is <title>Patio designs - Water features - Brick patio by Blue Dot Landscaping</title> good, or do you prefer <title>Patio designs - Water features - Brick patio - Greenville, Spartanburg, Simpsonville</title>? Thanks for your help, Rich
Web Design | SCyardman