Gulp Shenanigans
Ogma3, the software on which Genfic runs, uses Gulp to process all the SASS styles, all the JavaScript and TypeScript, everything. While I did experiment with various bundlers — and I mean various, everything from Webpack through Parcel to Snowpack — it was always Gulp that fit the workflow best.
It’s not perfect, however: it has its issues, and the documentation is fairly outdated at times. Because of that, many people see that it takes 5 seconds to process their SCSS styles and bounce right out. In this post, I hope to address at least two of those issues.
SASS compilation times get higher and higher
That was one of the issues I encountered. The first compilation would take 2 seconds, the next one 2.5, and a few compilations down the line it would reach ridiculous times like 15 seconds. The only solution was to restart the watch task.
Or so I thought.
It turns out the culprit is the /**/ glob pattern, which the SASS plugin just doesn’t like. You might be tempted to watch something like

```javascript
const watchCss = './public/css/src/**/*.sass';
```

so that a change to any of your files triggers the compilation task: both the style.sass located directly in src/ and everything under src/elements/, src/pages/, src/mixins/ and so on.
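For context, the watch setup in question looks something along these lines. This is a sketch, not the actual gulpfile: the css and watch task names, the gulp-sass plugin (in its v5 form, where you pass it a compiler) and the dist output path are all assumptions on my part.

```javascript
const gulp = require('gulp');
const sass = require('gulp-sass')(require('sass')); // assumed plugin and setup

// Compile the entry stylesheet; every path except the src one is illustrative
const css = () =>
    gulp.src('./public/css/src/style.sass')
        .pipe(sass())
        .pipe(gulp.dest('./public/css/dist'));

// Recompile whenever anything matching the glob changes
const watch = () => gulp.watch(watchCss, css);
```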
However, the way to avoid growing compile times is to skip the recursive glob: define the root once

```javascript
const cssRoot = './public/css/src';
```

and list each directory explicitly.
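A minimal sketch of what that glob array might look like, reusing the directory names mentioned above; the exact entries depend on which directories your project actually has:

```javascript
const watchCss = [
    `${cssRoot}/*.sass`,
    `${cssRoot}/elements/*.sass`,
    `${cssRoot}/pages/*.sass`,
    `${cssRoot}/mixins/*.sass`,
];
```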
It is a bit more involved, and should you add another directory it will have to be added to this glob array as well, but it prevents growing compilation times. And, I’d say, that’s more important.
Why is it happening? I have no idea. I did try to open an issue on the repos of SASS, Gulp, and gulp-sass, but all I achieved was being bounced between the three with a short “uh, not our fault, must be one of the other packages”. If anybody has any clue why this performance issue happens, do reach out and let me know.
Use stream.pipeline() instead of stream.pipe()
The official Gulp documentation tells you to create tasks this way:

```javascript
gulp.task('something', () => {
    return gulp.src('./somethings/src/*.sth')
        .pipe(somePlugin()) // somePlugin() stands in for whatever transform you actually use
        .pipe(gulp.dest('./somethings/dist'));
});
```
The problem is twofold:
- To properly handle errors and make sure the stream is disposed of, you should really chain .on('error', e => ...) after each .pipe(...), as sketched below
- It’s based on magic strings, so it provides no autocompletion
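This is roughly what that per-pipe error-handling boilerplate ends up looking like; the handler itself is just an illustration:

```javascript
gulp.task('something', () => {
    return gulp.src('./somethings/src/*.sth')
        .pipe(somePlugin())
        .on('error', (e) => console.error(e)) // has to be repeated after every .pipe()
        .pipe(gulp.dest('./somethings/dist'))
        .on('error', (e) => console.error(e));
});
```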
Both of those can be solved by simply using stream.pipeline(). Just require the pipeline function with

```javascript
const { pipeline } = require('stream');
```
and change your tasks to

```javascript
const something = () => pipeline(
    gulp.src('./somethings/src/*.sth'),
    somePlugin(), // the same placeholder transform as above
    gulp.dest('./somethings/dist'),
    errorHandler // pipeline() expects an error callback as its last argument; defined below
);
```
An important note: pipeline() takes an error handler as the last parameter, so I recommend whipping up something simple like

```javascript
const errorHandler = (e) => {
    // pipeline() calls this with the error, or with nothing when the stream finished cleanly
    if (e) console.error(e);
};
```

that can later be expanded as needed.
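As an example of such an expansion (this part is my addition, not something from the original setup), you could make a failing task also fail the whole gulp run, which matters when the gulpfile is executed in CI:

```javascript
const errorHandler = (e) => {
    if (e) {
        console.error(`[gulp] ${e.message}`);
        process.exitCode = 1; // make the gulp process report failure
    }
};
```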
Defining your tasks this way makes composing them much easier, stricter, and less prone to error. You can still call them as usual, with gulp something, and composing multiple tasks changes from
```javascript
gulp.task('foo', gulp.parallel('something', 'something-else'));
```
into
```javascript
// gulp.parallel() already returns a composed task function, so no wrapper arrow is needed
const foo = gulp.parallel(something, somethingElse);
```
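One detail worth spelling out, and an assumption on my part about how the gulpfile is wired rather than something stated above: for the gulp CLI to find these functions by name, they have to be exported from the gulpfile.

```javascript
// gulpfile.js
exports.something = something;
exports.foo = foo;
// or equivalently: module.exports = { something, foo };
```

With that in place, gulp something and gulp foo behave just like the string-registered tasks did.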
“But what if I have to use a Node version lower than 10?” I can hear somebody asking. True, stream.pipeline() was added in Node 10, but the unfortunate users of earlier versions can rely on the Pump package, which provides pretty much the same functionality. I haven’t tested it myself, though, so I can’t speak for exactly how reliable and similar it is.
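For reference, this is roughly what the pump-based version would look like, going by pump’s documented usage rather than anything I have verified myself; the placeholder plugin and paths are the same as above:

```javascript
const pump = require('pump');

// pump(stream1, stream2, ..., callback) mirrors pipeline()'s shape
const something = (done) =>
    pump(
        gulp.src('./somethings/src/*.sth'),
        somePlugin(),
        gulp.dest('./somethings/dist'),
        done // called with an error, or with nothing on success
    );
```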
Note: This blog post originally talked about the performance benefits this approach brings, but as it turns out, those were down to a mistake on my part: the tasks would report as completed well before they actually finished, rather than actually finishing early. Because of that, I rewrote the post to talk strictly about the benefits pipeline() brings to error handling and composing tasks, not performance.