2024-12-01, Version 23.4.0 (Current), @aduh95 #17

Closed
wants to merge 53 commits
Changes from 1 commit
882b70c
tools: bump cross-spawn from 7.0.3 to 7.0.5 in /tools/eslint
dependabot[bot] Nov 18, 2024
72eb710
tools: fix riscv64 build failed
luyahan Nov 18, 2024
f9d25ed
doc: add history entry for import assertion removal
aduh95 Nov 18, 2024
83e02dc
build: compile bundled ada conditionally
jirutka Nov 16, 2024
d8eb83c
build: compile bundled simdjson conditionally
jirutka Nov 16, 2024
fffabca
build: compile bundled simdutf conditionally
jirutka Nov 16, 2024
9c61038
deps: update simdutf to 5.6.2
nodejs-github-bot Nov 19, 2024
efb9f05
doc,lib,src,test: unflag sqlite module
cjihrig Nov 19, 2024
8a5d8c7
test_runner: mark context.plan() as stable
cjihrig Nov 19, 2024
8197815
test_runner: mark snapshot testing as stable
cjihrig Nov 19, 2024
219f5f2
doc: include git node release --promote to steps
RafaelGSS Nov 19, 2024
f711a48
doc: fix relative path mention in --allow-fs
RafaelGSS Nov 19, 2024
d093820
tools: lint js in `doc/**/*.md`
LiviaMedeiros Nov 20, 2024
497a9ae
src: fix kill signal on Windows
huseyinacacak-janea Nov 20, 2024
493e16c
test: fix determining lower priority
LiviaMedeiros Nov 20, 2024
5a2a757
doc: add esm examples to node:timers
mfdebian Nov 20, 2024
f48e289
build: fix GN build for sqlite
zcbenz Nov 21, 2024
7768b3d
deps: update simdjson to 3.10.1
nodejs-github-bot Nov 21, 2024
9d07880
doc: remove RedYetiDev from triagers team
Nov 21, 2024
d777d4a
sqlite: add `StatementSync.prototype.iterate` method
tpoisseau Nov 21, 2024
29362ce
test: make x509 crypto tests work with BoringSSL
codebytere Nov 22, 2024
c8bb8a6
doc: fix Node.js 23 column in CHANGELOG.md
richardlau Nov 22, 2024
475141e
tools: add linter for release commit proposals
aduh95 Nov 22, 2024
ccc5a6d
doc: document approach for building wasm in deps
mhdawson Nov 22, 2024
26ec996
build: use variable for crypto dep path
codebytere Nov 23, 2024
4be5047
module: do not warn when require(esm) comes from node_modules
joyeecheung Nov 23, 2024
ff48c29
doc: add esm example for zlib
peixotoleonardo Nov 23, 2024
cf3f7ac
deps: update zlib to 1.3.0.1-motley-7e2e4d7
nodejs-github-bot Aug 18, 2024
93d36bf
crypto: allow non-multiple of 8 in SubtleCrypto.deriveBits
panva Oct 6, 2024
1e0decb
doc: doc how to add message for promotion
mhdawson Nov 13, 2024
2023b09
build: add create release proposal action
RafaelGSS Nov 23, 2024
98f8f4a
doc: order `node:crypto` APIs alphabetically
badkeyy Nov 23, 2024
a4f57f0
assert: add partialDeepStrictEqual
puskin94 Nov 23, 2024
c157e02
test: convert readdir test to use test runner
tchetwin Nov 23, 2024
288416a
deps: upgrade npm to 10.9.1
npm-cli-bot Nov 24, 2024
f7567d4
test: make HTTP/1.0 connection test more robust
FliegendeWurst Nov 24, 2024
c048865
test_runner: simplify hook running logic
cjihrig Nov 25, 2024
30f26ba
lib: avoid excluding symlinks in recursive fs.readdir with filetypes
juanarbol Nov 25, 2024
95e8c4e
test_runner: refactor build Promise in Suite()
cjihrig Nov 22, 2024
7c3a4d4
test_runner: refactor Promise chain in run()
cjihrig Nov 22, 2024
32b1681
tools: use tokenless Codecov uploads
targos Nov 25, 2024
7705724
doc: add vetted courses to the ambassador benefits
mcollina Nov 25, 2024
a3f7db6
doc: add doc for PerformanceObserver.takeRecords()
skyclouds2001 Nov 25, 2024
5b0ce37
assert: optimize partial comparison of two `Set`s
aduh95 Nov 25, 2024
baed276
doc: deprecate passing invalid types in `fs.existsSync`
Ceres6 Nov 25, 2024
1fb30d6
quic: multiple updates to quic impl
jasnell Nov 23, 2024
d180a8a
deps: update simdutf to 5.6.3
nodejs-github-bot Nov 26, 2024
96e846d
deps: update ngtcp2 to 1.9.0
nodejs-github-bot Nov 26, 2024
f99f95f
deps: update corepack to 0.30.0
nodejs-github-bot Nov 26, 2024
9289374
http2: fix memory leak caused by premature listener removing
ywave620 Nov 26, 2024
ce53f16
build: set node_arch to target_cpu in GN
codebytere Nov 26, 2024
7133c04
build: avoid compiling with VS v17.12
StefanStojanovic Nov 26, 2024
971f5f5
src: safely remove the last line from dotenv
islandryu Nov 26, 2024
doc: add esm example for zlib
PR-URL: nodejs#55946
Reviewed-By: Antoine du Hamel <duhamelantoine1995@gmail.com>
Reviewed-By: Luigi Pinca <luigipinca@gmail.com>
peixotoleonardo authored and aduh95 committed Nov 26, 2024
commit ff48c29724c855056ba9dcd9a0094c5f2baafd23
243 changes: 230 additions & 13 deletions doc/api/zlib.md
@@ -11,7 +11,11 @@ Gzip, Deflate/Inflate, and Brotli.

To access it:

```mjs
import zlib from 'node:zlib';
```

```cjs
const zlib = require('node:zlib');
```

@@ -21,13 +25,35 @@ Compressing or decompressing a stream (such as a file) can be accomplished by
piping the source stream through a `zlib` `Transform` stream into a destination
stream:

```mjs
import {
createReadStream,
createWriteStream,
} from 'node:fs';
import process from 'node:process';
import { createGzip } from 'node:zlib';
import { pipeline } from 'node:stream';

const gzip = createGzip();
const source = createReadStream('input.txt');
const destination = createWriteStream('input.txt.gz');

pipeline(source, gzip, destination, (err) => {
if (err) {
console.error('An error occurred:', err);
process.exitCode = 1;
}
});
```

```cjs
const {
createReadStream,
createWriteStream,
} = require('node:fs');
const process = require('node:process');
const { createGzip } = require('node:zlib');
const { pipeline } = require('node:stream');

const gzip = createGzip();
const source = createReadStream('input.txt');
const destination = createWriteStream('input.txt.gz');

pipeline(source, gzip, destination, (err) => {
if (err) {
console.error('An error occurred:', err);
process.exitCode = 1;
}
});
```

Or, using the promise `pipeline` API:

```mjs
import {
createReadStream,
createWriteStream,
} from 'node:fs';
import process from 'node:process';
import { createGzip } from 'node:zlib';
import { pipeline } from 'node:stream/promises';

async function do_gzip(input, output) {
const gzip = createGzip();
const source = createReadStream(input);
const destination = createWriteStream(output);
await pipeline(source, gzip, destination);
}

await do_gzip('input.txt', 'input.txt.gz');
```

```cjs
const {
createReadStream,
createWriteStream,
} = require('node:fs');
const process = require('node:process');
const { createGzip } = require('node:zlib');
const { pipeline } = require('node:stream/promises');

async function do_gzip(input, output) {
const gzip = createGzip();
const source = createReadStream(input);
const destination = createWriteStream(output);
await pipeline(source, gzip, destination);
}

do_gzip('input.txt', 'input.txt.gz')
.catch((err) => {
console.error('An error occurred:', err);
process.exitCode = 1;
});
```

It is also possible to compress or decompress data in a single step:

```mjs
import process from 'node:process';
import { Buffer } from 'node:buffer';
import { deflate, unzip } from 'node:zlib';

const input = '.................................';
deflate(input, (err, buffer) => {
if (err) {
console.error('An error occurred:', err);
process.exitCode = 1;
}
console.log(buffer.toString('base64'));
});

const buffer = Buffer.from('eJzT0yMAAGTvBe8=', 'base64');
unzip(buffer, (err, buffer) => {
if (err) {
console.error('An error occurred:', err);
process.exitCode = 1;
}
console.log(buffer.toString());
});

// Or, Promisified

import { promisify } from 'node:util';
const do_unzip = promisify(unzip);

const unzippedBuffer = await do_unzip(buffer);
console.log(unzippedBuffer.toString());
```

```cjs
const { deflate, unzip } = require('node:zlib');

const input = '.................................';
deflate(input, (err, buffer) => {
if (err) {
console.error('An error occurred:', err);
process.exitCode = 1;
}
console.log(buffer.toString('base64'));
});

const buffer = Buffer.from('eJzT0yMAAGTvBe8=', 'base64');
unzip(buffer, (err, buffer) => {
if (err) {
console.error('An error occurred:', err);
process.exitCode = 1;
}
console.log(buffer.toString());
});
```

@@ -104,7 +188,19 @@ limitations in some applications.
Creating and using a large number of zlib objects simultaneously can cause
significant memory fragmentation.

```mjs
import zlib from 'node:zlib';
import { Buffer } from 'node:buffer';

const payload = Buffer.from('This is some data');

// WARNING: DO NOT DO THIS!
for (let i = 0; i < 30000; ++i) {
zlib.deflate(payload, (err, buffer) => {});
}
```

```cjs
const zlib = require('node:zlib');

const payload = Buffer.from('This is some data');

// WARNING: DO NOT DO THIS!
for (let i = 0; i < 30000; ++i) {
zlib.deflate(payload, (err, buffer) => {});
}
```

@@ -138,7 +234,47 @@ Using `zlib` encoding can be expensive, and the results ought to be cached.
See [Memory usage tuning][] for more information on the speed/memory/compression
tradeoffs involved in `zlib` usage.

```mjs
// Client request example
import fs from 'node:fs';
import zlib from 'node:zlib';
import http from 'node:http';
import process from 'node:process';
import { pipeline } from 'node:stream';

const request = http.get({ host: 'example.com',
path: '/',
port: 80,
headers: { 'Accept-Encoding': 'br,gzip,deflate' } });
request.on('response', (response) => {
const output = fs.createWriteStream('example.com_index.html');

const onError = (err) => {
if (err) {
console.error('An error occurred:', err);
process.exitCode = 1;
}
};

switch (response.headers['content-encoding']) {
case 'br':
pipeline(response, zlib.createBrotliDecompress(), output, onError);
break;
// Or, just use zlib.createUnzip() to handle both of the following cases:
case 'gzip':
pipeline(response, zlib.createGunzip(), output, onError);
break;
case 'deflate':
pipeline(response, zlib.createInflate(), output, onError);
break;
default:
pipeline(response, output, onError);
break;
}
});
```

```cjs
// Client request example
const zlib = require('node:zlib');
const http = require('node:http');
const fs = require('node:fs');
const process = require('node:process');
const { pipeline } = require('node:stream');

const request = http.get({ host: 'example.com',
path: '/',
port: 80,
headers: { 'Accept-Encoding': 'br,gzip,deflate' } });
request.on('response', (response) => {
const output = fs.createWriteStream('example.com_index.html');

const onError = (err) => {
if (err) {
console.error('An error occurred:', err);
process.exitCode = 1;
}
};

switch (response.headers['content-encoding']) {
case 'br':
pipeline(response, zlib.createBrotliDecompress(), output, onError);
break;
// Or, just use zlib.createUnzip() to handle both of the following cases:
case 'gzip':
pipeline(response, zlib.createGunzip(), output, onError);
break;
case 'deflate':
pipeline(response, zlib.createInflate(), output, onError);
break;
default:
pipeline(response, output, onError);
break;
}
});
```

```mjs
// server example
// Running a gzip operation on every request is quite expensive.
// It would be much more efficient to cache the compressed buffer.
import zlib from 'node:zlib';
import http from 'node:http';
import fs from 'node:fs';
import { pipeline } from 'node:stream';

http.createServer((request, response) => {
const raw = fs.createReadStream('index.html');
// Store both a compressed and an uncompressed version of the resource.
response.setHeader('Vary', 'Accept-Encoding');
const acceptEncoding = request.headers['accept-encoding'] || '';

const onError = (err) => {
if (err) {
// If an error occurs, there's not much we can do because
// the server has already sent the 200 response code and
// some amount of data has already been sent to the client.
// The best we can do is terminate the response immediately
// and log the error.
response.end();
console.error('An error occurred:', err);
}
};

// Note: This is not a conformant accept-encoding parser.
// See https://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.3
if (/\bdeflate\b/.test(acceptEncoding)) {
response.writeHead(200, { 'Content-Encoding': 'deflate' });
pipeline(raw, zlib.createDeflate(), response, onError);
} else if (/\bgzip\b/.test(acceptEncoding)) {
response.writeHead(200, { 'Content-Encoding': 'gzip' });
pipeline(raw, zlib.createGzip(), response, onError);
} else if (/\bbr\b/.test(acceptEncoding)) {
response.writeHead(200, { 'Content-Encoding': 'br' });
pipeline(raw, zlib.createBrotliCompress(), response, onError);
} else {
response.writeHead(200, {});
pipeline(raw, response, onError);
}
}).listen(1337);
```

```cjs
// server example
// Running a gzip operation on every request is quite expensive.
// It would be much more efficient to cache the compressed buffer.
@@ -315,7 +496,43 @@ quality, but can be useful when data needs to be available as soon as possible.
In the following example, `flush()` is used to write a compressed partial
HTTP response to the client:

```mjs
import zlib from 'node:zlib';
import http from 'node:http';
import { pipeline } from 'node:stream';

http.createServer((request, response) => {
// For the sake of simplicity, the Accept-Encoding checks are omitted.
response.writeHead(200, { 'content-encoding': 'gzip' });
const output = zlib.createGzip();
let i;

pipeline(output, response, (err) => {
if (err) {
// If an error occurs, there's not much we can do because
// the server has already sent the 200 response code and
// some amount of data has already been sent to the client.
// The best we can do is terminate the response immediately
// and log the error.
clearInterval(i);
response.end();
console.error('An error occurred:', err);
}
});

i = setInterval(() => {
output.write(`The current time is ${Date()}\n`, () => {
// The data has been passed to zlib, but the compression algorithm may
// have decided to buffer the data for more efficient compression.
// Calling .flush() will make the data available as soon as the client
// is ready to receive it.
output.flush();
});
}, 1000);
}).listen(1337);
```

```cjs
const zlib = require('node:zlib');
const http = require('node:http');
const { pipeline } = require('node:stream');