I'm trying to download a potentially large file by streaming it. The file has to end up accumulated in memory because the API I'm uploading to requires a base64 payload, but I still want to stream it, since this runs inside a web server and I don't want to block the event loop. Any ideas why this code isn't working? The chunks don't seem to be concatenated correctly: when I debug, I only get the first piece of the response body.
import { Transform } from "stream";
import axios from "axios";
import fs from "fs/promises";

class Base64EncodeStream extends Transform {
  private completePromise: Promise<string>;

  constructor() {
    super({
      transform: (chunk, encoding, callback) => {
        this.push(chunk.toString("base64"));
        callback();
      },
    });
    this.completePromise = new Promise((resolve, reject) => {
      let data = "";
      this.on("data", (chunk) => {
        data += chunk;
      });
      this.on("error", (err) => {
        reject(err);
      });
      this.on("end", () => {
        resolve(data);
      });
    });
  }

  promise() {
    return this.completePromise;
  }
}

async function main() {
  const res = await axios({
    url: "https://www.google.com",
    method: "get",
    responseType: "stream",
  });
  const base64EncodeStream = new Base64EncodeStream();
  res.data.pipe(base64EncodeStream);
  const base64Encoded = await base64EncodeStream.promise();
  await fs.writeFile("out.txt", base64Encoded, "utf8");
}

main();
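A likely cause of the symptom: the transform base64-encodes each chunk independently, so any chunk whose byte length isn't a multiple of 3 gets its own "=" padding in the middle of the output, and most decoders stop at the first padding, which would look exactly like "only the first piece" surviving. A possible fix is a Transform that carries over up to 2 leftover bytes between chunks so every encoded piece covers whole 3-byte groups. This is a minimal sketch under that assumption; the class name `AlignedBase64Stream` and the `demo` helper are mine, and a local `Readable` stands in for the axios response stream:

```typescript
import { Transform, Readable } from "stream";

// Base64-encodes a byte stream correctly across chunk boundaries: it holds
// back up to 2 bytes so every encoded piece covers a multiple of 3 input
// bytes, keeping the output free of mid-stream "=" padding.
class AlignedBase64Stream extends Transform {
  private carry: Buffer = Buffer.alloc(0);

  _transform(chunk: Buffer, _enc: string, callback: (err?: Error | null) => void) {
    const buf = Buffer.concat([this.carry, chunk]);
    const usable = buf.length - (buf.length % 3); // whole 3-byte groups only
    this.carry = buf.subarray(usable);            // remainder waits for the next chunk
    if (usable > 0) this.push(buf.subarray(0, usable).toString("base64"));
    callback();
  }

  _flush(callback: (err?: Error | null) => void) {
    // Encode whatever is left (this is the only place padding may appear).
    if (this.carry.length > 0) this.push(this.carry.toString("base64"));
    callback();
  }
}

// Usage: feed awkwardly sized chunks and collect one valid base64 string.
async function demo(): Promise<string> {
  const source = Readable.from([Buffer.from("hell"), Buffer.from("o wo"), Buffer.from("rld")]);
  let out = "";
  for await (const piece of source.pipe(new AlignedBase64Stream())) {
    out += piece;
  }
  return out;
}

demo().then((b64) => {
  console.log(b64); // decodes back to "hello world"
});
```

With `res.data.pipe(new AlignedBase64Stream())` in place of the original transform, the accumulated string should decode as one contiguous body. (Of course, if the whole file must end up in memory anyway, collecting raw Buffers and calling `.toString("base64")` once at the end would also avoid the boundary problem entirely.)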
Via Active questions tagged javascript - Stack Overflow https://ift.tt/1MXH57F