Buffered FIFO queue for streaming bytes in Python?

I am doing a lot of processing and generating of CSV and JSON files in Python, which I'm new-ish to. Here's what I have so far:

row_iter = await get_row_iterator() # basically returns a db cursor for streaming rows
gcs_object = self.gcs_client.bucket(self.env.temp_bucket).blob(manifest_key)

# Stream rows straight into the GCS object: read one row, write one row, repeat.
with gcs_object.open(mode="w", content_type="text/csv") as f:
    async for row_obj in row_iter:
        f.write(f"{row_obj.some_column},{row_obj.some_other_column}\n")

This works fine, but if I understand correctly, the problem is that there is a lot of waiting around. While I'm writing a row with f.write(), the row-reading iterator row_iter is just sitting there doing nothing. While I'm reading from the iterator, the writer is twiddling its thumbs.

While the way I've done this conserves memory, from a speed perspective it's no better than reading all the rows into memory first and then starting an upload. There's no reason I shouldn't be uploading data that has already been processed while also downloading and processing the next batch. I'd like to stream this process, so that the row iterator can read rows and stuff them into a fixed-size FIFO buffer as fast as it possibly can, while on the other end I read from the buffer and write the CSV to GCS as fast as the network can handle. If the buffer fills up, writing to it should block until there is space; if the buffer empties, reading from it should block until there is data to read, so that the maximum amount of memory used by this pipeline is bounded.
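One way to get exactly those blocking semantics is a bounded asyncio.Queue between a producer task and a consumer task: put() blocks (awaits) when the buffer is full, get() blocks when it is empty. A minimal sketch, reusing get_row_iterator and gcs_object from above; the maxsize of 10,000 lines is an arbitrary assumption to tune:

import asyncio

async def produce(q, row_iter):
    # Read rows as fast as possible; put() waits while the buffer is full.
    async for row_obj in row_iter:
        await q.put(f"{row_obj.some_column},{row_obj.some_other_column}\n")
    await q.put(None)  # sentinel: no more rows

async def consume(q, f):
    # Drain the buffer; get() waits while the buffer is empty.
    while True:
        line = await q.get()
        if line is None:
            break
        await asyncio.to_thread(f.write, line)  # keep the blocking write off the event loop

async def main():
    row_iter = await get_row_iterator()
    q = asyncio.Queue(maxsize=10_000)  # bounded FIFO: caps memory in flight
    with gcs_object.open(mode="w", content_type="text/csv") as f:
        await asyncio.gather(produce(q, row_iter), consume(q, f))

Note that asyncio.to_thread needs Python 3.9+, and calling it once per line adds overhead; batching several lines per write would amortize that cost.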

It looks an awful lot like I should be able to do something like this with Python's Buffered Streams, but I can't figure out how to use them for this, and I can't find any examples of how they should be used. I'll need to do this with CSV, JSON, and probably other custom formats, so I'm not necessarily looking for a package that handles a specific format, but for a more generic way, even a way to write my own streaming pipelines.
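For a format-agnostic pipeline, the same bounded-buffer idea works with a plain queue.Queue feeding a dedicated writer thread, so the blocking GCS writes never touch the event loop. A sketch under the same assumptions as above, where serialize is a hypothetical callable you supply per format (CSV line, JSON document, whatever):

import asyncio
import queue
import threading

def drain(q, f):
    # Writer thread: pull chunks off the bounded FIFO and do the blocking writes.
    while True:
        chunk = q.get()
        if chunk is None:
            break
        f.write(chunk)

async def stream(row_iter, f, serialize, maxsize=10_000):
    q = queue.Queue(maxsize=maxsize)  # q.put() blocks when the buffer is full
    writer = threading.Thread(target=drain, args=(q, f))
    writer.start()
    async for row_obj in row_iter:
        # to_thread keeps a full buffer from stalling the event loop itself.
        await asyncio.to_thread(q.put, serialize(row_obj))
    await asyncio.to_thread(q.put, None)  # sentinel: tell the writer to finish
    await asyncio.to_thread(writer.join)

The CSV case would then be roughly await stream(row_iter, f, lambda r: f"{r.some_column},{r.some_other_column}\n"), and a JSON-lines version just swaps in a different serialize.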



source https://stackoverflow.com/questions/75415211/buffered-fifo-queue-for-streaming-bytes-in-python
