I am writing a compression algorithm in Python; however, when loading large files and datasets (usually 1–2 GB+), the process is killed due to running out of RAM. What tools and methods are available to save memory while still keeping the entire file available to access and iterate through? The current code is right here (dev branch).
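For reference, a minimal sketch of the two standard approaches being asked about: streaming the file in fixed-size chunks with a generator, and memory-mapping it with the standard-library `mmap` module so the OS pages data in on demand. The names `iter_chunks`, `peek_ends`, and `CHUNK_SIZE` are illustrative only and not taken from the linked dev-branch code.

```python
import mmap

CHUNK_SIZE = 64 * 1024  # 64 KiB per read; tune to your compressor's access pattern


def iter_chunks(path, chunk_size=CHUNK_SIZE):
    """Yield fixed-size byte chunks so only one chunk lives in RAM at a time."""
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            yield chunk


def peek_ends(path, n=16):
    """Memory-map the file: the whole file becomes sliceable like bytes,
    but pages are loaded lazily by the OS rather than copied into RAM."""
    with open(path, "rb") as f:
        with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as mm:
            # Arbitrary random access anywhere in the file, no full load.
            return mm[:n], mm[-n:]
```

Chunked iteration suits a sequential, single-pass compressor; `mmap` suits algorithms that need random access back into earlier parts of the file.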
source https://stackoverflow.com/questions/72020651/load-large-files-without-killing-process-python-memory-management