Not much to see here, eh?
A systematic benchmark of all Parquet compression algorithms across compression levels and dataset sizes.
Career advice I’ve learned through blood, sweat, and tears.
How to create and patch a memory leak using the metadata of a Parquet file.
Notes on learning deep learning.
Distributed training explained using torch.distributed.
Examples of rendered Markdown and HTML elements.
My company’s holiday pub quiz almost had a question about summing two random variables.
How to calculate statistics on data streams or out-of-core datasets in a parallelizable way.
Using the walrus operator := to make life simpler.