implementing it was kinda fun. MNIST comes in some arbitrary order, so i first arg-sorted samples by class, then interleaved them (0 to 9 repeating) by reshaping to (class, sample) and swapping axes 0 and 1. after that i can flatten, roll, and split the samples into training and validation sets for each fold.
however this only works when all the classes have an equal number of samples. also, my batch training function shuffles by default, so i omitted shuffling from the procedure itself.
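a minimal numpy sketch of the idea, using fake labels in place of actual MNIST (the array shapes, fold count, and variable names here are my guesses, not the original code):

```python
import numpy as np

# hypothetical stand-in for MNIST labels: 10 classes with equal counts,
# shuffled into "some arbitrary order"
rng = np.random.default_rng(0)
n_classes, n_per_class, n_folds = 10, 100, 5
y = rng.permutation(np.repeat(np.arange(n_classes), n_per_class))

# arg-sort by class: the first n_per_class indices point at class 0, etc.
order = np.argsort(y, kind="stable")

# reshape to (class, sample), swap axes 0 and 1, flatten -> indices now
# interleave the classes 0,1,...,9,0,1,...,9,...
interleaved = order.reshape(n_classes, n_per_class).swapaxes(0, 1).flatten()

# roll so each fold takes a different contiguous slice as validation;
# since fold_size is a multiple of n_classes, every fold stays stratified
fold_size = len(interleaved) // n_folds
for k in range(n_folds):
    rolled = np.roll(interleaved, k * fold_size)
    val_idx, train_idx = rolled[:fold_size], rolled[fold_size:]
```

the stratification falls out for free: because the flattened order repeats 0–9, any slice whose length is a multiple of 10 contains every class equally often.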