I have started uploading a resized version of the dataset to Kaggle because:
a) it’s almost impossible for anybody to deal with 5TB of data
b) it’s absolutely unnecessary to have 2048x1536 images for such a problem
The images will be 512x384 (EXIF metadata is preserved).
Hi Pavel,
It looks like you have done a wonderful service to everyone here. Can I ask whether you have documented the process, i.e., the algorithm(s) you used to downsize the images? I assume the downsizing is lossy, and if we train a model on the smaller images we would probably want to apply the same process to the test set as well.
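To make the question concrete, here is a minimal sketch of the kind of thing I had in mind (Pillow with LANCZOS resampling; the folder names, filter, and JPEG quality are just my placeholders). I have no idea whether this matches the actual process used for the uploaded images, so please correct me if it doesn't:

```python
# Guess at the preprocessing, not the actual script used for the Kaggle upload:
# the resample filter, JPEG quality, and folder names below are all assumptions.
from pathlib import Path
from PIL import Image

SRC_DIR = Path("full_res_test")   # hypothetical folder with original 2048x1536 JPEGs
DST_DIR = Path("resized_test")    # hypothetical output folder
DST_DIR.mkdir(exist_ok=True)

for src in SRC_DIR.glob("*.jpg"):
    with Image.open(src) as im:
        exif = im.info.get("exif", b"")               # raw EXIF bytes, if any
        small = im.resize((512, 384), Image.LANCZOS)  # 4x downscale to 512x384
        # Write the original EXIF bytes back so the metadata is preserved
        small.save(DST_DIR / src.name, "JPEG", quality=95, exif=exif)
```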
BTW, I don’t see season 7 or 9 when I use the kaggle API (kaggle datasets list), but I do see them in your links above. Not sure why – maybe they have to be registered or something to appear?