I am relatively new to competitions with huge datasets like this.
Could you share how you work, or what the general practice is for dealing with huge datasets like this?
Do you rent cloud compute, have your own local deep-learning setup, or use Kaggle/Colab, etc.?
I do all my testing and development on Colab, then once I think it's ready I move it to something like a GCP instance or Paperspace Gradient.
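For the Colab development stage, one common trick (not specific to this thread, just a sketch) is to stream the big CSV in chunks and keep only a small random sample for prototyping, so memory stays bounded no matter how large the full file is. The filename, chunk size, and sample fraction below are all made-up placeholders:

```python
import pandas as pd
import numpy as np

# Hypothetical stand-in: pretend "train.csv" is a huge competition file.
# We generate a small one here so the sketch is runnable end-to-end.
rng = np.random.default_rng(0)
pd.DataFrame({
    "feature": rng.normal(size=10_000),
    "target": rng.integers(0, 2, size=10_000),
}).to_csv("train.csv", index=False)

# Read in chunks and keep a random 1% of each chunk for a dev sample,
# so we never hold the whole file in memory at once.
sample_parts = []
for chunk in pd.read_csv("train.csv", chunksize=2_000):
    sample_parts.append(chunk.sample(frac=0.01, random_state=0))

dev_sample = pd.concat(sample_parts, ignore_index=True)
print(len(dev_sample))  # about 1% of the full row count
```

Once the pipeline works on the sample, the same code runs on the full file on the bigger cloud box by just dropping the sampling step.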
Keen to see what others do!