Man, we should have teamed up.
I generated features ranging from std, mean, and median to root mean square (RMS) and acceleration. A similar set of features computed at t-1 helped nicely, as did rolling max, mean, etc. on acceleration.
Min-max features (max - abs(min)) also helped.
RMS and acceleration were particularly useful (a sketch of these features is below).
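A minimal sketch of this kind of feature generation, assuming the raw signal lives in a pandas Series; the window size, feature names, and the exact windowing of the min-max feature are my assumptions, not the actual settings:

```python
import pandas as pd

def make_features(x: pd.Series, window: int = 10) -> pd.DataFrame:
    """Rolling-window features; `window` is an illustrative value."""
    feats = pd.DataFrame(index=x.index)
    roll = x.rolling(window)

    # Basic window statistics
    feats["mean"] = roll.mean()
    feats["std"] = roll.std()
    feats["median"] = roll.median()

    # Root mean square over the same window
    feats["rms"] = x.pow(2).rolling(window).mean().pow(0.5)

    # Min-max feature: max - abs(min)
    feats["minmax"] = roll.max() - roll.min().abs()

    # Acceleration as the second difference of the signal,
    # then rolling max/mean on top of it
    acc = x.diff().diff()
    feats["acc_max"] = acc.rolling(window).max()
    feats["acc_mean"] = acc.rolling(window).mean()

    # Lagged copies of everything (the "t-1" features)
    lagged = feats.shift(1).add_suffix("_t1")
    return pd.concat([feats, lagged], axis=1)
```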
I also used features extracted from an MLP/CNN layer, which helped, but I dropped them in the end because of overfitting.
Finally, a few tricks that helped (both are sketched in code below):
Discard all probabilities below 0.05 and add the discarded mass to the highest-probability class for that sample.
Take the 5-6 top submissions and average their predictions.
Together these two tricks gave an improvement of around 0.25.
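A minimal sketch of both tricks, assuming class probabilities come as an (n_samples, n_classes) NumPy array; the 0.05 threshold is from the post, the function names and everything else are illustrative:

```python
import numpy as np

def sharpen_probs(probs: np.ndarray, threshold: float = 0.05) -> np.ndarray:
    """Zero out probabilities below `threshold` and move the removed
    mass onto each sample's highest-probability class."""
    probs = probs.copy()
    low = probs < threshold
    residual = np.where(low, probs, 0.0).sum(axis=1)  # mass to redistribute
    probs[low] = 0.0
    top = probs.argmax(axis=1)                        # highest surviving class
    probs[np.arange(len(probs)), top] += residual
    return probs

def blend_submissions(prob_arrays: list) -> np.ndarray:
    """Plain mean blend across several submissions' probabilities."""
    return np.mean(prob_arrays, axis=0)
```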
All of this was done with ExtraTreesClassifier; the entropy criterion worked better than Gini. I couldn't get XGBoost working until the end, and extra trees outperformed it anyway. One-vs-all also helped (sketch below).
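A minimal sketch of that setup with scikit-learn; only the entropy criterion and the one-vs-all wrapping are from the post, the other hyperparameters are placeholders:

```python
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.multiclass import OneVsRestClassifier

base = ExtraTreesClassifier(
    n_estimators=500,      # illustrative value, not the actual setting
    criterion="entropy",   # entropy beat Gini here
    n_jobs=-1,
    random_state=0,
)
model = OneVsRestClassifier(base)

# model.fit(X_train, y_train)
# probs = model.predict_proba(X_test)
```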
My local cross-validation scores were off by around 0.3.
Most of the work happened in the last week.
But I should have teamed up with the top rankers; I would have learned much more.