Has anybody tried multinomial logistic regression, multiclass linear discriminant analysis, or multinomial Naive Bayes?

Hi everybody, I have tried Random Forest and obtained a score close to 82%. I would like to improve that score, or at least get an idea of how Random Forest performs compared to other methods. Which of the methods in the title do you think is most promising? I know that a method's performance depends heavily on data cleaning and preprocessing, but I would like to know whether anybody has obtained comparable or better performance than Random Forest with any of these methods.
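For anyone wanting to try the first method in the title, here is a minimal sketch of a multinomial logistic regression baseline in R using nnet::multinom. The data frame names (train, valid) and the target column (label) are placeholders, not anything from this competition, and the settings are only illustrative.

library(nnet)

# placeholder data: a data frame `train` with a factor target `label`,
# the remaining columns used as predictors
train$label <- as.factor(train$label)

# multinomial logistic regression; maxit raised because the default (100)
# often stops before convergence on wider data
fit <- multinom(label ~ ., data = train, maxit = 500)

# class predictions on a held-out set `valid`, then simple accuracy
pred <- predict(fit, newdata = valid, type = "class")
mean(pred == valid$label)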

Hi, I used GBM (gradient boosting) and, strangely, got a worse score than with Random Forest.
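For reference, a multiclass fit with the gbm package typically looks like the sketch below. The hyperparameters are illustrative defaults, not the settings the poster used, and train/valid/label are placeholder names.

library(gbm)

# distribution = "multinomial" for a multiclass factor target;
# the hyperparameters here are arbitrary starting points
fit <- gbm(label ~ ., data = train,
           distribution = "multinomial",
           n.trees = 500, interaction.depth = 4, shrinkage = 0.05)

# predicted class probabilities come back as an (n x classes x 1) array
probs <- predict(fit, newdata = valid, n.trees = 500, type = "response")
pred  <- colnames(probs)[apply(probs[, , 1], 1, which.max)]
mean(pred == valid$label)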

Hi, I’m rather a fan of nearest neighbour classification; as a rule it’s fast and often surprisingly accurate. This dataset has mainly categorical variables with many levels, though, so the standard kknn R package doesn’t work. knncat does, but it only gave 75% correct classification and was very memory-hungry.
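One common workaround when many-level factors defeat the dedicated kNN packages is to one-hot encode the categoricals and fall back on class::knn. This is only a sketch with placeholder names and an arbitrary k, not what the poster ran, and the encoded matrix can itself become very large.

library(class)

# one-hot encode the factor predictors; `label` is the placeholder target column
# (the factor levels in train and valid must match for the columns to line up)
x_train <- model.matrix(label ~ . - 1, data = train)
x_valid <- model.matrix(label ~ . - 1, data = valid)

# plain k-nearest-neighbour on the encoded matrix; k = 10 is arbitrary
pred <- knn(train = x_train, test = x_valid, cl = train$label, k = 10)
mean(pred == valid$label)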

Hi, I used the lasso from glmnet but got a very bad score.
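In case anyone wants to reproduce a glmnet lasso baseline, a minimal multiclass sketch is below. The data frame and column names are placeholders; alpha = 1 simply selects the lasso penalty.

library(glmnet)

# glmnet needs a numeric matrix, so one-hot encode the categorical predictors first
x <- model.matrix(label ~ . - 1, data = train)
y <- as.factor(train$label)

# alpha = 1 is the lasso penalty; cross-validation chooses lambda
fit <- cv.glmnet(x, y, family = "multinomial", alpha = 1)

# predict classes on a held-out set encoded the same way
x_new <- model.matrix(label ~ . - 1, data = valid)
pred  <- predict(fit, newx = x_new, s = "lambda.min", type = "class")
mean(pred == valid$label)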