From the Feature Importance graph, Delicassen has the highest F score. Doesn't this mean that Delicassen was the most important feature, as opposed to Grocery, which was fourth best?
No answer to malambomutila's comment?
Excellent case study!
I was only able to get an accuracy of about 93% with xgboost.
Thanks a lot
I like how you described what each parameter means. I had no idea you could use dropout via DART.
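For anyone else who was new to it: here is a minimal, hypothetical sketch of turning on the DART booster with the xgboost Python API. The parameter names (booster, rate_drop, skip_drop) are from the XGBoost docs; the toy data and values are my own, not from the notebook.

```python
import xgboost as xgb
from sklearn.datasets import make_classification

# Toy data, not the notebook's dataset
X, y = make_classification(n_samples=500, n_features=8, random_state=42)
dtrain = xgb.DMatrix(X, label=y)

# DART = gradient boosting with dropout applied to the ensemble of trees
params = {
    "booster": "dart",            # use the DART booster instead of the default gbtree
    "objective": "binary:logistic",
    "rate_drop": 0.1,             # fraction of trees dropped in each boosting round
    "skip_drop": 0.5,             # probability of skipping dropout in a given round
    "max_depth": 3,
    "eta": 0.1,
}

bst = xgb.train(params, dtrain, num_boost_round=100)

# With DART, plain predict() also performs dropout; in recent xgboost versions,
# passing iteration_range covering all rounds evaluates the full ensemble.
preds = bst.predict(dtrain, iteration_range=(0, 100))
print(preds[:5])
```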
Yes, you are right, @malambomutila.
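One hedged aside on that point: as far as I know, the "F score" shown by xgboost's plot_importance is the default 'weight' importance (how many times a feature is used in a split), while 'gain' measures how much those splits actually improve the loss, so the two can rank features differently. A minimal sketch with stand-in data and feature names (not the notebook's actual model):

```python
import xgboost as xgb
from sklearn.datasets import make_classification

# Stand-in data; pretend the columns are the wholesale-customer features
X, y = make_classification(n_samples=500, n_features=6, random_state=0)
feature_names = ["Fresh", "Milk", "Grocery", "Frozen", "Detergents_Paper", "Delicassen"]
dtrain = xgb.DMatrix(X, label=y, feature_names=feature_names)

bst = xgb.train({"objective": "binary:logistic", "max_depth": 3},
                dtrain, num_boost_round=50)

# 'weight' is the split count behind the F score; 'gain' weights splits by loss reduction
print(bst.get_score(importance_type="weight"))
print(bst.get_score(importance_type="gain"))

# plot_importance accepts the same argument, e.g.:
# xgb.plot_importance(bst, importance_type="gain")
```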
Nice work!
Your code is the GOAT.
Small correction: under Command line parameters, I think reg:logistic is meant for classification problems with probabilities and binary:logistic is for classification problems with only a decision, not the other way around. Great notebook, cheers.
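Without taking a side on the wording, one way to check is to compare the objectives empirically with the native API. A minimal sketch on toy data (not the notebook's dataset): reg:logistic and binary:logistic both produce values in (0, 1), while binary:logitraw returns the untransformed margin scores.

```python
import xgboost as xgb
from sklearn.datasets import make_classification

# Toy binary-classification data
X, y = make_classification(n_samples=400, n_features=5, random_state=1)
dtrain = xgb.DMatrix(X, label=y)

for objective in ("reg:logistic", "binary:logistic", "binary:logitraw"):
    bst = xgb.train({"objective": objective, "max_depth": 3},
                    dtrain, num_boost_round=30)
    preds = bst.predict(dtrain)
    # Print the prediction range to see whether the output is a probability or a raw score
    print(f"{objective:16s} min={preds.min():.3f} max={preds.max():.3f}")
```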
Nice. It is helpful to run it in Jupyter Notebook. Thank you!