XGBoost is short for eXtreme Gradient Boosting. The package supports regression, classification, and ranking tasks.
Install the latest version of XGBoost from GitHub:
devtools::install_github('dmlc/xgboost', subdir = 'R-package')
Error
clf <- xgboost(data = data.matrix(train[, feature.names]),
               label = train$Survived,
               booster = "gblinear",
               nrounds = 20,
               objective = "multi:softprob",
               num_class = 2,
               eval_metric = "merror")

Error in xgb.iter.update(bst$handle, dtrain, i - 1, obj) :
  SoftmaxMultiClassObj: label must be in [0, num_class), num_class=2 but found 2 in label
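The cause, as far as I can tell, is that train$Survived is stored as a factor. When R coerces a factor to numeric it returns the internal level codes, which start at 1, so the labels become 1 and 2 instead of 0 and 1, and 2 falls outside [0, num_class). A minimal sketch, assuming Survived has levels "0" and "1" as in the Titanic data:

# Hypothetical factor with levels "0" and "1", like train$Survived
survived <- factor(c("0", "1", "1", "0"))
as.numeric(survived)
# [1] 1 2 2 1   <- internal level codes; 2 is outside [0, num_class)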
Solution
The response variable needs to be converted from factor to numeric so that the labels are 0 and 1:
train$Survived <- as.numeric(levels(train$Survived))[train$Survived]
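A minimal sketch of what this conversion does, again assuming the factor has levels "0" and "1":

# Indexing the numeric levels by the factor recovers the original 0/1 values
survived <- factor(c("0", "1", "1", "0"))
as.numeric(levels(survived))[survived]
# [1] 0 1 1 0   <- valid labels for num_class = 2

With the labels back in [0, num_class), the xgboost() call above runs without the SoftmaxMultiClassObj error.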
Also install the libx11-dev system package:
solai@vm1$ sudo apt-get install libx11-dev

Then try the installation again.
Comments:
I suspect this was a bug/issue that appeared in an early version of the xgboost R package.
XGBoost is about 10 years old now and very mature software. I also see that, in general, the data science community has moved away from R toward Python, including using XGBoost in Python with the popular scikit-learn API.