predict.textmodel_nb()
implements class predictions from a fitted
Naive Bayes text model, using the class priors and word likelihoods estimated from the training examples.
# S3 method for textmodel_nb
predict(object, newdata = NULL, ...)
Argument | Description
---|---
object | a fitted Naive Bayes textmodel
newdata | dfm on which prediction should be made
... | not used
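When predicting on documents that were not part of the training set, the new dfm's features should be aligned with those of the training dfm. A minimal sketch, assuming the quanteda package is loaded and using its built-in data_dfm_lbgexample data; the held-out split and the dfm_match() call are illustrative, not taken from this page:

```r
library(quanteda)

# fit on the first five documents; hold out the sixth for prediction
train <- data_dfm_lbgexample[1:5, ]
test  <- data_dfm_lbgexample[6, ]

nb <- textmodel_nb(train, y = c("A", "A", "B", "C", "C"))

# align the held-out dfm to the training features before predicting
test <- dfm_match(test, featnames(train))
predict(nb, newdata = test)
```

In recent quanteda releases, textmodel_nb() and its predict method have moved to the companion quanteda.textmodels package, so that package may need to be loaded as well.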
predict.textmodel_nb
returns a list of two data frames, named
docs
and words
, corresponding to document- and word-level
predicted quantities:

docs | data frame with document-level predictive quantities: nb.predicted, ws.predicted, bs.predicted, PcGw, wordscore.doc, bayesscore.doc, posterior.diff, posterior.logdiff. Note that the diff quantities are currently implemented only for two-class solutions.
words | data frame with word-level predictive quantities: wordscore.word, bayesscore.word
# application to LBG (2003) example data
(nb <- textmodel_nb(data_dfm_lbgexample, c("A", "A", "B", "C", "C", NA)))
#>
#> Call:
#> textmodel_nb.dfm(x = data_dfm_lbgexample, y = c("A", "A", "B",
#>     "C", "C", NA))
#>
#> Distribution: multinomial; prior: uniform; smoothing value: 1; 5 training documents; 37 fitted features.
predict(nb)
#> $log.posterior.lik
#>            A         B         C
#> R1 -2687.853 -6472.926 -7614.264
#> R2 -2687.853 -4013.332 -7147.946
#> R3 -4671.788 -2368.923 -4671.788
#> R4 -7147.946 -4013.332 -2687.853
#> R5 -7614.264 -6472.926 -2687.853
#> V1 -3212.036 -3007.763 -6381.702
#>
#> $posterior.prob
#>               A B C
#> R1 1.000000e+00 0 0
#> R2 1.000000e+00 0 0
#> R3 0.000000e+00 1 0
#> R4 0.000000e+00 0 1
#> R5 0.000000e+00 0 1
#> V1 1.929374e-89 1 0
#>
#> $nb.predicted
#> [1] "A" "A" "B" "C" "C" "B"
#>
#> $Pc
#>         A         B         C
#> 0.3333333 0.3333333 0.3333333
#>
#> $classlabels
#> [1] "A" "B" "C"
#>
#> $call
#> predict.textmodel_nb(object = nb)