Confusion Matrix of Bayesian Network
I'm trying to understand Bayesian networks. I have a data file which has 10 attributes, and I want to obtain the confusion matrix for this data set. I thought I needed to calculate TP, FP, FN, and TN for all fields. Is that true? If it is, what do I need to do for the Bayesian network?
Really need some guidance, I'm lost.
The process usually goes like this:
- You have some labeled data instances which you want to use to train a classifier, so that it can predict the class of new unlabeled instances.
- Using your classifier of choice (neural network, Bayes net, SVM, etc.), you build a model with your training data as input.
- At this point, you usually want to evaluate the performance of the model before deploying it. So, using a previously unused subset of the data (the test set), you compare the model's predicted class for each instance against the actual class. A good way to summarize these results is a confusion matrix, which shows how the instances of each class are predicted.
For binary classification tasks, the convention is to designate one class as positive and the other as negative. Thus, from the confusion matrix, the percentage of positive instances that are correctly classified as positive is known as the True Positive (TP) rate. The other definitions follow the same convention...
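Here is a minimal sketch of that workflow in Python with scikit-learn, using GaussianNB as a stand-in for your Bayesian model; the file name and the label column name are just placeholders for your own data:

```python
# Train a classifier, then summarize test-set predictions in a confusion matrix.
# GaussianNB stands in for a Bayesian model; "data.csv" and the "class" column
# are hypothetical and should be replaced with your own file and label column.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import confusion_matrix

data = pd.read_csv("data.csv")              # 10 attribute columns + 1 class column
X = data.drop(columns=["class"]).values     # the attributes
y = data["class"].values                    # the labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

model = GaussianNB().fit(X_train, y_train)
y_pred = model.predict(X_test)

# Rows = actual class, columns = predicted class.
cm = confusion_matrix(y_test, y_pred)
print(cm)

# For a binary problem the matrix unpacks into tn, fp, fn, tp,
# and the TP rate is tp / (tp + fn).
tn, fp, fn, tp = cm.ravel()
print("TP rate:", tp / (tp + fn))
```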
A confusion matrix is used to evaluate the performance of a classifier, any classifier.
What you are asking about is a confusion matrix with more than two classes. Here are the steps (a code sketch follows the list):
- Build a classifier for each class, where the training set consists of the set of documents in the class (positive labels) and its complement (negative labels).
- Given the test document, apply each classifier separately.
- Assign the document to the class with the maximum score, the maximum confidence value, or the maximum probability.
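Below is a sketch of that one-classifier-per-class scheme, again assuming scikit-learn with GaussianNB for each binary (class vs. complement) problem; `X_train`, `y_train`, `X_test`, and `y_test` are assumed to be defined as in the earlier snippet:

```python
# One binary classifier per class; assign each test instance to the class
# whose classifier gives the highest probability, then build the
# multi-class confusion matrix.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import confusion_matrix

classes = np.unique(y_train)
models = {}
for c in classes:
    # Positive labels: instances in class c; negative labels: its complement.
    models[c] = GaussianNB().fit(X_train, (y_train == c).astype(int))

# Apply each classifier separately and keep its confidence, P(positive).
scores = np.column_stack(
    [models[c].predict_proba(X_test)[:, 1] for c in classes])

# Assign each instance to the class with the maximum probability.
y_pred = classes[np.argmax(scores, axis=1)]

# Multi-class confusion matrix: one row per actual class, one column per predicted class.
print(confusion_matrix(y_test, y_pred, labels=classes))
```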
Here is a reference paper where you can find more information:
Picca, Davide, Benoît Curdy, and François Bavaud. 2006. Non-linear correspondence analysis in text retrieval: A kernel view. In Proc. JADT.