pnnCMA {CMA}                                                R Documentation
Description

Probabilistic Neural Networks is the term Specht (1990) used for a Gaussian kernel estimator for the conditional class densities.

For S4 method information, see pnnCMA-methods.
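In essence, the classifier averages a Gaussian kernel over the learning observations of each class and assigns a new observation to the class with the largest estimated density. A minimal sketch of this idea (not the CMA implementation; normalizing constants and class priors are ignored, and pnn_sketch is a made-up name for illustration):

### sketch of the PNN idea: kernel density estimate per class, predict the maximum
pnn_sketch <- function(Xlearn, ylearn, xnew, sigma = 1) {
  classes <- sort(unique(ylearn))
  dens <- sapply(classes, function(k) {
    Xk <- Xlearn[ylearn == k, , drop = FALSE]
    ### average Gaussian kernel between xnew and the class-k learning observations
    mean(exp(-rowSums(sweep(Xk, 2, xnew)^2) / (2 * sigma^2)))
  })
  classes[which.max(dens)]
}

Larger values of sigma smooth the class densities more strongly; this is the hyperparameter exposed as sigma below.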
Usage

pnnCMA(X, y, f, learnind, sigma = 1, models = FALSE)
Arguments

X
    Gene expression data. Can be one of the following:
      - A matrix. Rows correspond to observations, columns to variables.
      - A data.frame, when f is not missing (see below).
      - An object of class ExpressionSet.

y
    Class labels. Can be one of the following:
      - A numeric vector.
      - A factor.
      - A character if X is an ExpressionSet that specifies the phenotype variable.
      - missing, if X is a data.frame and a proper formula f is provided.
    WARNING: The class labels will be re-coded to range from 0 to K-1, where K is the total number of different classes in the learning set.

f
    A two-sided formula, if X is a data.frame. The left part corresponds to the class labels, the right to the variables.

learnind
    An index vector specifying the observations that belong to the learning set. For this method, this must not be missing.

sigma
    Standard deviation of the Gaussian kernel used. This hyperparameter should be tuned; see tune. A sketch of a possible tuning workflow is given after this list.

models
    A logical value indicating whether the model object shall be returned.
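Since sigma should be tuned (see above), a possible workflow using the package's cross-validation machinery is sketched here; GenerateLearningsets and tune are assumed to behave as described in their own help pages, and the grid 2^(-2:2) is only illustrative.

### hedged sketch: cross-validated choice of sigma for pnnCMA
library(CMA)
data(golub)
golubY <- golub[,1]
golubX <- as.matrix(golub[,2:11])
### 3-fold cross-validation splits
ls <- GenerateLearningsets(y = golubY, method = "CV", fold = 3, strat = TRUE)
### evaluate a small grid of candidate sigma values
tuneres <- tune(X = golubX, y = golubY, learningsets = ls,
                classifier = pnnCMA, grids = list(sigma = 2^(-2:2)))
show(tuneres)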
Value

An object of class cloutput.
Note

There is actually no strong relation of this method to Feed-Forward Neural Networks; see nnetCMA.
Author(s)

Martin Slawski ms@cs.uni-sb.de
Anne-Laure Boulesteix boulesteix@ibe.med.uni-muenchen.de
References

Specht, D.F. (1990). Probabilistic Neural Networks. Neural Networks, 3, 109-118.
See Also

compBoostCMA, dldaCMA, ElasticNetCMA, fdaCMA, flexdaCMA, gbmCMA, knnCMA, ldaCMA, LassoCMA, nnetCMA, pknnCMA, plrCMA, pls_ldaCMA, pls_lrCMA, pls_rfCMA, qdaCMA, rfCMA, scdaCMA, shrinkldaCMA, svmCMA
Examples

### load Golub AML/ALL data
data(golub)
### extract class labels
golubY <- golub[,1]
### extract gene expression from first 10 genes
golubX <- as.matrix(golub[,2:11])
### select learningset
ratio <- 2/3
set.seed(111)
learnind <- sample(length(golubY), size=floor(ratio*length(golubY)))
### run PNN
pnnresult <- pnnCMA(X=golubX, y=golubY, learnind=learnind, sigma = 3)
### show results
show(pnnresult)
ftable(pnnresult)
plot(pnnresult)
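The returned cloutput object can also be inspected directly; assuming the slot names documented for the cloutput class (y, yhat, prob), for example:

### predicted class labels for the test observations
pnnresult@yhat
### estimated class probabilities
pnnresult@prob
### confusion table of true vs. predicted labels
table(true = pnnresult@y, predicted = pnnresult@yhat)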