kernelpls.fit2.Rd
Fits a PLS regression model with the kernel algorithm (Dayal & MacGregor, 1997).
kernelpls.fit2(X, Y, ncomp)
# S3 method for kernelpls.fit2
predict(object, X, ...)
X : Matrix of regressors
Y : Vector (or single-column matrix) of a univariate outcome
ncomp : Number of components to be extracted
object : Object of class kernelpls.fit2
... : Further arguments to be passed
The same list as produced by pls::kernelpls.fit is returned. In addition, R^2 measures are contained in the list entry R2.
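For illustration, a minimal sketch of accessing these measures (assuming X, Y, and the fitted object res as constructed in Example 1 below):

res <- miceadds::kernelpls.fit2( X=X, Y=Y, ncomp=20 )
res$R2   # R^2 measures for the extracted components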
Dayal, B. S., & MacGregor, J. F. (1997). Improved PLS algorithms. Journal of Chemometrics, 11(1), 73-85.
Mevik, B.-H., & Wehrens, R. (2007). The pls package: Principal component and partial least squares regression in R. Journal of Statistical Software, 18(2), 1-24. doi:10.18637/jss.v018.i02
See the pls package for further estimation algorithms.
if (FALSE) {
#############################################################################
# SIMULATED EXAMPLE 1: 300 cases on 100 variables
#############################################################################
set.seed(789)
library(mvtnorm)
N <- 300 # number of cases
p <- 100 # number of predictors
rho1 <- .6 # correlations between predictors
# simulate data
Sigma <- base::diag(1-rho1, p) + rho1   # equicorrelation matrix: 1 on the diagonal, rho1 elsewhere
X <- mvtnorm::rmvnorm( N, sigma=Sigma )
beta <- base::seq( 0, 1, len=p )
y <- ( X %*% beta )[,1] + stats::rnorm( N, sd=.6 )
Y <- base::matrix(y,nrow=N, ncol=1 )
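# optional check (not part of the original example): the empirical correlations
# between predictors should be close to rho1
round( stats::cor( X[, 1:5] ), 2 )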
# PLS regression
res <- miceadds::kernelpls.fit2( X=X, Y=Y, ncomp=20 )
# predict new scores
Xpred <- predict( res, X=X[1:10,] )
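# a quick look at the fitted object and the predictions (entry names of the
# returned list follow pls::kernelpls.fit, see the Value section)
str( res, max.level=1 )
str( Xpred )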
#############################################################################
# EXAMPLE 2: Dataset yarn from pls package
#############################################################################
# use kernelpls.fit from pls package
library(pls)
data(yarn, package="pls")
mod1 <- pls::kernelpls.fit( X=yarn$NIR, Y=yarn$density, ncomp=10 )
# use kernelpls.fit2 from miceadds package
Y <- base::matrix( yarn$density, ncol=1 )
mod2 <- miceadds::kernelpls.fit2( X=yarn$NIR, Y=Y, ncomp=10 )
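# both functions should give essentially the same fit; as a sketch, compare the
# regression coefficients ('coefficients' is part of the list documented for
# pls::kernelpls.fit, and kernelpls.fit2 returns the same list)
str( mod1$coefficients )
str( mod2$coefficients )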
}