| rkpk {gss} | R Documentation |
Description

Perform numerical calculations for the ssanova and gssanova suites.
Usage

sspreg1(s, r, q, y, method, alpha, varht, random)
mspreg1(s, r, q, y, method, alpha, varht, random)
sspngreg(family, s, r, q, y, wt, offset, alpha, nu, random)
mspngreg(family, s, r, q, y, wt, offset, alpha, nu, random)
ngreg(dc, family, sr, q, y, wt, offset, nu, alpha)
ngreg.proj(dc, family, sr, q, y0, wt, offset, nu)
Arguments

family: Description of the error distribution. Supported are the
    exponential families "binomial", "poisson", "Gamma", and
    "nbinomial", and the accelerated life model families "weibull",
    "lognorm", and "loglogis".
s: Unpenalized terms evaluated at the data points.
r: Basis of penalized terms evaluated at the data points.
q: Penalty matrix.
y: Response vector.
wt: Model weights.
offset: Model offset.
method: "v" for GCV, "m" for GML, or "u" for Mallows' CL.
alpha: Parameter modifying GCV or Mallows' CL scores for smoothing
    parameter selection.
nu: Optional argument for future support of the nbinomial, weibull,
    lognorm, and loglogis families.
varht: External variance estimate needed for method="u".
random: Input for parametric random effects in nonparametric
    mixed-effect models.
dc: Coefficients of fits.
sr: cbind(s,r).
y0: Components of the fit to be projected.
Details

sspreg1 is used by ssanova to compute regression estimates with a
single smoothing parameter. mspreg1 is used by ssanova to compute
regression estimates with multiple smoothing parameters.
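A minimal sketch of the user-level entry point, with simulated data
for illustration; ssanova dispatches to sspreg1/mspreg1 internally,
and the internal routines are not meant to be called directly:

    ## simulated data; method and alpha correspond to the arguments above
    library(gss)
    set.seed(1)
    x <- runif(100)
    y <- sin(2 * pi * x) + rnorm(100, sd = 0.3)
    fit <- ssanova(y ~ x, method = "v", alpha = 1.4)  # GCV selection
    summary(fit)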
sspngreg is used by gssanova to compute non-Gaussian regression
estimates with a single smoothing parameter. mspngreg is used by
gssanova to compute non-Gaussian regression estimates with multiple
smoothing parameters. ngreg is used by sspngreg and mspngreg to
perform Newton iteration with fixed smoothing parameters and to
calculate cross-validation scores on return. ngreg.proj is used by
project.gssanova to calculate the Kullback-Leibler projection for
non-Gaussian regression.
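A minimal sketch with simulated binary data: gssanova invokes
sspngreg/ngreg for the Newton iteration, and calling project on the
fitted object invokes ngreg.proj for the Kullback-Leibler projection:

    ## simulated binary responses for illustration only
    library(gss)
    set.seed(2)
    x1 <- runif(200); x2 <- runif(200)
    eta <- 2 * sin(2 * pi * x1) + x2
    y <- rbinom(200, 1, plogis(eta))
    fit <- gssanova(y ~ x1 + x2, family = "binomial")
    ## Kullback-Leibler projection onto the x1 term
    project(fit, include = c("x1"))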
References

Kim, Y.-J. and Gu, C. (2004), Smoothing spline Gaussian regression: more scalable computation via efficient approximation. Journal of the Royal Statistical Society, Ser. B, 66, 337–356.