library("gamlss2")
## load data
data("SpirometryUS")
## subset for female
d <- subset(SpirometryUS, gender == "Female")
## note, inner weights are sampled; set
## the seed for reproducibility
set.seed(1328)
## formula for all 4 parameters
f <- fev1 ~ elm(age, k = 100, a = "tanh") | . | . | .
## estimate model
m <- gamlss2(f, data = d, family = BCT)
## estimated effects
plot(m)
## predict quantiles
qu <- c(0.025, seq(0.1, 0.9, by = 0.1), 0.975)
fit <- quantile(m, probs = qu)
## plot
plot(fev1 ~ age, data = d)
i <- order(d$age)
matplot(d$age[i], fit[i, ],
type = "l", lty = 1, lwd = 2,
col = c(2, rep(4, ncol(fit) - 1)),
add = TRUE)
## main effects and interactions
f <- fev1 ~ s(age) + s(height) + s(weight) +
  elm(~ age + height + weight, k = 200) | . | . | .
m <- gamlss2(f, data = d, family = BCT)
## summary to inspect the effect of the
## elm() interaction term
summary(m)
## plot main effects
plot(m)
## quantile residuals
plot(m, which = "resid")
## prediction is handled automatically via
## the special term interface
n <- 50
nd <- with(d, expand.grid(
"age" = seq(min(age), max(age), length = n),
"weight" = seq(min(weight), max(weight), length = n)
))
nd$height <- mean(d$height)
## compute lower 2.5% quantile
nd$fit <- quantile(m, newdata = nd, probs = 0.025)
## visualize
n <- length(unique(nd$age))
age <- sort(unique(nd$age))
weight <- sort(unique(nd$weight))
z <- matrix(nd$fit, n, n)
image(age, weight, z,
col = hcl.colors(100, "YlOrRd"),
xlab = "age", ylab = "weight")
contour(age, weight, z, add = TRUE)
Extreme Learning Machine Model Terms
Description
Constructor function for Extreme Learning Machine (ELM) model terms for GAMLSS.
Usage
## Model term constructor function.
elm(x, k = 50, a = "tanh", ...)
Arguments
x
A numeric vector or matrix, a factor, or a formula. If x is a formula, a design matrix is created using model.matrix. See the examples.
k
Integer, number of hidden units (random features) used to build the ELM design matrix.
a
Character, activation function for the hidden units. Supported options are "logistic", "tanh" (default), "relu", "leaky_relu", "elu", "softplus", "atan", "softsign", "gaussian", "laplace", "sine", and "identity".
…
Further control arguments can be passed: criterion = "bic" (default) for shrinkage parameter selection and scale = TRUE (default) for internal scaling of the design matrix. Further arguments are passed to model.matrix if x is specified using a formula.
Details
The ELM term constructs a randomized single-hidden-layer representation of the covariate(s) in x. Internally, a design matrix Z is built (including an intercept column), random weights are sampled, and the linear predictors Z %*% W are transformed through the activation function a to obtain the hidden-layer design matrix X. Columns of X are centered before estimation.
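As a conceptual sketch only (not the package's internal code), the construction described above could look as follows for a single covariate; the names Z, W, and X mirror the notation used here:

``` r
## conceptual sketch of the ELM hidden layer (not gamlss2 internals)
x <- runif(200)                      ## covariate
k <- 50                              ## number of hidden units
Z <- cbind(1, x)                     ## design matrix with intercept column
W <- matrix(rnorm(ncol(Z) * k), ncol(Z), k)  ## random hidden weights
X <- tanh(Z %*% W)                   ## activation -> hidden-layer design matrix
X <- scale(X, center = TRUE, scale = FALSE)  ## center columns before estimation
```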
Internal scaling. For numerical stability and comparability across terms, Z can be scaled internally (scale = TRUE). Intercept columns are left unchanged. If x is a factor, a QR-based group scaling is applied; otherwise, a column-wise center-and-scale normalization is used. The scaling transformation is stored and is reused automatically during prediction.
Activation functions. The activation is applied element-wise to Z %*% W. For numerical stability, bounded activations are evaluated on clipped inputs (currently \([-35, 35]\)).
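A minimal illustration of such input clipping, with hypothetical helper names (the actual internal implementation may differ):

``` r
## evaluate an activation on inputs clipped to [-35, 35]
clip <- function(x, b = 35) pmin(pmax(x, -b), b)
logistic <- function(x) 1 / (1 + exp(-clip(x)))
```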
Reproducibility. Weights are sampled randomly. Use set.seed before calling elm() if you need reproducible results.
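Because the hidden weights are drawn at random, two fits agree only if the same seed is set immediately before each call. A sketch using the spirometry example from above:

``` r
set.seed(1328)
m1 <- gamlss2(fev1 ~ elm(age, k = 100) | . | . | ., data = d, family = BCT)
set.seed(1328)
m2 <- gamlss2(fev1 ~ elm(age, k = 100) | . | . | ., data = d, family = BCT)
## identical seed -> identical random weights -> identical fits
all.equal(fitted(m1), fitted(m2))
```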
Value
An object representing a specials model term, used internally by gamlss2 during model fitting and prediction.
References
Huang GB, Zhu QY, Siew CK (2006). Extreme Learning Machine: Theory and Applications. Neurocomputing, 70(1–3), 489–501. doi:10.1016/j.neucom.2005.12.126
Huang GB, Zhu QY, Siew CK (2004). Extreme Learning Machine: A New Learning Scheme of Feedforward Neural Networks. In: Proceedings of IJCNN 2004, 2, 985–990. doi:10.1109/IJCNN.2004.1380068
See Also
gamlss2, specials.