- Anisotropic tensor-product operators: `b1 %A0% b2` and `b1 %Xa0% b2` now also work when `lambda` is specified for `b1` and `df` is specified for `b2` (or vice versa).
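A minimal sketch of the mixed specification (the data objects `dat`, `Y`, `x`, and `time` are hypothetical, as are the chosen values of `lambda` and `df`):

```r
library(FDboost)

# dat: list with response matrix Y (n x G), scalar covariate x,
# and time grid `time` of length G (all hypothetical)
mod <- FDboost(Y ~ bols(x, lambda = 2) %A0% bbs(time, df = 3),
               timeformula = ~ bbs(time, df = 3),
               data = dat)
```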
- New function `clr()` to compute the centered-log-ratio transform and its inverse for density-on-scalar regression in Bayes spaces.
- New dataset `birthDistribution`.
- New vignette illustrating density-on-function regression on the `birthDistribution` data.
- Function `factorize()` added for tensor-product factorization of estimated effects or models.
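A short, hedged sketch of the transform and its inverse, assuming `clr()` takes the function values `f`, integration weights `w`, and an `inverse` flag (the toy density is made up):

```r
library(FDboost)

# toy density evaluated on an equidistant grid (hypothetical data)
g <- seq(-3, 3, length.out = 101)
w <- rep(diff(g)[1], length(g))      # equal integration weights
f <- dnorm(g)
f <- f / sum(f * w)                  # normalize so the density integrates to 1

f_clr  <- clr(f, w = w)                      # clr transform into L^2
f_back <- clr(f_clr, w = w, inverse = TRUE)  # inverse clr, back to the density
```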
- Fix `predict()` for `bsignal()` with `newdata` and the functional covariate given as a numeric matrix, raised in #17.
- Deprecated argument `LINPACK` in `solve()` removed.
- It is now possible to specify several time variables as well as factor time variables in the `timeformula`. This feature is needed for the manifoldboost package.
- The function `stabsel.FDboost()` now uses `applyFolds()` instead of `validateFDboost()` to do cross-validation with recomputation of the smooth offset. This is only relevant for models with a functional response. This will change results if the model contains base-learners like `bbsc()` or `bolsc()`, as `applyFolds()` also recomputes the Z-matrix for those base-learners.
- Adapted functions `integrationWeights()` and `integrationWeightsLeft()` for unsorted time variables.
- Changed code in `predict.FDboost()` such that interaction effects of two functional covariates like `bsignal() %X% bsignal()` can be predicted with new data.
- Adapted FDboost to R 4.0.1 by explicitly using the first entry of `dots$aggregate` (i.e., `dots$aggregate[1] != "sum"`) in `predict.FDboost()`, so that it also works with the default, where `aggregate` is a vector of length 3 and only the first entry is later used via `match.arg()`.
- Deprecated argument `corrected` in `cvrisk()` removed.
- `cvrisk()` has adequate folds by default for a noncyclic fitted FDboostLSS model, see #14.
- Replaced `cBind()` (which is deprecated) with `cbind()`.
- New function `bootstrapCI()` to compute bootstrapped coefficients.
- Added the dataset `emotion`, containing EEG and EMG measures under different experimental conditions.
- With scalar response, `FDboost()` now works with the response as a vector (instead of a 1-row matrix); thus, `fitted()` and `predict()` return a vector.
- `update.FDboost()` now works with a scalar response.
- `FDboost()` works with family `Binomial(type = "glm")`, see #1.
- `applyFolds()` works for factor response, see #7.
- `cvLong()` and `cvMA()` return a matrix for only one resampling fold with `B = 1` (proposed by Almond Stoecker).
- Adapted `FDboost` to `mboost` 2.8-0, which allows for `mstop = 0`.
- Restructured `FDboostLSS()` such that it calls `mboostLSS_fit()` from `gamboostLSS` 2.0-0.
- In `FDboost`, set `options("mboost_indexmin" = +Inf)` to disable internal use of ties in model fitting, as this breaks some methods for models with responses in long format and for models containing `bhistx()`, see #10.
- Deprecated `validateFDboost()`; use `applyFolds()` and `bootstrapCI()` instead.
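The replacement workflow might look like the following sketch (`mod` is a hypothetical fitted `FDboost()` model; the fold specification mirrors sampling on the level of curves):

```r
library(FDboost)

# out-of-bag risk with the smooth offset (and the Z-matrices of
# bbsc()/bolsc()) recomputed in every fold; folds drawn on the curve level
cvr <- applyFolds(mod,
                  folds = cv(rep(1, length(unique(mod$id))),
                             type = "bootstrap"),
                  grid = 1:mstop(mod))
mstop(cvr)           # optimal stopping iteration

# bootstrapped coefficient estimates instead of validateFDboost()
ci <- bootstrapCI(mod)
```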
- Added function `applyFolds()` to compute the optimal stopping iteration.
- Allow for extrapolation in `predict()` with `bbsc()`.
- Fixed a bug in `bolsc()`: correctly use the index in `bolsc()`/`bbsc()`. Previously, each observation was used only once for computing Z.
- Added function `%Xa0%`, which computes a row tensor product of two base-learners where the penalty in one direction is zero.
- Added function `reweightData()`, which computes the data for bootstrap or cross-validation folds.
- Added function `stabsel.FDboost()`, which refits the smooth offset in each fold.
- Added argument `fun` to `validateFDboost()`.
- Added `update.FDboost()`, which overwrites `update.mboost()`.
- `FDboost()` works with `family = Binomial()`.
- Fixed `oobpred` in `validateFDboost()` for irregular response and resampling at the curve level, so that `plot.validateFDboost()` works for that case.
- Fixed scope of formula in `FDboost()`: now the formula given to `mboost()` within `FDboost()` uses the variables in the environment of the formula specified in `FDboost()`.
- `plot.FDboost()` works for more effects, especially for effects like `bolsc() %X% bhistx()`.
- New operator `%A0%` for the Kronecker product of two base-learners with an anisotropic penalty for the special case where `lambda1` or `lambda2` is zero.
- The base-learner `bbsc()` can be used with `center = TRUE` (derived by Almond Stoecker).
- In `FDboostLSS()`, a list of one-sided formulas can be specified for `timeformula`.
- `FDboostLSS()` works with `families = GammaLSS()`.
- Operator `%A%` uses weights in the model call. This only works correctly for weights on the level of `blg1` and `blg2` (same as weights on rows and columns of the response matrix).
- Calls to internal functions of `mboost` are done using `mboost_intern()`; `hyper_olsc()` is based on `hyper_ols()` from `mboost`.
- Changed the operator `%Xc%` for the row tensor product of two scalar covariates. The design matrix of the interaction effect is constrained such that the interaction is centered around the intercept and around the two main effects of the scalar covariates (experimental!). Use, for example, `bols(x1) %Xc% bols(x2)`.
- Changed the operator `%Xc%` for the row tensor product where the sum-to-zero constraint is applied to the design matrix resulting from the row tensor product (experimental!). Specifically, an intercept column is first added, and then the sum-to-zero constraint is applied. Use, for example, `bolsc(x1) %Xc% bolsc(x2)`.
- The functional index `s` is now used as `argvals` in the FPCA conducted within `bfpc()`.
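A hedged sketch of the constrained interaction in a function-on-scalar model (all data objects are hypothetical, as are the `df` values):

```r
library(FDboost)

# Y: n x G response matrix on grid `time`; x1, x2: scalar covariates
mod <- FDboost(Y ~ 1 + bolsc(x1, df = 1) + bolsc(x2, df = 1) +
                 bols(x1) %Xc% bols(x2),   # interaction centered around
                                           # intercept and main effects
               timeformula = ~ bbs(time, df = 3),
               data = dat)
```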
- New operator `%A%` that implies anisotropic penalties for differently specified `df` in the two base-learners.
- No penalty is applied in the direction of `ONEx` in a smooth intercept specified implicitly by `~1`, for example, `bols(ONEx, intercept = FALSE, df = 1) %A% bbs(time)`.
- Effects containing `%A%` or `%O%` are not expanded with the `timeformula`, allowing for different effects over time in the model.
- Added the function `FDboostLSS()` to fit GAMLSS models with functional data using the R package gamboostLSS.
- New operator `%Xc%` for the row tensor product where the sum-to-zero constraint is applied to the design matrix resulting from the row tensor product (experimental!).
- Allowed `newdata` to be a list in `predict.FDboost()` when used with signal base-learners.
- Expanded `coef.FDboost()` so that it works for 3-dimensional tensor products of the form `bhistx() %X% bolsc() %X% bolsc()` (with David Ruegamer).
- Added a new possibility for scalar-on-function regression: if `timeformula = NULL`, no Kronecker product with `1` is used, which changes the penalty (otherwise, the direction of `1` would also be penalized).
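A minimal sketch of this scalar-on-function setup (data objects and parameter values hypothetical):

```r
library(FDboost)

# y: scalar response vector; X: n x S matrix of functional covariate
# observations on the grid s
mod <- FDboost(y ~ bsignal(X, s, knots = 10, df = 3),
               timeformula = NULL,   # no Kronecker product with 1
               data = dat)
```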
- New dependency on the R package gamboostLSS.
- Removed dependency on the R package MASS.
- Used the argument `prediction` in the internal computation of the base-learners (work in progress).
- Throw an error if `timeLab` of the `hmatrix` object in `bhistx()` is not equal to the time variable in `timeformula`.
- In function `FDboost()`, the offset is supplied differently. For a scalar offset, use `offset = "scalar"`. The default remains `offset = NULL`.
- `predict.FDboost()` has a new argument `toFDboost` (logical); `fitted.FDboost()` has the argument `toFDboost` explicitly (not only via `...`).
- New base-learner `bhistx()`, especially suited for effects used with `%X%`, e.g., `bhistx() %X% bolsc()`. `coef.FDboost()` and `plot.FDboost()` now handle effects like `bhistx() %X% bolsc()`.
- For `predict.FDboost()` with `bhistx()` effects and `newdata`, the latest `mboostPatch` is necessary.
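A sketch of the new offset interface (data objects hypothetical; the assumption here is that `toFDboost = TRUE` returns fitted values in FDboost's own response format):

```r
library(FDboost)

# Y: n x G functional response on grid `time`; z: scalar covariate
mod <- FDboost(Y ~ 1 + bolsc(z, df = 2),
               timeformula = ~ bbs(time, df = 3),
               offset = "scalar",    # constant offset instead of smooth offset
               data = dat)

fitted(mod, toFDboost = TRUE)  # fitted values in FDboost's format
```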
- The check for the necessity of a smooth offset works for missing values in a regular response (spotted by Tore Erdmann).
- Internal experimental version.
- `integrationWeights()` now gives equal weights for regular grids.
- New base-learner `bfpc()` for a functional covariate where both the functional covariate and the coefficient are expanded using fPCA (experimental feature!). Only works for a regularly observed functional covariate.
- `coef.FDboost()` only works for `bhist()` if the time variable is the same in the `timeformula` and in `bhist()`.
- `predict.FDboost()` now checks that only `type = "link"` can be predicted for `newdata`.
- Changed the default difference penalties to first-order differences (`differences = 1`), improving identifiability.
- New method `cvrisk.FDboost()` that uses (by default) sampling on the level of curves, which is important for functional responses.
- Reorganized documentation of `cvrisk()` and `validateFDboost()`.
- In `bhist()`, an effect can be standardized.
- Added a `CITATION` file.
- Uses `mboost` 2.4-2, which exports all important functions.
- The `main` argument is always passed in `plot.FDboost()`.
- `bhist()` and `bconcurrent()` now work for equal `time` and `s`.
- `predict.FDboost()` works with tensor-product base-learners like `bl1 %X% bl2`.