# kindling 0.3.0

CRAN release: 2026-03-03

## New experimental functions
- Generalized `nn_module()` expression generator: `nn_module_generator()` generates a `torch::nn_module()` expression for sequential NN architectures.
  - For example, this is how you generate an `nn_module()` for a 1D CNN (convolutional neural network) with 3 hidden layers:

    ```r
    nn_module_generator(
      nn_name = "CNN1DClassifier",
      nn_layer = "nn_conv1d",
      layer_arg_fn = ~ if (.is_output) {
        list(.in, .out)
      } else {
        list(
          in_channels = .in,
          out_channels = .out,
          kernel_size = 3L,
          stride = 1L,
          padding = 1L
        )
      },
      after_output_transform = ~ .$mean(dim = 2),
      last_layer_args = list(kernel_size = 1, stride = 2),
      hd_neurons = c(16, 32, 64),
      no_x = 1,
      no_y = 10,
      activations = "relu"
    )
    ```
- `train_nn()` to execute `nn_module_generator()`.
  - `nn_arch()` must be supplied to inherit extra arguments from the `nn_module_generator()` function.
  - Allows early stopping if `early_stopping` is supplied with `early_stop()`.
  - Supports several data types: `matrix`, `data.frame`, `dataset` (torch dataset), and a formula interface.
  - `train_nnsnip()` is now provided to bridge `train_nn()` with tidymodels.
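  A minimal usage sketch tying these pieces together. Apart from `nn_arch` and `early_stopping`, which this release documents, the remaining argument names (`data`, and passing a formula as the first argument) are assumptions based on the formula interface mentioned above, not the documented signature:

    ```r
    # Illustrative sketch only: argument names other than `nn_arch` and
    # `early_stopping` are assumed, not taken from the documentation.
    fit <- train_nn(
      Species ~ .,                    # formula interface
      data = iris,                    # data.frame input is supported
      nn_arch = nn_arch(),            # inherits extra nn_module_generator() arguments
      early_stopping = early_stop()   # enables early stopping
    )
    ```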
- You can supply a customized activation function to `act_funs()` with `new_act_fn()`.
  - Intended for activation functions that do not exist in `torch::nnf_*()`.
  - Supply the argument with a function; the function supplied to `new_act_fn()` must return a `torch` tensor object.
  - Example: `act_funs(new_act_fn(torch::torch_tanh))` or `act_funs(new_act_fn(\(x) torch::torch_tanh(x)))`.
  - Use `.name` as the displayed name of the custom activation function.
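  For instance, an activation absent from `torch::nnf_*()` could be registered like this (a sketch; that `.name` is passed to `new_act_fn()` itself is an assumption based on the note above):

    ```r
    # Sketch: a "mish"-style activation, which has no torch::nnf_*() counterpart.
    # The function returns a torch tensor, as new_act_fn() requires.
    mish <- function(x) x * torch::torch_tanh(torch::nnf_softplus(x))

    act_funs(new_act_fn(mish, .name = "mish"))
    ```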
## Superset

- `act_funs()` as a DSL function now supports index-style parameter specification for parametric activation functions.
  - Activation functions can now be modified using `[` syntax (e.g. `softplus[beta = 0.2]`).
  - The current `args()` style (e.g. `softplus = args(beta = 0.2)`) is now superseded.
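  The two styles side by side, using the examples above:

    ```r
    act_funs(softplus = args(beta = 0.2))  # superseded args() style
    act_funs(softplus[beta = 0.2])         # new index-style specification
    ```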
## Bug fixes

- No suffix was generated for `13` by `ordinal_gen()`. Now fixed.
- `hd_neurons` for both `ffnn_generator()` and `rnn_generator()` now accepts empty arguments, which implies that no hidden layers are applied.
# kindling 0.2.0

CRAN release: 2026-02-04

## New features
- Added regularization support for neural network models.
  - L1 regularization (lasso) for feature selection via `mixture = 1`.
  - L2 regularization (ridge) for weight decay via `mixture = 0`.
  - Elastic net combining the L1 and L2 penalties via `0 < mixture < 1`.
  - Controlled via the `penalty` (regularization strength) and `mixture` (L1/L2 balance) parameters.
  - Follows tidymodels conventions for consistency with `glmnet` and other packages.
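  As a sketch of what these two parameters control, here is the elastic-net penalty in the standard `glmnet` parameterization (that kindling uses exactly this form of the penalty term is an assumption):

    ```r
    # Standard glmnet-style elastic-net penalty on a weight vector w:
    #   penalty * ((1 - mixture) / 2 * sum(w^2) + mixture * sum(abs(w)))
    elastic_net_penalty <- function(w, penalty, mixture) {
      penalty * ((1 - mixture) / 2 * sum(w^2) + mixture * sum(abs(w)))
    }

    elastic_net_penalty(c(0.5, -0.25), penalty = 0.1, mixture = 1)  # pure L1: 0.075
    elastic_net_penalty(c(0.5, -0.25), penalty = 0.1, mixture = 0)  # pure L2 (ridge)
    ```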
- `n_hlayers()` now fully supports tuning the number of hidden layers.
- `hidden_neurons()` gains support for discrete values via the `disc_values` argument.
  - e.g. `disc_values = c(32L, 64L, 128L, 256L)` is now allowed.
  - This allows tuning over specific common hidden unit sizes instead of (or in addition to) a continuous range.
## Implementation fixes

- Tuning methods and `grid_depth()` are now fixed.
  - The parameter space for the number of hidden layers is now fixed and active.
  - Corrected parameter space handling for `n_hlayers` (no more invalid sampling when `x > 1`).
  - Uses `tidyr::expand_grid()`, not `purrr::cross*()`.
  - Fixed randomization of the parameter space, which produced `NA`s outside of kindling's own 'dials'.
  - No more list columns when `n_hlayers = 1`.
- The supported models now use `hardhat::mold()` instead of `model.frame()` and `model.matrix()`.
## Documentation

- Added a vignette to showcase the comparison with other similar packages.
- The package description has been clarified.
- The `hidden_neurons` parameter now supports discrete values specification.
  - Users can specify exact neuron counts via the `values` parameter (e.g., `hidden_neurons(values = c(32, 64, 128))`).
  - Maintains backward compatibility with range-based parameters (e.g., `hidden_neurons(range = c(8L, 512L))` / `hidden_neurons(c(8L, 512L))`).
- Added `\value` documentation to `kindling-nn-wrappers` for CRAN compliance.
- Documented argument handling and list-column unwrapping in tidymodels wrapper functions.
- Clarified the relationship between `grid_depth()` and wrapper functions.
# kindling 0.1.0

CRAN release: 2026-01-31
- Initial CRAN release
- Higher-level interface for torch package to define, train, and tune neural networks
- Support for feedforward (multi-layer perceptron) and recurrent networks (RNN, LSTM, GRU)
- Integration with tidymodels ecosystem (parsnip, workflows, recipes, tuning)
- Variable importance plots and network visualization tools
