Introduction
kindling bridges the gap between torch and tidymodels, providing a streamlined interface for building, training, and tuning deep learning models. This vignette walks you through basic usage.
Installation
You can install kindling from CRAN:
install.packages('kindling')

Or install the development version from GitHub:
# install.packages("pak")
pak::pak("joshuamarie/kindling")
## devtools::install_github("joshuamarie/kindling")

Before using {kindling}
Before starting, you need to install LibTorch, the backend of PyTorch, which is also the backend of the torch R package:
torch::install_torch()
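If you are not sure whether LibTorch is already set up, you can check with torch's own helper (plain torch functionality, not part of kindling):

# Returns TRUE once LibTorch has been downloaded and installed
torch::torch_is_installed()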
Four Levels of Interaction

kindling offers flexibility through four levels of abstraction:
- Code Generation - Generate raw torch::nn_module code
- Direct Training - Train models with simple function calls
- tidymodels Integration - Use with parsnip, recipes, and workflows
- Hyperparameter Tuning - Optimize models with tune and dials
Level 1: Code Generation
Generate PyTorch-style module code:
ffnn_generator(
    nn_name = "MyNetwork",
    hd_neurons = c(64, 32), # two hidden layers with 64 and 32 neurons
    no_x = 10,              # number of input features
    no_y = 1,               # number of output units
    activations = 'relu'
)
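For comparison, a hand-written torch::nn_module with the same architecture looks roughly like the sketch below. This is ordinary torch code, not necessarily the exact output of ffnn_generator:

library(torch)

MyNetwork = nn_module(
    "MyNetwork",
    initialize = function() {
        # 10 inputs -> 64 -> 32 -> 1 output
        self$fc1 = nn_linear(10, 64)
        self$fc2 = nn_linear(64, 32)
        self$fc3 = nn_linear(32, 1)
    },
    forward = function(x) {
        x |> self$fc1() |> nnf_relu() |> self$fc2() |> nnf_relu() |> self$fc3()
    }
)

Level 1 is useful when you want kindling to scaffold the module but plan to edit or extend the torch code yourself.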
Level 3: tidymodels Integration

Work with neural networks like any other parsnip model:
box::use(
parsnip[fit, augment],
yardstick[metrics]
)
nn_spec = mlp_kindling(
mode = "classification",
hidden_neurons = c(10, 7),
activations = act_funs(relu, softshrink = args(lambd = 0.5)),
epochs = 100
)
nn_fit = fit(nn_spec, Species ~ ., data = iris)
augment(nn_fit, new_data = iris) |>
    metrics(truth = Species, estimate = .pred_class)
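Beyond the overall metrics, you can drill into per-class behavior with a confusion matrix. This is standard yardstick usage on the augmented predictions, nothing kindling-specific:

box::use(yardstick[conf_mat])

# Cross-tabulate predicted vs. observed species
augment(nn_fit, new_data = iris) |>
    conf_mat(truth = Species, estimate = .pred_class)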
Learn More

- Visit the package website: https://kindling.joshuamarie.com
- Report issues: https://github.com/joshuamarie/kindling/issues
