Introduction
kindling bridges the gap between torch and tidymodels, providing a streamlined interface for building, training, and tuning deep learning models. This vignette will guide you through the basic usage.
Installation
# Install pak first if needed: install.packages("pak")
# Install the development version from GitHub
pak::pak("joshuamarie/kindling")

Four Levels of Interaction
kindling offers flexibility through four levels of abstraction:
1. Code Generation - Generate raw torch::nn_module code
2. Direct Training - Train models with simple function calls
3. tidymodels Integration - Use with parsnip, recipes, and workflows
4. Hyperparameter Tuning - Optimize models with tune and dials
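Level 2 (direct training) is listed here but not demonstrated in this vignette. The following is a purely hypothetical sketch of what a simple-function-call workflow could look like; the function name ffnn() and all of its arguments are assumptions, not documented kindling API, so consult the function reference for the actual entry point:

```r
# HYPOTHETICAL SKETCH: ffnn() is an assumed name for a direct-training
# helper; verify the real function and arguments in kindling's reference
fit_direct = ffnn(
  Species ~ ., data = iris,
  hidden_neurons = c(10, 7),   # same layer sizes as the Level 3 example below
  activations = 'relu',
  epochs = 100
)
```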
Level 1: Code Generation

Generate PyTorch-style module code:

ffnn_generator(
  nn_name = "MyNetwork",
  hd_neurons = c(64, 32),
  no_x = 10,
  no_y = 1,
  activations = 'relu'
)

Level 3: tidymodels Integration
Work with neural networks like any other parsnip model:
box::use(
  parsnip[fit, augment],
  yardstick[metrics]
)

nn_spec = mlp_kindling(
  mode = "classification",
  hidden_neurons = c(10, 7),
  activations = act_funs(relu, softshrink = args(lambd = 0.5)),
  epochs = 100
)

nn_fit = fit(nn_spec, Species ~ ., data = iris)

augment(nn_fit, new_data = iris) |>
  metrics(truth = Species, estimate = .pred_class)

Learn More
- Read the README for comprehensive examples
- Browse the function reference
- Visit the blog for tutorials
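Level 4 (hyperparameter tuning) is listed above but not shown here. A hedged sketch of how it could look, assuming mlp_kindling registers hidden_neurons and epochs as tunable parameters (an assumption to verify against the function reference); the tune and rsample calls themselves are standard tidymodels:

```r
box::use(
  tune[tune, tune_grid],
  rsample[vfold_cv]
)

# Mark parameters for tuning; whether kindling exposes these as tunable
# is an assumption to check against its documentation
tune_spec = mlp_kindling(
  mode = "classification",
  hidden_neurons = tune(),
  epochs = tune()
)

# Standard tidymodels grid search over cross-validation folds
folds = vfold_cv(iris, v = 5)
tune_res = tune_grid(tune_spec, Species ~ ., resamples = folds, grid = 10)
```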
