
Commit 8245d76 (1 parent: a8abf78)

Improving functions documentation

3 files changed: 0 additions, 19 deletions

R/create_keras_spec_helpers.R (0 additions, 2 deletions)

@@ -23,8 +23,6 @@
 #' @param layer_blocks A named list of functions defining Keras layer blocks.
 #' @param functional A logical. If `TRUE`, uses discovery logic for the
 #' Functional API. If `FALSE`, uses logic for the Sequential API.
-#' @param global_args A character vector of global arguments to add to the
-#' specification (e.g., "epochs").
 #' @return A list containing two elements:
 #'
 #' @noRd

R/generic_functional_fit.R (0 additions, 9 deletions)

@@ -36,15 +36,6 @@
 #' @param y A vector of outcomes.
 #' @param layer_blocks A named list of layer block functions. This is passed
 #' internally from the `parsnip` model specification.
-#' @param epochs An integer for the number of training iterations.
-#' @param learn_rate A double for the learning rate, used to configure the
-#' default Adam optimizer.
-#' @param batch_size An integer for the number of samples per gradient update.
-#' This is a tunable parameter and is passed to `keras3::fit()`.
-#' @param validation_split The proportion of the training data to use for the
-#' validation set.
-#' @param verbose An integer for the verbosity of the fitting process (0, 1, or
-#' 2).
 #' @param ... Additional arguments passed down from the model specification.
 #' These can include:
 #' \itemize{

R/generic_sequential_fit.R (0 additions, 8 deletions)

@@ -32,14 +32,6 @@
 #' @param y A vector of outcomes.
 #' @param layer_blocks A named list of layer block functions. This is passed
 #' internally from the `parsnip` model specification.
-#' @param epochs An integer for the number of training iterations.
-#' @param learn_rate A double for the learning rate, used to configure the
-#' default Adam optimizer.
-#' @param batch_size An integer for the number of samples per gradient update.
-#' This is a tunable parameter and is passed to `keras3::fit()`.
-#' @param validation_split The proportion of the training data to use for
-#' the validation set.
-#' @param verbose An integer for the verbosity of the fitting process (0, 1, or 2).
 #' @param ... Additional arguments passed down from the model specification. These
 #' can include:
 #' \itemize{
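The deleted roxygen entries (`epochs`, `learn_rate`, `batch_size`, `validation_split`, `verbose`) are consistent with the retained `@param ...` text, which says these values now arrive through `...` from the `parsnip` model specification. A minimal R sketch of that pattern; the function body, the `%||%` helper, and the default values here are illustrative assumptions, not code from this commit:

```r
# Null-coalescing helper: return `b` when `a` is NULL (assumed; some
# packages import this operator from rlang instead of defining it).
`%||%` <- function(a, b) if (is.null(a)) b else a

# Hypothetical fitting function: training arguments are no longer named
# parameters but are pulled out of `...` as documented above.
generic_sequential_fit <- function(x, y, layer_blocks, ...) {
  args <- list(...)
  # Fall back to illustrative defaults when the spec supplies no value.
  epochs     <- args$epochs     %||% 10L
  learn_rate <- args$learn_rate %||% 0.001
  verbose    <- args$verbose    %||% 0L
  # ...model building and the call to keras3::fit() would go here...
  list(epochs = epochs, learn_rate = learn_rate, verbose = verbose)
}
```

With this shape, documentation for the individual training arguments can live once under `@param ...` instead of being repeated in every fitting function, which matches the deletions in all three files.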

0 commit comments