Rstg: STG: Feature Selection Using STochastic Gates

'STG' is a method for feature selection in neural networks. The procedure is based on a probabilistic relaxation of the l0 norm of the features, i.e., the count of selected features. The framework simultaneously learns a nonlinear regression or classification function while selecting a small subset of features. Read more: Yamada et al. (2020).
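To illustrate the idea behind the probabilistic l0 relaxation, here is a minimal NumPy sketch of Gaussian-based stochastic gates. This is not the Rstg/stg API; the function names and the choice sigma = 0.5 are illustrative assumptions. Each feature d gets a learnable gate mean mu_d; a noisy gate z_d = clip(mu_d + eps, 0, 1) multiplies the feature during training, and the expected number of open gates serves as a differentiable surrogate for the l0 norm.

```python
import numpy as np
from math import erf

def stochastic_gates(mu, sigma=0.5, rng=None):
    # Sample relaxed Bernoulli gates: z_d = clip(mu_d + eps_d, 0, 1),
    # with eps_d ~ N(0, sigma^2). Features are multiplied by z in training.
    rng = rng or np.random.default_rng(0)
    eps = rng.normal(0.0, sigma, size=np.shape(mu))
    return np.clip(mu + eps, 0.0, 1.0)

def expected_l0(mu, sigma=0.5):
    # Probabilistic relaxation of the l0 norm: the expected number of
    # open gates is sum_d P(z_d > 0) = sum_d Phi(mu_d / sigma),
    # where Phi is the standard normal CDF. This term is added to the
    # regression/classification loss as a sparsity penalty.
    phi = 0.5 * (1.0 + np.vectorize(erf)(mu / (sigma * np.sqrt(2.0))))
    return float(phi.sum())

mu = np.array([1.0, 0.0, -1.0])   # learnable gate means, one per feature
z = stochastic_gates(mu)          # sampled gates in [0, 1]
reg = expected_l0(mu)             # differentiable surrogate for ||z||_0
```

In practice mu is optimized jointly with the network weights by gradient descent, so gates whose features do not reduce the loss are pushed below zero and close.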

Version: 0.0.1
Imports: reticulate (≥ 1.4)
Published: 2021-12-13
Author: Yutaro Yamada [aut, cre]
Maintainer: Yutaro Yamada <yutaro.yamada at>
License: MIT + file LICENSE
NeedsCompilation: no
Materials: NEWS
CRAN checks: Rstg results


Reference manual: Rstg.pdf


Package source: Rstg_0.0.1.tar.gz
Windows binaries: r-devel:, r-release:, r-oldrel:
macOS binaries: r-release (arm64): Rstg_0.0.1.tgz, r-oldrel (arm64): Rstg_0.0.1.tgz, r-release (x86_64): Rstg_0.0.1.tgz, r-oldrel (x86_64): Rstg_0.0.1.tgz


Please use the canonical form to link to this page.