author    | Birte Kristina Friesel <birte.friesel@uos.de> | 2024-03-06 16:27:19 +0100
committer | Birte Kristina Friesel <birte.friesel@uos.de> | 2024-03-06 16:27:19 +0100
commit    | 29d10d5dd6c08bcafc7c34c48b8db599fcbd7e49 (patch)
tree      | 8de6f67ffed23da7e99be12c87a2c28bdfedad0a /README.md
parent    | 89568c6b4b9a35612c794431d551bc0cc638e46d (diff)
Add LightGBM support
Diffstat (limited to 'README.md')
-rw-r--r-- | README.md | 13 |
1 file changed, 12 insertions, 1 deletion
@@ -45,6 +45,7 @@ dfatool supports six types of performance models:

* CART: Regression Trees
* DECART: Regression Trees with exclusively binary features/parameters
+* LightGBM: Regression Forests
* XGB: Regression Forests
* LMT: Linear Model Trees
* RMT: [Regression Model Trees](https://ess.cs.uos.de/static/papers/Friesel-2022-CPSIoTBench.pdf) with [non-binary nodes](https://ess.cs.uos.de/static/papers/Friesel-2022-CAIN.pdf)
@@ -111,10 +112,20 @@ The following variables may be set to alter the behaviour of dfatool components.
| `DFATOOL_KCONF_WITH_CHOICE_NODES` | 0, **1** | Treat kconfig choices (e.g. "choice Model → MobileNet / ResNet / Inception") as enum parameters. If enabled, the corresponding boolean kconfig variables (e.g. "Model\_MobileNet") are not converted to parameters. If disabled, all (and only) boolean kconfig variables are treated as parameters. Mostly relevant for analyze-kconfig, eval-kconfig |
| `DFATOOL_COMPENSATE_DRIFT` | **0**, 1 | Perform drift compensation for loaders without sync input (e.g. EnergyTrace or Keysight) |
| `DFATOOL_DRIFT_COMPENSATION_PENALTY` | 0 .. 100 (default: majority vote over several penalties) | Specify penalty for ruptures.py PELT changepoint petection |
-| `DFATOOL_MODEL` | cart, decart, fol, lmt, **rmt**, symreg, xgb | Modeling method. See below for method-specific configuration options. |
+| `DFATOOL_MODEL` | cart, decart, fol, lgbm, lmt, **rmt**, symreg, xgb | Modeling method. See below for method-specific configuration options. |
| `DFATOOL_RMT_SUBMODEL` | cart, fol, static, symreg, **uls** | Modeling method for RMT leaf functions. |
| `DFATOOL_RMT_ENABLED` | 0, **1** | Use decision trees in get\_fitted |
| `DFATOOL_CART_MAX_DEPTH` | **0** .. *n* | maximum depth for sklearn CART. Default (0): unlimited. |
+| `DFATOOL_LGBM_BOOSTER` | **gbdt**, dart, rf | Boosting type. |
+| `DFATOOL_LGBM_N_ESTIMATORS` | .., **100**, .. | Number of estimators. |
+| `DFATOOL_LGBM_MAX_DEPTH` | **-1**, .., *n* | Maximum tree depth, unlimited if ≤ 0. |
+| `DFATOOL_LGBM_NUM_LEAVES` | .., **31**, .. | Maximum number of leaves per tree. |
+| `DFATOOL_LGBM_SUBSAMPLE` | 0.0 .. **1.0** | Subsampling ratio. |
+| `DFATOOL_LGBM_LEARNING_RATE` | 0 .. **0.1** .. 1 | Learning rate. |
+| `DFATOOL_LGBM_MIN_SPLIT_GAIN` | **0.0** .. 1 | Minimum loss reduction required for a split. |
+| `DFATOOL_LGBM_MIN_CHILD_SAMPLES` | .., **20**, .. | Minimum samples that each leaf of a split candidate must contain. |
+| `DFATOOL_LGBM_REG_ALPHA` | **0.0** .. *n* | L1 regularization term on weights. |
+| `DFATOOL_LGBM_REG_LAMBDA` | **0.0** .. *n* | L2 regularization term on weights. |
| `DFATOOL_LMT_MAX_DEPTH` | **5** .. 20 | Maximum depth for LMT. |
| `DFATOOL_LMT_MIN_SAMPLES_SPLIT` | 0.0 .. 1.0, **6** .. *n* | Minimum samples required to still perform an LMT split. A value below 1.0 sets the specified ratio of the total number of training samples as minimum. |
| `DFATOOL_LMT_MIN_SAMPLES_LEAF` | 0.0 .. **0.1** .. 1.0, 3 .. *n* | Minimum samples that each leaf of a split candidate must contain. A value below 1.0 specifies a ratio of the total number of training samples. A value above 1 specifies an absolute number of samples. |
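Since the new options added by this commit are ordinary environment variables, a LightGBM regression forest can be selected and tuned straight from the shell. The sketch below is illustrative only: the `analyze-kconfig` invocation and its `<kconfig-file>`/`<data-dir>` arguments are placeholders rather than part of this commit; only the `DFATOOL_*` variables come from the table above.

```sh
# Illustrative only: pick LightGBM instead of the default RMT and override a
# few of its hyper-parameters. The script path and positional arguments are
# placeholders; the DFATOOL_* variables are the ones documented above.
DFATOOL_MODEL=lgbm \
DFATOOL_LGBM_BOOSTER=gbdt \
DFATOOL_LGBM_N_ESTIMATORS=200 \
DFATOOL_LGBM_NUM_LEAVES=63 \
DFATOOL_LGBM_LEARNING_RATE=0.05 \
    bin/analyze-kconfig.py <kconfig-file> <data-dir>
```

Variables that are left unset presumably fall back to the bold default values listed in the table (e.g. `gbdt`, 100 estimators, 31 leaves).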