author	Birte Kristina Friesel <birte.friesel@uos.de>	2024-02-21 11:46:06 +0100
committer	Birte Kristina Friesel <birte.friesel@uos.de>	2024-02-21 11:46:06 +0100
commit	5d83f255f05c3b74df0ace1f70b260959b392eca (patch)
tree	19722b4889911cd3893990531fd62732f937e719 /doc
parent	c3dbe93034bdeff9dba534d29b04daa527d70241 (diff)
update documentation and examples
Diffstat (limited to 'doc')
-rw-r--r--	doc/analysis-nfp.md	2
-rw-r--r--	doc/modeling-method.md	26
2 files changed, 16 insertions(+), 12 deletions(-)
diff --git a/doc/analysis-nfp.md b/doc/analysis-nfp.md
index 877ac2a..5221c55 100644
--- a/doc/analysis-nfp.md
+++ b/doc/analysis-nfp.md
@@ -8,7 +8,7 @@ Classification and Regression Trees (CART) are capable of generating accurate mo
Hence, after loading a CART model into kconfig-webconf, only a small subset of busybox features will be annotated with NFP deltas.
```
-DFATOOL_DTREE_SKLEARN_CART=1 DFATOOL_PARAM_CATEGORICAL_TO_SCALAR=1 DFATOOL_KCONF_WITH_CHOICE_NODES=0 .../dfatool/bin/analyze-kconfig.py --export-webconf busybox.json --force-tree ../busybox-1.35.0/Kconfig .
+DFATOOL_MODEL=cart DFATOOL_PARAM_CATEGORICAL_TO_SCALAR=1 DFATOOL_KCONF_WITH_CHOICE_NODES=0 .../dfatool/bin/analyze-kconfig.py --export-webconf busybox.json --force-tree ../busybox-1.35.0/Kconfig .
```
Refer to the [kconfig-webconf README](https://ess.cs.uos.de/git/software/kconfig-webconf/-/blob/master/README.md#user-content-performance-aware-configuration) for details on using the generated model.
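The CART export switched to above is based on sklearn's decision-tree regression. A minimal sketch, independent of dfatool and using sklearn directly, of what such a tree learns from scalar-encoded configuration data (the feature values and NFP deltas below are invented for illustration):

```python
# Sketch of CART ("Decision Tree Regression"), the technique behind
# DFATOOL_MODEL=cart. Feature rows are numeric encodings of configuration
# options; categorical parameters must first be converted to scalars
# (cf. DFATOOL_PARAM_CATEGORICAL_TO_SCALAR=1). Data is made up.
from sklearn.tree import DecisionTreeRegressor

X = [[0, 1], [0, 0], [1, 1], [1, 0]]   # two binary features per sample
y = [100.0, 80.0, 150.0, 120.0]        # e.g. ROM size deltas in bytes

# Binary splits on scalar variables, one predicted constant per leaf.
tree = DecisionTreeRegressor(max_depth=2).fit(X, y)
print(tree.predict([[1, 1]])[0])  # → 150.0 (tree fits the 4 samples exactly)
```

With four distinct samples and depth 2, each sample lands in its own leaf, so the prediction reproduces the training value.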
diff --git a/doc/modeling-method.md b/doc/modeling-method.md
index e4865d9..58fe03b 100644
--- a/doc/modeling-method.md
+++ b/doc/modeling-method.md
@@ -1,27 +1,22 @@
# Modeling Method Selection
+Set `DFATOOL_MODEL` to an appropriate value, e.g. `DFATOOL_MODEL=cart`.
+
## CART (Regression Trees)
-Enable these with `DFATOOL_DTREE_SKLEARN_CART=1` and `--force-tree`.
+sklearn CART ("Decision Tree Regression") algorithm. Uses binary nodes and supports splits on scalar variables.
### Related Options
* `DFATOOL_PARAM_CATEGORICAL_TO_SCALAR=1` converts categorical parameters (which are not supported by CART) to numeric ones.
-## XGB (Gradient-Boosted Forests / eXtreme Gradient boosting)
+## DECART (Regression Trees)
-Enable these with `DFATOOL_USE_XGBOOST=1` and `--force-tree`.
-You should also specify `DFATOOL_XGB_N_ESTIMATORS`, `DFATOOL_XGB_MAX_DEPTH`, and possibly `OMP_NUM_THREADS`.
-
-### Related Options
-
-* `DFATOOL_PARAM_CATEGORICAL_TO_SCALAR=1` converts categorical parameters (which are not supported by XGB) to numeric ones.
-* Anything prefixed with `DFATOOL_XGB_`.
+sklearn CART ("Decision Tree Regression") algorithm. Ignores scalar parameters, thus emulating the DECART algorithm.
## LMT (Linear Model Trees)
-Enable these with `DFATOOL_DTREE_LMT=1` and `--force-tree`.
-They always use a maximum depth of 20.
+[Linear Model Tree](https://github.com/cerlymarco/linear-tree) algorithm. Uses binary nodes and linear functions.
### Related Options
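The LMT description added in the hunk above (binary nodes with linear leaf functions) can be illustrated with a hand-rolled sketch. The actual implementation uses the linear-tree package linked in the doc; the split point and data here are invented:

```python
# Conceptual sketch of a Linear Model Tree: one binary split node,
# with a least-squares linear model fitted per leaf. The split at
# x = 5 and the piecewise-linear data are invented for illustration.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 10.0, 11.0, 12.0, 13.0])
y = np.where(x < 5, 2 * x + 1, 5 * x - 10)  # piecewise-linear NFP

split = 5.0
left = np.polyfit(x[x < split], y[x < split], 1)    # slope, intercept
right = np.polyfit(x[x >= split], y[x >= split], 1)

def predict(v):
    a, b = left if v < split else right
    return a * v + b

print(round(predict(2.5), 6))  # left leaf: 2 * 2.5 + 1 = 6.0
```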
@@ -51,6 +46,15 @@ All of these are valid regression model trees.
* `DFATOOL_ULS_SKIP_CODEPENDENT_CHECK=1`
* `DFATOOL_REGRESSION_SAFE_FUNCTIONS=1`
+## XGB (Gradient-Boosted Forests / eXtreme Gradient Boosting)
+
+You should also specify `DFATOOL_XGB_N_ESTIMATORS`, `DFATOOL_XGB_MAX_DEPTH`, and possibly `OMP_NUM_THREADS`.
+
+### Related Options
+
+* `DFATOOL_PARAM_CATEGORICAL_TO_SCALAR=1` converts categorical parameters (which are not supported by XGB) to numeric ones.
+* Anything prefixed with `DFATOOL_XGB_`.
+
## Least-Squares Regression
If dfatool determines that there is no need for a tree structure, or if `DFATOOL_DTREE_ENABLED=0` has been set, it will go straight to least-squares regression.
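The least-squares fallback mentioned here can be sketched with NumPy. The model function and data are invented; dfatool's actual function-template selection differs:

```python
# Ordinary least-squares fit of y = beta0 + beta1 * n, the kind of
# flat (tree-free) regression dfatool falls back to. Data is invented
# and noise-free for illustration.
import numpy as np

n = np.array([1.0, 2.0, 3.0, 4.0])  # e.g. a scalar parameter value
y = 3.0 + 2.0 * n                   # e.g. an observed NFP

# Design matrix: intercept column plus the parameter column.
A = np.column_stack([np.ones_like(n), n])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
print(beta.round(6))  # recovers the coefficients [3. 2.]
```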