Our use cases present the results of xtractis modeling across many application sectors and include complete benchmarks against Neural Networks, Boosted Trees, Random Forests, and Logistic Regression where relevant.

Click on a use case's header to open the article and download the full document.

These studies illustrate the ability of xtractis to automatically induce knowledge, in the form of predictive and intelligible mathematical relationships, from real-world data (public datasets or authorized private data).


Genetic Diagnosis of Prostate Cancer

How to make an automated, yet fully transparent, medical diagnosis of prostate cancer from the genetic sequencing of prostate tissue?

INTELLIGIBILITY OF THE DECISION SYSTEM:

ALGO | SCORE | STRUCTURE
XTRACTIS | + + + | 4 unchained gradual rules, each rule using some of the 7 variables automatically identified as predictors. Few rules triggered at a time (see the sketch below).
LoR (Logistic Regression) | - | 120 predictors; 1 linear equation with 120 coefficients.
RANDOM FOREST | - | 19 predictors, 15 trees, 50 binary rules. Many predictors and binary rules.
BOOSTED TREES | - - | 24 predictors, 14 chained trees, 48 binary rules. Tree #N corrects the error of the N-1 previous trees.
NEURAL NETWORK | - - - | 12,600 predictors, 1 hidden layer, 13 hidden nodes. Unintelligible synthetic variables.
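xtractis's actual induction engine and the full rule base appear in the downloadable study; the following is only a minimal, hypothetical sketch of what a scorer built from a few unchained gradual (fuzzy) rules can look like, to make the contrast with binary decision trees concrete. All gene names, membership bounds, and rule conclusions below are invented for illustration.

```python
import numpy as np

def tri(x, low, peak, high):
    """Triangular membership: degree rises from 0 at `low` to 1 at `peak`,
    then falls back to 0 at `high` -- gradual, not a binary threshold."""
    return float(np.clip(min((x - low) / (peak - low),
                             (high - x) / (high - peak)), 0.0, 1.0))

def rule_activations(g):
    """Four independent ('unchained') rules, each reading only a few of the
    identified predictors; `g` maps hypothetical gene names to expression
    levels."""
    return np.array([
        min(tri(g["geneA"], 0.2, 0.6, 1.0), tri(g["geneB"], 0.0, 0.3, 0.7)),
        tri(g["geneC"], 0.4, 0.8, 1.2),
        min(tri(g["geneD"], 0.1, 0.5, 0.9), tri(g["geneE"], 0.3, 0.6, 0.9)),
        tri(g["geneF"], 0.0, 0.4, 0.8),
    ])

# Each rule concludes on a tumor-risk level; the final score is the
# activation-weighted mean of the rules that actually fire.
RULE_CONCLUSIONS = np.array([0.9, 0.8, 0.2, 0.1])

def diagnose(g):
    w = rule_activations(g)
    return float(w @ RULE_CONCLUSIONS / w.sum()) if w.sum() > 0 else 0.5
```

Because activations are degrees in [0, 1], a given sample typically fires only a few rules, and each fired rule reads as a complete sentence over named predictors.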

PERFORMANCE OF THE DECISION SYSTEM:

F1-Score on the External Test dataset
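For reference, the F1-Score is the harmonic mean of precision and recall, computed here on test samples held out from training:

$$
F_1 = \frac{2PR}{P + R},
\qquad
P = \frac{TP}{TP + FP},
\qquad
R = \frac{TP}{TP + FN}
$$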


Cardiotocographic Identification of Fetal Heart Pathologies

How to make an automated, yet fully transparent, medical diagnosis of fetal heart disease from signal characteristics of the fetal heart rate and uterine contractions?

INTELLIGIBILITY OF THE DECISION SYSTEM:

ALGO | SCORE | STRUCTURE
XTRACTIS | + + | 56 unchained gradual rules, each rule using some of the 18 variables automatically identified as predictors. Few rules triggered at a time.
LoR (Logistic Regression) | + + | 20 predictors; 10 linear equations with 86 coefficients.
RANDOM FOREST | - - - | 21 predictors, 470 trees, 26,435 binary rules. Many predictors and binary rules.
BOOSTED TREES | - - | 20 predictors, 750 chained trees, 15,932 binary rules. Tree #N corrects the error of the N-1 previous trees (see the sketch below).
NEURAL NETWORK | - - - | 21 predictors, 3 hidden layers, 90 hidden nodes. Unintelligible synthetic variables.
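The note "Tree #N corrects the error of the N-1 previous trees" describes the standard boosting recursion. Here is a minimal least-squares sketch of that mechanism; it is illustrative only, and the benchmarked model's library, loss, and hyperparameters are those given in the study.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def fit_boosted_trees(X, y, n_trees=750, lr=0.1, max_depth=3):
    """Stagewise boosting: each new tree is fit to the residual error
    left by the sum of all previous trees."""
    baseline = y.mean()                      # tree #0: constant prediction
    pred = np.full(len(y), baseline)
    trees = []
    for _ in range(n_trees):
        residual = y - pred                  # error of the N-1 previous trees
        tree = DecisionTreeRegressor(max_depth=max_depth).fit(X, residual)
        pred += lr * tree.predict(X)         # tree #N nudges the ensemble
        trees.append(tree)
    return baseline, trees

def predict(baseline, trees, X, lr=0.1):
    return baseline + lr * sum(t.predict(X) for t in trees)
```

No single tree is meaningful on its own: explaining one prediction means tracing the contribution of all 750 chained trees, hence the low intelligibility score.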

PERFORMANCE OF THE DECISION SYSTEM:

Average F1-Score on the External Test dataset
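With several pathology classes, one F1-Score exists per class; assuming the usual macro-averaging convention, the reported average over K classes is:

$$
\text{Average } F_1 = \frac{1}{K} \sum_{k=1}^{K} F_1^{(k)}
$$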