Basic Workflow#

Objectives: what you will take away#

  • Definitions and an understanding of the terminology unique to the Howso Engine, as well as what the basic workflow of using the Howso Engine looks like.

  • How to import data, map features, train, analyze, and make inferences using the Howso Engine.

Prerequisites: before you begin#

Installation

Data#

Our example dataset for this recipe is the well-known Adult dataset. It is accessible via the pmlb package installed earlier. We use the fetch_data() function to retrieve the dataset in Step 1 below.

Concepts & Terminology#

Howso Engine is a generalized Machine Learning (ML) and Artificial Intelligence platform that creates powerful decision-making models that are fully explainable, auditable, and editable. Howso Engine uses Instance-Based Machine Learning, which stores instances, i.e., data points, in memory and makes predictions about new instances based on their relationship to existing instances. This technology harnesses a fast spatial query system and information theory for performance and accuracy.

Notebook Recipe#

The following recipe demonstrates the capabilities covered in this guide, as well as a few additional ones.

How-To Guide#

Here we will walk through the steps of a basic workflow using the Howso Engine. First, we will load data into a pandas DataFrame for use with the Howso Engine. Then, we will use the Howso Engine to map the attributes of the features, train a Trainee, analyze, and react.

[1]:
import pandas as pd
from pmlb import fetch_data

from howso.engine import Trainee
from howso.utilities import infer_feature_attributes

Step 1 - Load Data and Infer Feature Attributes#

First, we load the Adult dataset from the PMLB repository. This dataset consists of 15 features, which will have their attributes inferred by infer_feature_attributes(). This determines attributes of each feature, including bounds, allowed values, and feature type. Before proceeding to the following steps, the inferred feature attributes should be inspected to ensure they are correct.

[2]:
# Fetch the Adult dataset, sample 1,000 rows, and infer each feature's attributes
df = fetch_data('adult').sample(1_000)
features = infer_feature_attributes(df)

features.to_dataframe()
[2]:
                type        decimal_places  bounds.min  bounds.max  bounds.allow_null  bounds.observed_min  bounds.observed_max  data_type  original_type.data_type  original_type.size
age             continuous  0               0.0         137.0       True               17.0                 90.0                 number     numeric                  8
workclass       nominal     0               NaN         NaN         False              NaN                  NaN                  number     integer                  8
fnlwgt          continuous  0               0.0         1314636.0   True               22743.0              806316.0             number     numeric                  8
education       nominal     0               NaN         NaN         False              NaN                  NaN                  number     integer                  8
education-num   continuous  0               0.0         26.0        True               1.0                  16.0                 number     numeric                  8
marital-status  nominal     0               NaN         NaN         False              NaN                  NaN                  number     integer                  8
occupation      nominal     0               NaN         NaN         False              NaN                  NaN                  number     integer                  8
relationship    nominal     0               NaN         NaN         False              NaN                  NaN                  number     integer                  8
race            nominal     0               NaN         NaN         False              NaN                  NaN                  number     integer                  8
sex             nominal     0               NaN         NaN         False              NaN                  NaN                  number     integer                  8
capital-gain    continuous  0               0.0         164870.0    True               0.0                  99999.0              number     numeric                  8
capital-loss    continuous  0               0.0         4219.0      True               0.0                  2559.0               number     numeric                  8
hours-per-week  continuous  0               0.0         161.0       True               3.0                  99.0                 number     numeric                  8
native-country  nominal     0               NaN         NaN         False              NaN                  NaN                  number     integer                  8
target          nominal     0               NaN         NaN         False              NaN                  NaN                  number     integer                  8
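
If any of the inferred attributes do not match your understanding of the data, they can be adjusted before training. The snippet below is a minimal sketch; it assumes the object returned by infer_feature_attributes() can be read and updated with dictionary-style access, and the bound value shown is purely illustrative.

# Inspect the inferred attributes for a single feature (assumes dict-style access)
print(features["age"])

# Illustrative adjustment: tighten the inferred upper bound on "age"
features["age"]["bounds"]["max"] = 100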

Step 2 - Create the Trainee, Train, and Analyze#

A Trainee is similar in function to a model in other machine learning paradigms, but it is not locked to any particular use-case or to predicting a particular feature. Note that both the data and the feature attributes are supplied at this time. Since the feature attributes are essentially a part of the training data, it is extremely important to ensure they are correct.

[3]:
# Create the Trainee using the inferred feature attributes, then train on the data
trainee = Trainee(features=features)
trainee.train(df)

After the data are trained, we can analyze the Trainee with Trainee.analyze(). This method determines the best hyperparameters for the data and caches some important values that are used to ensure the highest model performance. By default, Trainee.analyze() will optimize the Trainee’s parameters for any possible target feature.

[4]:
trainee.analyze()
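
If the Trainee will only be used to predict one particular feature, the analysis can be targeted instead. The following is a minimal sketch; it assumes Trainee.analyze() accepts context_features and action_features parameters, so consult the API reference for the exact signature.

# Targeted analysis: optimize specifically for predicting "target" (sketch)
context_features = [f for f in df.columns if f != "target"]
trainee.analyze(context_features=context_features, action_features=["target"])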

Step 3 - React#

Now that the Trainee has been prepared, it is ready for use. A common use-case, determining how well a model performs when predicting the dataset, can be done with a single call to the Trainee:

[5]:
# Leave-one-out prediction statistics, with "target" as the action (predicted) feature
prediction_stats = trainee.get_prediction_stats(
    action_feature="target",
    details={"prediction_stats": True},
)
prediction_stats
[5]:
statistic  fnlwgt hours-per-week age capital-gain capital-loss education-num marital-status education relationship native-country occupation workclass race target sex
r2 -0.043161 0.155911 0.382566 0.033839 -0.035253 0.894863 NaN NaN NaN NaN NaN NaN NaN NaN NaN
adjusted_smape 43.837959 20.722718 20.882465 110.816386 89.998968 2.437472 NaN NaN NaN NaN NaN NaN NaN NaN NaN
spearman_coeff 0.038152 0.414999 0.605424 0.542163 0.561853 0.946770 NaN NaN NaN NaN NaN NaN NaN NaN NaN
mae 77682.468601 7.773918 8.321691 1694.000332 148.576443 0.184660 0.240613 0.001 0.327466 0.160614 0.768767 0.352813 0.224644 0.202272 0.255334
smape 43.838101 21.037388 21.169289 120.162101 90.637009 2.637091 NaN NaN NaN NaN NaN NaN NaN NaN NaN
mcc NaN NaN NaN NaN NaN NaN 0.747194 1.000 0.660176 0.255553 0.192149 0.412829 0.225002 0.554646 0.434896
precision NaN NaN NaN NaN NaN NaN 0.512916 1.000 0.587408 0.080919 0.222581 0.381527 0.531574 0.797499 0.725968
rmse 104807.663370 11.175091 10.912021 7953.825872 411.722624 0.822677 NaN NaN NaN NaN NaN NaN NaN NaN NaN
recall NaN NaN NaN NaN NaN NaN 0.483188 1.000 0.585990 0.080037 0.210193 0.330203 0.260290 0.758515 0.709249
accuracy NaN NaN NaN NaN NaN NaN 0.829830 1.000 0.753000 0.906883 0.274274 0.760761 0.856000 0.845000 0.757000
missing_value_accuracy NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN
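
Since prediction_stats is displayed above as a pandas DataFrame with statistics as rows and features as columns, individual metrics can be pulled out with ordinary pandas indexing. This is a small usage sketch under that assumption.

# Accuracy of the Trainee when predicting the "target" feature (assumes the
# DataFrame layout shown above: statistics as rows, features as columns)
target_accuracy = prediction_stats.loc["accuracy", "target"]
print(f"Accuracy predicting 'target': {target_accuracy:.3f}")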

An action_feature is the same as a target feature or dependent variable. This call will compute a number of different statistics, including accuracy, precision, recall, \(R^2\), and others. Rather than performing a train-test split, which is common with other machine learning techniques, the Trainee uses leave-one-out to provide a more comprehensive understanding of the data. More traditional approaches can still be used with the Trainee.react() method.
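
For a more traditional workflow, the cases to predict can be passed to Trainee.react() directly. The following is a rough sketch rather than a definitive usage; it assumes react() takes contexts, context_features, and action_features arguments and returns a dictionary whose "action" entry holds the predicted values, so check the API reference for the exact interface.

# Predict the "target" feature for a few cases with react() (sketch)
context_features = [f for f in df.columns if f != "target"]
result = trainee.react(
    contexts=df[context_features].head(5),
    context_features=context_features,
    action_features=["target"],
)
predictions = result["action"]  # assumed to be a DataFrame of predictions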

API References#