10. Compute Feature Table
In this tutorial, you will learn how to generate a Feature Table from the feature lists created in previous tutorials. Once the tables are generated, you will download them and train a LightGBM model to compare their accuracy.
Step 1: Compute Feature Table

Navigate to the Feature List Catalog in the 'Experiment' section of the menu.

For each feature list, follow these steps:

- Click the button to compute a feature table.
- Select the Observation Table Pre_Purchase_Customer_Activity_next_week_2023_10K.
- Confirm the computation.
Step 2: Review Feature Table

- Navigate to the Feature Table Catalog in the 'Experiment' section of the menu.
- Click on the Manual List - Pre_Purchase_Customer_Activity_next_week_2023_10K table.
- Open the 'Preview' tab to examine the table and check the values of individual features.
- Review a dictionary-type feature.
- Review an embedding-type feature.
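A dictionary-type feature holds key-value pairs, such as per-category counts for a customer. As a quick sanity check outside the UI, such a value can be inspected programmatically. This is a minimal sketch; the JSON encoding, category names, and counts shown here are hypothetical, not taken from the actual table:

```python
import json

# Hypothetical dictionary-type feature value (JSON-encoded per-category counts).
raw_value = '{"electronics": 3, "groceries": 7, "clothing": 1}'

parsed = json.loads(raw_value)

# Find the dominant category, i.e. the key with the largest count.
top_category = max(parsed, key=parsed.get)
print(top_category)  # the category with the highest count
```

An embedding-type feature can be checked similarly, for example by confirming the vector has the expected length.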
Step 3: Download Feature Tables

Navigate back to the Feature Table Catalog in the 'Experiment' section of the menu.

For each table, follow these steps:

- Click on the table to open its details.
- Go to the 'About' tab and click the download button to download the table.
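Once downloaded, each feature table can be loaded locally for modeling. A minimal helper is sketched below, assuming the export is a Parquet or CSV file; the function name and the exact file format are assumptions, so adapt them to what you actually downloaded:

```python
from pathlib import Path

import pandas as pd


def load_feature_table(path: str) -> pd.DataFrame:
    """Load a downloaded feature table (assumed Parquet or CSV export)."""
    p = Path(path)
    if p.suffix == ".parquet":
        return pd.read_parquet(p)
    return pd.read_csv(p)
```

Loading every downloaded table this way makes it easy to evaluate them side by side in the next step.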
Step 4: Train LightGBM

To evaluate the performance of your feature lists, download the LightGBM modeling pipeline along with its accompanying notebook, available here. Use the notebook to test the accuracy of each feature list and identify the best-performing one.
In our example, one of the feature lists with the highest AUC score is the one composed of the top feature from each theme. This result supports the following insights:
- Feature organization by source and signal type is crucial for creating effective feature sets.
- A small number of advanced features can deliver strong predictive performance in certain scenarios.