10. Compute Feature Table

In this tutorial, you will learn how to generate a Feature Table from the feature lists created in previous tutorials. Once the tables are generated, you will download them and train a LightGBM model on each one to compare the accuracy of the feature lists.

Step 1: Compute Feature Table

Navigate to the Feature List Catalog in the 'Experiment' section of the menu.

For each feature list, follow these steps (a scripted alternative is sketched after the list):

  1. Click the Compute icon.

  2. Select the Observation Table Pre_Purchase_Customer_Activity_next_week_2023_10K.

  3. Confirm the computation by clicking the Compute button.
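
If you prefer to script this step instead of using the UI, the same computation can be triggered from the FeatureByte Python SDK. The sketch below is illustrative only: the catalog name is a placeholder, "Manual List" stands in for whichever feature list you are computing, and you should confirm method names such as compute_historical_feature_table against the SDK reference for your version.

```python
import featurebyte as fb

# Activate the catalog that holds your feature lists
# ("Grocery" is a placeholder; use your own catalog name).
catalog = fb.Catalog.activate("Grocery")

# Retrieve one of the feature lists and the observation table from the tutorials.
feature_list = fb.FeatureList.get("Manual List")
observation_table = fb.ObservationTable.get(
    "Pre_Purchase_Customer_Activity_next_week_2023_10K"
)

# Materialize the features for every row of the observation table,
# producing the same feature table as the Compute action in the UI.
feature_table = feature_list.compute_historical_feature_table(
    observation_table,
    historical_feature_table_name=(
        "Manual List - Pre_Purchase_Customer_Activity_next_week_2023_10K"
    ),
)
```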

Step 2: Review Feature Table

  1. Navigate to the Feature Table Catalog in the 'Experiment' section of the menu.
  2. Click on the Manual List - Pre_Purchase_Customer_Activity_next_week_2023_10K table.
  3. Open the 'Preview' tab to examine the table and check the values of individual features.
  4. Review a dictionary-type feature.
  5. Review an embedding-type feature. (The sketch after this list illustrates what dictionary and embedding values look like.)
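
For context on those last two steps, dictionary-type features store key-value aggregations, while embedding-type features store fixed-length numeric vectors. The toy pandas snippet below only illustrates those shapes; the column names and values are made up and will differ from your actual features.

```python
import pandas as pd

# Toy preview: one dictionary-type and one embedding-type feature.
# Column names and values are illustrative placeholders.
preview = pd.DataFrame(
    {
        "customer_id": ["a1", "b2"],
        # Dictionary-type feature: keys mapped to aggregated values.
        "spend_per_product_group_14d": [
            {"Fruits": 12.5, "Dairy": 4.0},
            {"Bakery": 7.25},
        ],
        # Embedding-type feature: a fixed-length vector of floats.
        "product_group_embedding": [
            [0.12, -0.03, 0.44],
            [0.01, 0.27, -0.19],
        ],
    }
)

print(preview.dtypes)   # both feature columns show up as generic 'object' columns
print(preview.head())
```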

Step 3: Download Feature Tables

Navigate back to the Feature Table Catalog in the 'Experiment' section of the menu.

For each table, follow these steps:

  1. Click on the table to open its details.

  2. Go to the 'About' tab and click the download icon to download the table. (A sketch for loading the downloaded files locally follows this list.)
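
Once downloaded, the tables can be loaded locally for inspection or for the modeling step that follows. The sketch below assumes each table was saved as a Parquet file named after the feature table; adjust the paths (and the reader, e.g. pd.read_csv) to match the format you actually downloaded.

```python
import pandas as pd

# Placeholder paths; point these at the files you downloaded.
feature_table_files = {
    "Manual List": "Manual List - Pre_Purchase_Customer_Activity_next_week_2023_10K.parquet",
    # Add one entry per downloaded feature table.
}

feature_tables = {
    name: pd.read_parquet(path) for name, path in feature_table_files.items()
}

# Quick sanity check of row counts and columns.
for name, df in feature_tables.items():
    print(name, df.shape)
    print(df.columns.tolist())
```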


Step 4: Train LightGBM

To evaluate the performance of your feature lists, download the LightGBM modeling pipeline and its accompanying notebook here. Use the notebook to test the accuracy of each feature list and identify the best-performing one.
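
If you want to see the shape of that comparison outside the notebook, the sketch below trains a LightGBM classifier on a single downloaded feature table and reports its validation AUC. The file path, target column, and ID columns are placeholders, and only numeric columns are used here; the provided notebook remains the reference for handling dictionary-type and embedding-type features properly.

```python
import lightgbm as lgb
import pandas as pd
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Placeholder path and column names; adapt them to your downloaded table.
df = pd.read_parquet("feature_table.parquet")
target_col = "TARGET"                       # label from the observation table
id_cols = ["customer_id", "POINT_IN_TIME"]  # entity key and point-in-time columns

# Keep only numeric feature columns for this simple baseline;
# dictionary and embedding features need the preprocessing from the notebook.
X = df.drop(columns=[target_col] + id_cols, errors="ignore").select_dtypes("number")
y = df[target_col]

X_train, X_valid, y_train, y_valid = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

model = lgb.LGBMClassifier(n_estimators=500, learning_rate=0.05, random_state=42)
model.fit(X_train, y_train, eval_set=[(X_valid, y_valid)])

auc = roc_auc_score(y_valid, model.predict_proba(X_valid)[:, 1])
print(f"Validation AUC: {auc:.4f}")
```

Repeating this for each downloaded feature table gives the per-list AUC comparison described below.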

In our example, the feature list composed of the top feature from each theme achieves one of the highest AUC scores. This result supports the following insights:

  • Feature organization by source and signal type is crucial for creating effective feature sets.
  • A small number of advanced features can deliver strong predictive performance in certain scenarios.
