Creating reusable features to reduce feature inconsistencies and inference latency
One of the challenges data scientists face is the long data processing time, often hours and sometimes days, required to prepare features for ML training. Additionally, the same data processing steps applied during feature engineering must be applied to inference requests at prediction time, which increases inference latency. Each data science team incurs this processing cost separately, even when different teams build different models from the same raw data. In this section, we will discuss best practices for addressing these challenges with Amazon SageMaker Feature Store.
For use cases that require low-latency features for inference, an online feature store should be configured, and it is generally recommended to enable both the online and offline stores. A feature store enabled with both online and offline stores allows you to reuse the same feature values for model training (from the offline store) and for real-time inference (from the online store), which keeps training and serving features consistent.
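To make this concrete, the following is a minimal sketch of creating a feature group with both the online and offline stores enabled using the SageMaker Python SDK, ingesting features once, and then reading a record back at low latency from the online store. The feature group name, column names, S3 prefix, and IAM role are illustrative placeholders, not values from the text.

```python
import time
import boto3
import pandas as pd
import sagemaker
from sagemaker.feature_store.feature_group import FeatureGroup

session = sagemaker.Session()
role = sagemaker.get_execution_role()  # assumes a SageMaker execution role is available

# Example engineered features; column names are hypothetical
df = pd.DataFrame(
    {
        "customer_id": ["C1", "C2"],
        "total_spend_30d": [120.5, 87.0],
        "event_time": [time.time()] * 2,
    }
)
df["customer_id"] = df["customer_id"].astype("string")  # string dtype required for type inference

feature_group = FeatureGroup(name="customer-features", sagemaker_session=session)
feature_group.load_feature_definitions(data_frame=df)

# Enable both stores: the offline store (S3) for building training datasets,
# the online store for low-latency lookups at inference time
feature_group.create(
    s3_uri=f"s3://{session.default_bucket()}/feature-store",
    record_identifier_name="customer_id",
    event_time_feature_name="event_time",
    role_arn=role,
    enable_online_store=True,
)

# Feature group creation is asynchronous; wait until it is ready
while feature_group.describe()["FeatureGroupStatus"] == "Creating":
    time.sleep(5)

# Ingest the engineered features once; both stores are populated
feature_group.ingest(data_frame=df, max_workers=2, wait=True)

# At prediction time, fetch the latest feature values from the online store
runtime = boto3.client("sagemaker-featurestore-runtime")
record = runtime.get_record(
    FeatureGroupName="customer-features",
    RecordIdentifierValueAsString="C1",
)
print(record["Record"])
```

With a setup along these lines, training pipelines can build datasets from the offline store while the inference path reads the same values from the online store, so the feature engineering logic runs only once instead of being duplicated in every model's prediction code.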