Feature Engineering
Machine learning algorithms can work with different representations of the input features. As we've mentioned in the introduction, the goal of feature engineering is to produce new features that support the machine learning process: some representations or augmentations of the raw features can substantially boost model performance.
We can distinguish between hand-crafted and automated feature extraction. Hand-crafted means that we either look through the data and try to come up with representations that could be useful, or we draw on a set of features established by researchers and practitioners before us. An example of such an established set is Catch22, which comprises 22 features, including simple summary statistics and features extracted from phase-dependent intervals. The Catch22 set is a subset of the Highly Comparative Time-Series Analysis (HCTSA) toolbox, a much larger collection of features.
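As a minimal sketch of what hand-crafted feature extraction looks like in practice, the function below computes a few simple summary statistics from a univariate time series using only the Python standard library. The function name and the particular feature choices are illustrative assumptions of ours, not part of Catch22 or HCTSA.

```python
import statistics


def handcrafted_features(series):
    """Compute a few hand-crafted summary-statistic features
    from a univariate time series (a plain list of floats).

    These are illustrative examples, not the Catch22 feature set.
    """
    # Differences between consecutive observations
    diffs = [b - a for a, b in zip(series, series[1:])]
    return {
        "mean": statistics.mean(series),
        "std": statistics.stdev(series),
        "min": min(series),
        "max": max(series),
        # Mean absolute change between consecutive points,
        # a crude measure of how "jumpy" the series is
        "mean_abs_change": statistics.mean(abs(d) for d in diffs),
    }


ts = [1.0, 2.0, 4.0, 3.0, 5.0]
features = handcrafted_features(ts)
print(features)
```

Each extracted feature becomes one column in a tabular dataset, so any standard classifier or regressor can then be trained on the transformed data.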
Another distinction is between interpretable and non-interpretable features...