Study Finds ClassBD Outperforms Top Fault Diagnosis Methods in Noisy Scenarios | HackerNoon
Briefly

In our computational experiments, we injected additive Gaussian white noise (AWGN) into the datasets to simulate noisy operating conditions, enabling a rigorous assessment of our blind deconvolution approach.
We controlled the noise level by setting the signal-to-noise ratio (SNR), which determined how severely the signals were degraded and, in turn, how well our models performed.
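As a rough sketch of how such noise injection is commonly done (the helper name add_awgn and the SNR value below are illustrative, not taken from the paper), zero-mean Gaussian noise can be scaled against the measured signal power to hit a target SNR in dB:

```python
import numpy as np

def add_awgn(signal: np.ndarray, snr_db: float, rng=None) -> np.ndarray:
    """Add white Gaussian noise so the output has the requested SNR (in dB)."""
    rng = np.random.default_rng() if rng is None else rng
    signal_power = np.mean(signal ** 2)                  # average power of the clean signal
    noise_power = signal_power / (10 ** (snr_db / 10))   # noise power implied by the target SNR
    noise = rng.normal(0.0, np.sqrt(noise_power), size=signal.shape)
    return signal + noise

# Example: degrade a placeholder signal to -4 dB SNR (a severe case).
clean = np.sin(2 * np.pi * 50 * np.linspace(0, 1, 20_000))
noisy = add_awgn(clean, snr_db=-4)
```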
Our preprocessing included segmenting the raw signals into fixed-length samples and splitting them into separate training and test sets, which is essential for preventing leakage between training and evaluation.
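A minimal sketch of that segmentation step, assuming fixed-length windows and a simple chronological split (the window length, stride, and split ratio here are placeholders, not the paper's settings):

```python
import numpy as np

def segment_signal(signal: np.ndarray, length: int, stride: int) -> np.ndarray:
    """Cut a long 1-D signal into fixed-length segments."""
    starts = range(0, len(signal) - length + 1, stride)
    return np.stack([signal[s:s + length] for s in starts])

# Placeholder recording standing in for one raw vibration channel.
raw_signal = np.random.default_rng(0).standard_normal(200_000)

segments = segment_signal(raw_signal, length=2048, stride=2048)  # non-overlapping windows
n_train = int(0.7 * len(segments))
train_set, test_set = segments[:n_train], segments[n_train:]     # disjoint train/test sets
```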
By applying Z-score standardization to the noisy signals, we ensured that the data fed into our classifiers remains on a comparable scale, which stabilizes training.
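Z-score standardization itself is straightforward; a small sketch, applied per segment (the per-segment choice is our assumption, not a detail confirmed by the paper):

```python
import numpy as np

def zscore(segment: np.ndarray, eps: float = 1e-12) -> np.ndarray:
    """Standardize one segment to zero mean and unit variance."""
    return (segment - segment.mean()) / (segment.std() + eps)

# Example: standardize a batch of noisy segments before feeding a classifier.
noisy_segments = np.random.default_rng(1).normal(size=(32, 2048))
standardized = np.stack([zscore(seg) for seg in noisy_segments])
```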
Read at Hackernoon