Issues with ML Pattern Recognition After Bandpass Filtering

TarekSY

New Member
Hello everyone,

We've been working on a machine learning project for pattern recognition, using time-domain features such as kurtosis, mean, standard deviation, variance, skewness, and peak-to-peak values.
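For reference, a minimal sketch of the per-window feature extraction we mean (assuming NumPy/SciPy; the function name and window handling are just illustrative, not our exact code):

Code:
import numpy as np
from scipy.stats import kurtosis, skew

def time_domain_features(x):
    # x: 1-D array holding one signal window
    return {
        "mean": np.mean(x),
        "std": np.std(x),
        "variance": np.var(x),
        "skewness": skew(x),
        "kurtosis": kurtosis(x),
        "peak_to_peak": np.ptp(x),
    }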

Background:

Initially, we trained our model on data that had been high-pass filtered at 1 kHz (a rough sketch of that filtering step is below). The results were satisfactory.
Upon performing a spectral analysis last week, we discovered that our region of interest lay between 1 kHz and 3 kHz.
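The high-pass stage was roughly as sketched here; the sampling rate and filter order are placeholder values, not our exact settings:

Code:
from scipy.signal import butter, sosfiltfilt

fs = 25_000  # assumed sampling rate in Hz (placeholder)
sos_hp = butter(4, 1_000, btype="highpass", fs=fs, output="sos")

def highpass_1khz(x):
    # zero-phase high-pass at 1 kHz
    return sosfiltfilt(sos_hp, x)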
Issue:
When we tested our pattern recognition system this week, its performance deteriorated significantly. Analyzing the data revealed a strong signal component at 8 kHz.
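The 8 kHz component showed up when we inspected the power spectral density of the raw test data, roughly along these lines (sampling rate and segment length are placeholders):

Code:
import numpy as np
from scipy.signal import welch

def dominant_frequency(x, fs=25_000):
    # Welch PSD estimate; report the strongest spectral component
    f, pxx = welch(x, fs=fs, nperseg=4096)
    return f[np.argmax(pxx)]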

Steps Taken:

We decided to apply a bandpass filter between 1 kHz and 3 kHz to focus on our identified region of interest, expecting our time-domain features to become more relevant (a rough sketch of this stage is shown after these steps).
We trained a new model using the bandpass-filtered data.
However, the model's performance in recognizing patterns was not up to par.
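For clarity, the bandpass stage and retraining were roughly as sketched below. The classifier shown is only an example, not necessarily the model we used, and fs is again a placeholder sampling rate:

Code:
from scipy.signal import butter, sosfiltfilt
from sklearn.ensemble import RandomForestClassifier

fs = 25_000  # assumed sampling rate in Hz (placeholder)
sos_bp = butter(4, [1_000, 3_000], btype="bandpass", fs=fs, output="sos")

def bandpass_1_3khz(x):
    # zero-phase bandpass between 1 kHz and 3 kHz
    return sosfiltfilt(sos_bp, x)

# feature matrix from filtered windows, then retrain, e.g.:
# X = [list(time_domain_features(bandpass_1_3khz(w)).values()) for w in windows]
# clf = RandomForestClassifier().fit(X, labels)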
As an additional experiment:

We applied the 1 kHz to 3 kHz bandpass filter to the data fed to the model that had originally been trained on the 1 kHz high-pass-filtered data.
Yet again, we faced recognition performance issues.
We're somewhat puzzled as to why our ML system is underperforming after these filtering operations. Any insights or suggestions would be highly appreciated.