Issues with ML Pattern Recognition After Bandpass Filtering


TarekSY

Hello everyone,

We've been working on a machine learning project for pattern recognition, using time-domain features such as kurtosis, mean, standard deviation, variance, skewness, and peak-to-peak values.
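For concreteness, here is a minimal sketch of that feature extraction, assuming each example is a 1-D NumPy array; the function name is illustrative, not from our actual code:

```python
import numpy as np
from scipy.stats import kurtosis, skew

def time_domain_features(x):
    """Compute the time-domain feature vector for one signal segment."""
    return np.array([
        kurtosis(x),   # excess kurtosis (Fisher definition by default)
        np.mean(x),    # mean
        np.std(x),     # standard deviation
        np.var(x),     # variance
        skew(x),       # skewness
        np.ptp(x),     # peak-to-peak value
    ])
```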

Background:

- Initially, we trained our model on data that had been high-pass filtered at 1 kHz. The results were satisfactory.
- A spectral analysis performed last week showed that our region of interest lies between 1 kHz and 3 kHz.

Issue:

When testing our pattern recognition system this week, the model's performance deteriorated significantly. Analyzing the data revealed a strong signal component at 8 kHz.
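To illustrate the kind of spectral check involved, here is a minimal sketch using Welch's method; the sample rate and the synthetic 8 kHz test tone are placeholders, not values from our setup:

```python
import numpy as np
from scipy.signal import welch

fs = 25_000                    # Hz, assumed sample rate (placeholder)
t = np.arange(fs) / fs         # 1 s of samples
x = np.sin(2 * np.pi * 8000 * t) + 0.1 * np.random.randn(fs)  # stand-in signal

# Estimate the power spectral density and locate the dominant component.
f, Pxx = welch(x, fs=fs, nperseg=2048)
print(f"Dominant component near {f[np.argmax(Pxx)]:.0f} Hz")  # ~8000 Hz here
```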

Steps Taken:

- We applied a bandpass filter between 1 kHz and 3 kHz to focus on the identified region of interest (a filter design sketch follows after these steps), expecting our time-domain features to become more relevant.
- We trained a new model on the bandpass-filtered data.
- However, the model's pattern-recognition performance was still unsatisfactory.
As an additional experiment:

- We applied the same 1 kHz to 3 kHz bandpass filter to the data and evaluated it with the model originally trained on the 1 kHz high-pass-filtered data.
- Yet again, recognition performance suffered.
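For reference, a minimal sketch of such a 1 kHz to 3 kHz bandpass stage, assuming a Butterworth design and zero-phase filtering; the sample rate and filter order are assumptions, not values from our setup:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

fs = 25_000  # Hz, assumed sample rate (placeholder)

# 4th-order Butterworth bandpass, 1-3 kHz, in second-order sections.
sos = butter(4, [1000, 3000], btype="bandpass", fs=fs, output="sos")

def bandpass_1k_3k(x):
    """Apply the 1-3 kHz bandpass filter with zero phase distortion."""
    return sosfiltfilt(sos, x)
```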
We're somewhat puzzled as to why our ML system is underperforming after these filtering operations. Any insights or suggestions would be highly appreciated.
 

You are providing absolutely no information about the dataset, the sample domain, the bandwidth, or the pattern to be recognized.
Do you seriously expect a useful reply? By the way, are you applying the filtering to the input dataset and to the pattern to be recognized as well?
Consider detailing the exact process flow for each experiment.
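To illustrate the point about process flow, here is a minimal sketch of one consistent pipeline, assuming the same filter and feature extraction are applied to both training and test signals before the classifier sees them; the sample rate, filter order, and all names are illustrative:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt
from scipy.stats import kurtosis, skew

fs = 25_000  # Hz, assumed sample rate (placeholder)
sos = butter(4, [1000, 3000], btype="bandpass", fs=fs, output="sos")

def preprocess(signals):
    """Bandpass-filter each raw signal, then extract the time-domain features."""
    feats = []
    for x in signals:
        y = sosfiltfilt(sos, x)
        feats.append([kurtosis(y), np.mean(y), np.std(y),
                      np.var(y), skew(y), np.ptp(y)])
    return np.array(feats)

# Training and inference must go through the *same* preprocess() call:
#   clf.fit(preprocess(raw_train_signals), train_labels)
#   clf.predict(preprocess(raw_test_signals))
```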
 
