(note: I'm pretty inexperienced in SP)
So I have a signal that is square-like in time and not very noisy. I have successfully smoothed it using methods like free-knot splines and piecewise Savitzky-Golay filtering. My issue is not with determining the underlying signal, which is obvious; rather, it is with reducing the amount of data so that calculations on it are quicker. However, since the data was recorded at a constant timestep, the square-like shape of the curve makes uniform decimation a bad idea: there are very few data points in the information-rich (steep) portions and many in the information-poor (flat) regions, so uniform decimation loses almost all of the data along the steep portions of the graph.
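To make concrete the kind of thing I'm imagining (not something I'm sure is a sensible approach), here is a rough NumPy sketch of a naive slope-threshold decimation: keep every sample where the smoothed signal changes quickly, and only every k-th sample where it is flat. The `slope_frac` and `flat_stride` values are placeholders I made up, not tuned numbers:

```python
import numpy as np

def decimate_nonuniform(t, y, slope_frac=0.1, flat_stride=50):
    """Return indices of samples to keep from a uniformly sampled signal.

    t, y        : uniformly sampled time and (already smoothed) signal
    slope_frac  : keep every sample whose |dy/dt| exceeds this fraction
                  of the maximum slope (placeholder value)
    flat_stride : in the flat regions, keep only every flat_stride-th
                  sample (placeholder value)
    """
    dydt = np.gradient(y, t)                        # finite-difference slope
    steep = np.abs(dydt) > slope_frac * np.max(np.abs(dydt))
    keep = np.zeros_like(steep)
    keep[steep] = True                              # dense sampling on the edges
    keep[::flat_stride] = True                      # sparse sampling on the plateaus
    keep[0] = keep[-1] = True                       # always keep the endpoints
    return np.flatnonzero(keep)

# Example on a synthetic square-like signal:
t = np.linspace(0, 10, 10_000)
y = np.tanh(20 * np.sin(2 * np.pi * 0.3 * t))       # smooth "square-ish" wave
idx = decimate_nonuniform(t, y)
t_small, y_small = t[idx], y[idx]                   # reduced, non-uniform samples
```

Something like this gives me a non-uniform set of points that is dense across the transitions and sparse on the plateaus, but I don't know whether this is a reasonable way to do it or whether there is a standard technique I should be using instead.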
My ideal result is something like example 2 here, where they use stiff ODE solvers in MATLAB:
https://www.mathworks.com/help/matlab/ref/ode23.html
For other reasons, I can't use these stiff ODE solvers, but that result is what I would like: the solver uses an adaptive step-size algorithm to decide where to place the next data point, so the points cluster where the solution changes quickly.
What are my options here? I have tried to research this a lot on my own, but my lack of SP knowledge makes it difficult to know if a potential solution is even applicable to my problem (I read a little about adaptive filters, for example, but much of it is over my head).
Thank you in advance!