I'm analyzing a communication channel for subsequent modeling and
simulation. My background is mostly in datacom, with practically no
experience in signal processing, yet it appears that some
signal-processing technique may solve the problem.
The problem is that there is some random noise on the channel, or if
you wish, errors arrive at random times (the interarrival times have
been checked for randomness and seem sufficiently random).
However, every once in a while the same channel also experiences
bursty errors. During those bursts, the error rate starts increasing
fairly slowly, reaches some peak, and then either slowly fades away or
abruptly terminates.
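For illustration, here is a minimal sketch of how I picture the process, so the description above is concrete. All of the parameter values (background error probability, burst frequency, ramp and decay lengths) are just assumptions made up for the example, not measurements from the real channel:

    import math
    import random

    def simulate_channel(n_bits, p_background=1e-3, p_burst_start=1e-5,
                         burst_peak=0.3, burst_ramp=200, burst_decay=400):
        """Toy model: background random bit errors plus occasional bursts
        whose error rate ramps up slowly to a peak and then decays away.
        All parameters are illustrative assumptions, not measured values."""
        bits = []
        burst_pos = None  # bits elapsed since the current burst started, or None
        for _ in range(n_bits):
            if burst_pos is None and random.random() < p_burst_start:
                burst_pos = 0  # a new burst begins
            if burst_pos is not None:
                if burst_pos <= burst_ramp:
                    # error rate rises slowly toward the peak
                    p_burst = burst_peak * burst_pos / burst_ramp
                else:
                    # past the peak: fade away exponentially (a real burst
                    # could also terminate abruptly instead)
                    p_burst = burst_peak * math.exp(-(burst_pos - burst_ramp) / burst_decay)
                    if p_burst < p_background:
                        burst_pos = None  # burst has faded into the background
                if burst_pos is not None:
                    burst_pos += 1
            else:
                p_burst = 0.0
            p = max(p_burst, p_background)  # background errors are always present
            bits.append(1 if random.random() < p else 0)
        return bits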
I'm looking for some way to separate the bursty errors from the random
noise, with some degree of confidence. Some kind of adaptive filter,
perhaps. I would appreciate any suggestions.
The data are represented as a stream of bits: 0 (the bit was
transmitted uncorrupted) and 1 (the bit was flipped during transmission).
An 'interarrival time' is the length of a run of 0s between two 1s.
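For concreteness, here is a minimal sketch of how the interarrival times can be extracted from the bit stream under that definition (the function name is just mine for the example):

    def interarrival_times(bits):
        """Return the lengths of the 0-runs preceding each 1 (error).
        bits is an iterable of 0s and 1s as described above; any trailing
        run of 0s after the last error is not recorded."""
        times = []
        run = 0
        for b in bits:
            if b == 0:
                run += 1           # extend the current error-free run
            else:
                times.append(run)  # an error arrived; record the gap before it
                run = 0
        return times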