How to prevent the aliasing effect in simulation?

I have a dynamic model of a power electronics converter in Xcos. I have been using TOWS_c blocks to record the state variables.

My problem is that these blocks need a sampling-time input that has to be small enough to capture the switching noise, but on the other hand large enough not to cause buffer overflow.

I have attempted to set the buffer sizes from the simulation parameters as t_final/t_sampling. Unfortunately, even with this approach the buffers seem to be corrupted, because I can only see a short time interval just before t_final. This happens when t_sampling is suitable for the system dynamics (tens of microseconds). If I increase t_sampling, I can see the whole time interval (hundreds of milliseconds), but it is corrupted by aliasing.

Is there any recommended approach for this situation?

Hello,

I presume that you mean subsampling rather than aliasing, right? There is no "corruption" of the output data. In any case, you should be able to see the whole output by choosing a sufficiently large buffer length (this length can be much larger than needed).
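A quick stdlib-Python sketch of the subsampling point, with illustrative frequencies that are not taken from your model: when the sampling period is too coarse for the switching ripple, the recorded samples are indistinguishable from a much slower wave, which is why the trace looks "corrupted".

```python
import math

f_ripple = 10_000.0   # 10 kHz switching ripple (assumed illustrative value)
t_coarse = 95e-6      # 95 us sampling period: rate ~10.5 kHz, below 2*f_ripple

f_s = 1.0 / t_coarse
f_alias = f_ripple - f_s   # frequency the ripple folds down to (about -526 Hz)

# Samples of the real ripple taken at the coarse period...
coarse = [math.sin(2 * math.pi * f_ripple * k * t_coarse) for k in range(50)]
# ...versus samples of a slow ~526 Hz wave at the same instants.
alias = [math.sin(2 * math.pi * f_alias * k * t_coarse) for k in range(50)]

# The two sample sequences coincide: the subsampled ripple looks like
# a slow oscillation instead of the true 10 kHz waveform.
match = all(abs(a - b) < 1e-6 for a, b in zip(coarse, alias))
print(match)  # True
```

So the cure is not a coarser t_sampling but a buffer large enough for the fine one.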

S.