It is generally difficult for a multimeter to detect noise or spikes in an AC signal, because a multimeter reads the value at a single point in time. During analogue-to-digital conversion, the sampling instant may not coincide with the noise peak, so the reading may not reflect noise that is actually present in the waveform. An oscilloscope, by contrast, captures a continuous waveform and can record MAX/MIN values through oversampling, so it can display the noise signal.
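The difference can be illustrated with a small simulation. This is a hedged sketch, not from the original text: the signal, spike timing, and sample rates below are invented for illustration. A single sample (multimeter-style) can land away from a short spike, while dense oversampling with MAX/MIN tracking (oscilloscope peak-detect style) reliably captures it.

```python
import math

def signal(t):
    """Hypothetical 50 Hz mains waveform (~325 V peak) with a 20 us spike."""
    v = 325.0 * math.sin(2 * math.pi * 50 * t)
    if 0.00300 <= t < 0.00302:   # short transient spike
        v += 500.0
    return v

# Multimeter-style: one reading at a single instant may miss the spike.
multimeter_sample = signal(0.001)

# Oscilloscope-style: oversample one full 20 ms period at 1 MS/s
# and keep the MAX/MIN values seen over the interval.
samples = [signal(n * 1e-6) for n in range(20_000)]
peak_max, peak_min = max(samples), min(samples)

print(f"single sample:       {multimeter_sample:7.1f} V")
print(f"oversampled MAX/MIN: {peak_max:7.1f} / {peak_min:7.1f} V")
```

With these assumed numbers the single sample stays well below the sine peak, while the oversampled MAX clearly exceeds 325 V, revealing the spike.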
Why is the measured output voltage of a frequency converter inaccurate?