
Why does the output voltage increase when the PWM frequency on output enable increases?



I'm trying to pulse-width modulate an output from a 74HC595 shift register.

To achieve this, I connected a PWM output from a Teensy 3.1 microcontroller to the output enable input of the 74HC595. The A output of the 74HC595 is then connected to ground through a 640 Ω resistor. The voltage between VCC and ground is 3.3 V.

Then I use the microcontroller to shift 0b00000001 into the 74HC595 and start PWMing the output enable input at a 50% duty cycle.
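
For reference, the Teensy code looks roughly like this (a minimal sketch; the pin numbers are placeholders, not necessarily my actual wiring):

// Pin numbers are placeholders, adjust to your wiring
const int dataPin  = 2;  // 74HC595 SER
const int clockPin = 3;  // 74HC595 SRCLK
const int latchPin = 4;  // 74HC595 RCLK
const int oePin    = 5;  // 74HC595 OE (active low), must be a PWM-capable pin

void setup() {
  pinMode(dataPin, OUTPUT);
  pinMode(clockPin, OUTPUT);
  pinMode(latchPin, OUTPUT);

  // Shift 0b00000001 into the register and latch it onto the outputs
  digitalWrite(latchPin, LOW);
  shiftOut(dataPin, clockPin, MSBFIRST, 0b00000001);
  digitalWrite(latchPin, HIGH);

  // PWM the output enable pin at 50% duty cycle
  analogWriteFrequency(oePin, 100);  // carrier frequency in Hz (Teensy-specific call)
  analogWrite(oePin, 128);           // 128/255 is about 50%
}

void loop() {}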

Since the average of a 0–3.3 V square wave at 50% duty cycle is about 1.65 V, I expect to measure roughly 1.6 V between output A of the 74HC595 and ground. And indeed, if the PWM carrier frequency is low (100 Hz), that's what I observe.

 

Here is the 74HC595 datasheet

The problem is that when I increase the PWM carrier frequency, the voltage between A and ground increases. For example, I measure 2.7 V at 10 kHz. I also measured the voltage between the Teensy PWM output and ground, and it is as expected: 1.6 V.

I know ICs can't be fed arbitrarily high frequencies, but I was under the impression that 10 kHz doesn't qualify as high frequency.

I can't figure out what's going on, so here I am: can anyone explain the reason for this behavior?

 


What are you using to measure the output voltage? A cheap multimeter which doesn't average the voltage properly may give unpredictable readings. Remember that your output will be a square wave.

 

You could add a filter to your output to get a smoothed DC signal which you can measure reliably.
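
For example (component values assumed, just to sketch the idea): an RC low-pass filter with R = 10 kΩ and C = 1 µF has a cutoff of fc = 1/(2πRC) ≈ 16 Hz. That is far below a 10 kHz carrier, so the filtered output would sit close to the true average and a multimeter could read it reliably.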



You need an oscilloscope to view stuff like this. A regular multimeter is going to do some amount of averaging, and any signal variance will throw it off.

