NerdKits - electronics education for a digital generation



Basic Electronics » Voltage Drop when LEDs Flash

January 13, 2012
by Rocket_Man_Jeff

Hi all,

So I have a simple circuit that's flashing LEDs one at a time. I've also set up a voltage monitor on the power supply (wall wart) using the ADC and a voltage divider circuit. This all works quite well, and the voltage readings agree with my multimeter (between 12 and 13 V). I've noticed that when an LED turns on, the voltage drops by roughly 0.5 V. I stuck a relay into the circuit (driven by an NPN transistor), and when engaged it dropped a normally 12 V power supply down to 7 V. Since the relay is controlling a load that needs 12 V, this is a problem. Thanks to the 5 V regulator the MCU still gets the voltage it needs, but I'm curious why this drop is occurring, and then how to fix it.

I've read on here that using capacitors might work, but is there a background issue that should be resolved first?

Thanks! Jeff

January 13, 2012
by sask55

Jeff: What is the output amperage rating of your wall wart? This will usually be expressed in milliamps on the wall wart's label. In many cases the circuitry of a wall-wart power supply is designed to produce the stated voltage at a specific load. It appears that your power supply may be too small to supply the relay coil itself, let alone any other load that the relay is intended to control. You can use a VOM meter to directly measure the current draw of your relay coil, or measure the resistance of the coil and use Ohm's law to calculate the current that a 12-volt supply would produce. You will need a power supply large enough to supply the NerdKit circuit, the relay coil, and the load that you are trying to switch.

Darryl

January 13, 2012
by Rocket_Man_Jeff

Thanks for the advice, Darryl. The wall wart I'm using specifies 9 V at 300 mA. The relay I'm using has a coil resistance of 56 ohms. With 5 V across the coil, that should mean it pulls about 89 mA (right?). Even though that's relatively low, I like your suggestion of measuring the current of the whole device. As I just discovered the fuse in my multimeter is blown, I'll get back to you on that one! Relays aside, and just considering the flashing LED (the one that came with my NerdKit): is it normal to see a voltage drop like that given the power supply I am using?

Thanks again! Jeff

January 13, 2012
by sask55

I am in no way an authority on this. Other people on this forum are much better at explanations than I am likely to be; perhaps someone will help me out.

It appears to me that your power supply is not going to be able to deliver much current at all at 12 volts. In fact, it is surprising that you can measure between 12 and 13 volts on a supply that specifies 9 volts. A number of factors influence how much the wall wart's voltage drops as you draw current from it: internal resistance, the size and number of transformer windings, and the rectifier's properties all play a part. The manufacturer expects the output to have dropped to around 9 volts by the time the unit is producing 300 mA. A 3 or 4 volt drop under a relatively small current demand is an indication that the unit is very small. If you intend to supply 12 volts to the load that the relay is controlling, you will need a power supply rated large enough to handle that load. You will also need to consider the specifications of the relay coil itself: a lot of relays are designed to operate with a 12 V or 24 V coil voltage and would not likely close on 5 volts.

I don’t know how familiar you are with measuring current with your multimeter. Keep in mind that when your meter is set to measure current it has very low impedance; that is to say, it will not restrict the flow of current much at all. It is very easy to blow the fuse, or worse, to damage the meter itself, when attempting to measure current. Always start with the highest current setting you have and work your way down. The meter must be in series with a load, or the current through the meter will be whatever the power supply can produce, and that's not good at all.

I hope this helps you out a bit. Darryl

January 14, 2012
by BobaMosfet

Rocket-man-Jeff-

It appears that you are drawing too much current. A couple of things first, however. Your wall wart should provide the rated voltage WITH A LOAD on it, or close to it. It may provide a higher voltage with no load, because it's rated for operation under a load, up to its power output. 300 mA × 9 V = 2.7 W max. Not much.

How long are you triggering the relay for? How long does the 0.5 V drop last? A relay, in a DC circuit, has an inductor for a coil. The ohmic rating (56 Ω) is the coil's DC resistance, so in steady state it sets the coil current directly (about 89 mA at 5 V); the coil's reactance would only come into play in an AC circuit or during switching transients.

When you trigger your relay, for as long as it's triggered, the coil draws a steady current from the rails. Because it's an inductor, it will release stored energy (back EMF) into the circuit when you release it, so you need a diode in parallel across the coil. Without one, you risk a reverse voltage spike that under certain circumstances might exceed the voltage rating of your NPN and ruin it (or other things that get hit with the spike).

Since you are using a relay to drop the entire potential of your power-supply across your load, how much current is the load drawing? Everything needs a certain amount of current and voltage off the power-supply. If this exceeds 300mA, you can see how voltage would sag.

See, how much there is to consider? :D

BM

January 23, 2012
by Rocket_Man_Jeff

Thanks for all the input. I obtained a larger power supply this past weekend (15V @ 5A) and the voltage drop issue seems to have gone away. Now it seems like my circuit has a new problem. Both relays I use engage correctly with no measurable drop in voltage, however the MCU restarts when the relays disengage. I'm learning how complicated these seemingly simple circuits can be, so let me recap what's going on:

MCU pin triggers an NPN transistor. The NPN transistor triggers relay 1 (5 V coil, 12 V 1 A contacts). Relay 1 triggers relay 2 (12 V coil, 12 V 10 A contacts). No load on relay 2 at this time.

The relays engage for 10s, and then disengage. I've been using diodes and current limiting resistors as necessary (with interesting results). Basically everything works properly except for the MCU resetting. I'm guessing the voltage is spiking somewhere in circuit. Any thoughts on how to fix this?

FYI the purpose of this project is to control a 12v 8A independent circuit.

Jeff

January 23, 2012
by mongo

If things go nuts when a relay coil de-energizes, you need to add a clamping diode across the coil terminals. This will absorb the back EMF from the coil. To do this, add the diode so that the anode (the arrow side of the symbol) is on the NEGATIVE terminal of the relay coil. A 1N4001 or 1N4007 will do nicely.

It is best to do this right at the relay, rather than in the circuit somewhere because the leads to the relay can radiate EMF noise between the relay and diode.
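As a rough wiring sketch of that orientation (the base resistor and supply voltage are illustrative; use whatever your circuit actually calls for):

```
    +12 V ----+----------+
              |          |
           [coil]     [1N4001]   cathode (banded end) to +12 V
              |          |       anode to the transistor side
              +----------+
              |
           collector
MCU pin --[R]-- base  (NPN)
           emitter
              |
             GND
```

In normal operation the diode is reverse-biased and does nothing; when the transistor switches off, the coil's collapsing field drives current through the diode instead of spiking the collector.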

January 23, 2012
by Rocket_Man_Jeff

Yeah, I had a diode in parallel with each relay coil, but I had installed them the wrong way around. Everything seems to be working now!

Thanks, Jeff

