CD4017 help.

Hello Axro,
Look at the graph I posted of the typical output current from a CD4xxx.
Its max current (into a dead short) is 19mA when its temperature is 25 degrees C and its supply is 10V.
The graph shows the voltage dropped across the output transistor in the chip at various load currents. With a 150 ohm load, the voltage dropped across the output transistor is 7.3V, so the voltage across the 150 ohm resistor is the remainder, which is 2.7V. Ohm's Law calculates the current to be 18mA, not 48.7mA and not 66.7mA.
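In Python, that arithmetic is (the 7.3V figure is the value read from the graph):

Code:
# 10V supply, 150 ohm load, ~7.3V dropped inside the CD4017 output (from the graph)
supply = 10.0
v_transistor = 7.3
r_load = 150.0

v_load = supply - v_transistor       # 2.7V is left for the resistor
i_load = v_load / r_load             # Ohm's Law: I = V / R
print(f"{v_load:.1f} V across the load, {i_load * 1000:.0f} mA")   # 2.7 V, 18 mA

# The naive 66.7mA comes from 10V / 150 ohm, which ignores the transistor drop.
print(f"{supply / r_load * 1000:.1f} mA")                           # 66.7 mA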

Ok, I guess I don't see where you are getting the 7.3V from?
 
The CD4xxx is made with little CMOS transistors. They are not very powerful, so they have resistance. Their resistance limits the current into a load, and when they limit the current, a voltage drop occurs across them.

The top of the graph is labelled "Drain-to-Source Voltage"; that is the voltage dropped across the output transistor.
 
Ok, it's making more sense now. But how did you get that exact number of 7.3? I understand how with the LED because you used the LED's voltage drop, but I'm still confused about the resistor.
 
I made a guess, then calculated the current and looked at the graph to see how far off I was. Then I made a few more guesses that got closer to the actual values.

Ohm's Law says that 2.7V across 150 ohms = 18mA. The graph shows that with a current of 18mA, the 150 ohm resistor will have 2.7V across it.
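Here is a sketch of that guess-and-check in Python. The graph_vds() table below is made-up stand-in data, not the real datasheet curve; it is only there to show the idea of guessing a current, reading the drop off the graph, and correcting the guess:

Code:
def graph_vds(i_amps):
    """Hypothetical drain-to-source drop of the output at a given current.
    These (current, V_DS) points are invented; read the real curve from the datasheet."""
    points = [(0.000, 0.0), (0.005, 1.8), (0.010, 3.8),
              (0.015, 6.0), (0.018, 7.3), (0.019, 10.0)]
    for (i0, v0), (i1, v1) in zip(points, points[1:]):
        if i0 <= i_amps <= i1:
            return v0 + (v1 - v0) * (i_amps - i0) / (i1 - i0)   # linear interpolation
    return 10.0

supply, r_load = 10.0, 150.0
lo, hi = 0.0, 0.019                    # 19mA is the short-circuit current
for _ in range(30):                    # repeated guesses, each one closer
    i = (lo + hi) / 2
    total = graph_vds(i) + i * r_load  # transistor drop plus resistor drop
    if total < supply:
        lo = i                         # guess was too low, raise it
    else:
        hi = i                         # guess was too high, lower it
print(f"~{i * 1000:.0f} mA, ~{graph_vds(i):.1f} V across the transistor")   # ~18 mA, ~7.3 V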
 
So the voltage drop is not coming from the resistor but from the 4017 itself? And if I put a 150 ohm resistor on an output, I would be at 18mA with 2.7V output?
 
All resistances with a current in them produce a voltage drop. The resistor has a 2.7V drop and the MOSFET at the output of the CD4017 produces a 7.3V drop when the temperature is 25 degrees C and the supply is 10V.
There is another graph that shows the minimum output current of some CD4xxx ICs, which is about half that current.
 
So in any circuit where there is a resistor there is a voltage drop?
In most circuits there is a resistance (it might be the resistance of a MOSFET). If there is current flowing in a resistance, then there is a voltage drop across the resistance.
 
So if all resistances have a voltage drop, why, when you calculate the resistance needed for an LED, do you only take into account the voltage drop of the LED and not the voltage drop of the resistor?

And to be clear, essentially voltage drop is the voltage used by something... correct? In the case of resistors it's lost as heat?
 
An LED works with current, not voltage. It sets its own voltage drop. The resistor is used in series to limit the current.
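For example, picking the series resistor for an LED (the 2V and 20mA figures below are just typical example values, not numbers from this thread):

Code:
supply = 9.0     # volts
v_led = 2.0      # the LED sets its own drop (typical red LED, example value)
i_led = 0.020    # the current you want through the LED, in amps

r = (supply - v_led) / i_led      # the resistor takes whatever voltage is left over
print(f"use about {r:.0f} ohm")   # 350 ohm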

Yes, the power in a resistor causes it to heat.
The power in a transistor also causes it to heat.
Power is the voltage across it times the current through it.
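As a quick check, here is the power in each part of the earlier 10V / 150 ohm example, using the ~18mA, 7.3V and 2.7V figures from the posts above:

Code:
i = 0.018            # amps through the series circuit
v_transistor = 7.3   # volts across the CD4017 output transistor
v_resistor = 2.7     # volts across the 150 ohm resistor

print(f"transistor: {v_transistor * i * 1000:.0f} mW")   # ~131 mW heats the chip
print(f"resistor:   {v_resistor * i * 1000:.0f} mW")     # ~49 mW heats the resistor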
 
Ok, so say I have 3 resistors in series with a 9V power supply.
4000 ohm
3000 ohm
1000 ohm

The voltage drop across each would be
4.5V
3.375V
1.125V.

So if this is correct, does that mean the 4000 ohm resistor only "sees" 4.5V? Is "voltage drop" the same thing as "voltage across"?
 
Yes it is.
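For anyone following along, here is that series arithmetic in Python:

Code:
supply = 9.0
resistors = [4000.0, 3000.0, 1000.0]    # ohms, in series

i = supply / sum(resistors)             # the same current flows through all of them
for r in resistors:
    print(f"{r:.0f} ohm drops {i * r:.3f} V")    # V = I * R
# 4.500 V + 3.375 V + 1.125 V = the full 9 V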
 
Ok, so if that is correct, what about this scenario.

I have the 4000, 3000, and 1000 ohm resistors in series with a 9V battery. I add a component that needs a minimum of 6V to function (current doesn't matter). This device has little to no resistance.

Will this component function properly? My guess is no, if I am understanding everything correctly. Since the resistors drop all of the voltage, there is none "left" for my mystery component. Is that right, or is there still 9V in the circuit?
 
Your resistors total 8k ohms.
If you add a device in series that operates from 6V but draws a very low current (because it has a very high resistance), then with a 9V supply the device will have almost 9V across it and might be damaged.

If you add a device in series that operates from 6V but draws a very high current (because it has a very low resistance), then with a 9V supply the device will have almost zero volts across it, so it won't work.

Resistors limit the current, not the voltage, unless they are in a voltage divider.
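A small Python sketch of that point, treating the added device as just another resistance in the divider (the three device resistances below are made-up examples):

Code:
supply = 9.0
r_series = 4000.0 + 3000.0 + 1000.0     # the three resistors = 8000 ohm

for r_device in (1.0, 8000.0, 1_000_000.0):    # low, comparable, very high resistance
    v_device = supply * r_device / (r_device + r_series)   # voltage-divider formula
    print(f"{r_device:.0f} ohm device gets {v_device:.2f} V")
# ~0.00 V, 4.50 V, ~8.93 V: a low-resistance device gets almost no voltage,
# a very high-resistance one gets nearly the whole 9 V.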
 
In a way, though, they almost are limiting voltage, are they not? Because they are dropping almost all of the 9V, so the 6V component gets none?
 
No.
The resistors limit the current. But the device needs a very high current. Without the high current, the device gets a very low voltage.
It is Ohm's Law.
 
I don't know why I'm having such a hard time wrapping my head around this.

Voltage drop is wasted voltage, correct?

So if there are 2 resistors of equal value in a circuit with 9V, they both drop 4.5V. They add up to a total of 9V wasted voltage. Essentially all the voltage is "used up" in the circuit. Now you add a capacitor in series with the 2 resistors. How is there any voltage "left" to charge it, if the resistors are dropping it all?
 
A capacitor in series with two resistors will charge up to the supply voltage with the resistors limiting the charging current.

At first the capacitor has no charge so the current will be V/R.
Then, as the capacitor charges, the voltage across the resistors becomes less, which makes the current less.
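A numeric sketch of that charging, assuming a 9V supply, say two 4000 ohm resistors (8000 ohm total), and a made-up 100uF capacitor:

Code:
import math

supply, r, c = 9.0, 8000.0, 100e-6
tau = r * c                                      # time constant = 0.8 s

for t in (0.0, tau, 2 * tau, 5 * tau):
    v_cap = supply * (1 - math.exp(-t / tau))    # capacitor voltage at time t
    i = (supply - v_cap) / r                     # only the leftover voltage drives current
    print(f"t = {t:.1f} s: cap at {v_cap:.2f} V, charging current {i * 1000:.3f} mA")
# At t = 0 the cap has 0 V, the resistors see the whole 9 V, and I = 1.125 mA.
# As the cap charges, less voltage is left across the resistors, so the current falls.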
 
But how is there any voltage left to charge the cap if the resistors are dropping it all?

I guess the way I am thinking about it is that the resistors are dropping all 9V (making 0V left in the circuit). And if I = V/R and V is 0, then there should be no current flowing either.

I know I am wrong I just need the right thing to be said to make me see the light.
 
The current charges the capacitor, not the voltage.
When power is first applied to the circuit of a resistance in series with a capacitor, the charging current is the highest because the capacitor has no voltage yet. The resistance has the entire 9V across it.

When the capacitor is half charged, it has 4.5V across it and the resistance has 4.5V across it, so the charging current is half of maximum.

EDIT:
The charging current is V/R. At first, the resistance has the entire 9V across it.
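Checking that half-charged point with the same 9V and 8000 ohm numbers:

Code:
supply, r = 9.0, 8000.0
i_start = supply / r            # capacitor empty: all 9 V across the resistance
i_half = (supply - 4.5) / r     # capacitor at 4.5 V: only 4.5 V left for the resistance
print(f"{i_start * 1000:.3f} mA at the start, {i_half * 1000:.4f} mA at half charge")
# 1.125 mA at first, 0.5625 mA when half charged -- half of maximum.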
 