
resistor question


whooshdoo

New Member
I have a 1/4 watt resistor rated at 1k ohms in a circuit, and it shows 12 volts at one end and only 0.4 volts at the other. Since that is such a drastic difference in voltage, I am wondering if that is normal, or could the resistor somehow have too much resistance? Is there a way to tell roughly what the voltage difference across a resistor should be, based on its rating? Thanks very much for any help.
 
How are you measuring it? Changing the resistance shouldn't change the voltage; only the current should change. Well, if you have two or more resistors, then each one has a voltage drop, but if you're testing just one resistor it should show the full voltage. It sounds to me like you're measuring it with the common probe on (-), touching the positive probe to the (+) side of the resistor (which gives you 12 volts), but then moving the positive probe to the other side of the resistor (giving you 0.4 volts). To find the voltage drop of a resistor, you should put one probe on one side of the resistor and the other probe on the opposite side. What you describe sounds about right--your resistor should be OK.
Der Strom
 
12 volts at one end and 0.4 volts at the other means a voltage drop across the resistor of 11.6 volts. The resistor is 1000 ohms, and Ohm's Law is V = I * R. If you plug in the values you will find the current. The power consumed is P = V * I, or P = I^2 * R, or P = V^2 / R. To find the maximum voltage drop allowed you use the resistor's maximum allowed power rating (1/4W), the resistor value, and the appropriate power equation just stated.
 
As an example: (11.6)^2 / 1000 = 0.135 W, or about 135 mW. That's safely under the 1/4 watt rating.
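If it helps to see it worked out, here's the same arithmetic as a few lines of Python (the values are taken straight from the numbers above):

Code:
# Ohm's law and power check for the resistor described above.
V_drop = 12.0 - 0.4     # voltage across the resistor, in volts
R = 1000.0              # resistance in ohms (1k)

I = V_drop / R          # Ohm's law: I = V / R
P = V_drop ** 2 / R     # power dissipated: P = V^2 / R

print(f"current: {I * 1000:.1f} mA")   # -> 11.6 mA
print(f"power:   {P * 1000:.1f} mW")   # -> 134.6 mW, safely under 250 mW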
 
Easy. Use P = V^2 / R and rearrange:

V = sqrt(P * R) = sqrt(0.25 * 1000) = sqrt(250) ≈ 15.8 V.

But you still must admit that writing "The resistor can handle 15.8 volts or 0.25 watts" is kinda meaningless, no? A 1K resistor is in no way limited to 15.8 volts. (It is limited to 1/4 watt, though.)
 
But you still must admit that writing "The resistor can handle 15.8 volts or 0.25 watts" is kinda meaningless, no? A 1K resistor is in no way limited to 15.8 volts. (It is limited to 1/4 watt, though.)

Limiting it to 15.8 volts is the same as limiting it to 1/4 watt for a 1k resistor; that's exactly what the math says. So no, it's not meaningless.
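To make the equivalence concrete, here is the same formula as a small Python sketch (the non-1k values are just illustrative):

Code:
import math

# Maximum continuous voltage, from the power rating:
# P = V^2 / R  =>  V_max = sqrt(P * R)
def v_max(r_ohms, p_watts=0.25):
    return math.sqrt(p_watts * r_ohms)

print(f"{v_max(1000):.1f} V")    # 1k at 1/4 W   -> 15.8 V
print(f"{v_max(100):.1f} V")     # 100R at 1/4 W -> 5.0 V
print(f"{v_max(10000):.1f} V")   # 10k at 1/4 W  -> 50.0 V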
 
Quarter-watt resistor. Resistors are rated by how much power they can handle (in addition to their resistance value). You know: P = E * I. Small resistors are typically 1/10 or 1/8 watt up to 1/4 watt.

No, resistors have no polarity.
 
So a resistor can't decrease voltage, only current? Some people posted here that "you don't need to decrease current in a circuit because the circuit will take the current it needs!" So why is there a current-limiting resistor? For example, I have a circuit that needs 5V and 300mA; what will happen if I power it from a 5V supply that can deliver 10A?
 
You asked a bunch of questions there. I'll just answer the one about taking current.

Most circuits and devices do what you described: they only take the current they need, so there's no need for current-limiting devices (like resistors; there are other ways to limit current, too). Think about your car: you have a huge source of current there, the battery, and a little bitty bulb, say in your glovebox. The bulb (not an LED--see below) will only draw what it needs. The battery can't "push" more current through the bulb than what it draws.

But some devices (most notoriously, perhaps, LEDs) will "eat" excessive amounts of current if you let them, thus letting out the magic smoke. So with these kinds of loads, you need to limit the current they can draw. For LEDs, the simplest way to do this is to put a resistor in series.
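As a back-of-the-envelope sketch of that series-resistor sizing (the supply, forward voltage, and current below are assumed typical values--check your LED's datasheet):

Code:
# Series-resistor sizing for an LED (assumed: red LED, Vf ~ 2 V,
# run at 20 mA from a 12 V supply).
V_supply = 12.0   # supply voltage, volts
V_f = 2.0         # LED forward voltage, volts
I_f = 0.020       # desired LED current, amps (20 mA)

R = (V_supply - V_f) / I_f        # the resistor drops what the LED doesn't
P = (V_supply - V_f) ** 2 / R     # power the resistor has to dissipate

print(f"R = {R:.0f} ohms")        # -> 500 ohms (470 or 510 are standard values)
print(f"P = {P * 1000:.0f} mW")   # -> 200 mW; a 1/4 W part is marginal here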

It depends on the device.

Resistors can also be used as voltage dividers, "dropping" voltage to a part of a circuit.
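And the divider rule itself, as a minimal sketch (this assumes an unloaded divider; a load on the output changes the result):

Code:
# Voltage divider: Vout = Vin * R2 / (R1 + R2)
def divider_out(v_in, r1, r2):
    return v_in * r2 / (r1 + r2)

print(divider_out(12.0, 1000, 1000))   # equal resistors halve it -> 6.0 V
print(divider_out(12.0, 2000, 1000))   # -> 4.0 V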
 
Regular light bulbs (like what you find in the glovebox of your car) have an internal resistance (the filament). The higher the resistance, the less current is allowed to flow. LEDs, however, have a very small internal resistance, which means a lot of current can flow. This excessive current is what fries them. That is why an external resistor is often used: it makes up for the low internal resistance and makes sure that the right amount of current flows through the LED.
 
Hi,

From your description it sounds like you are measuring each end to ground.
Consider this resistor:

A ---====---- B

where Gnd is the (-) of the circuit.

It sounds like you are measuring A-Gnd and B-Gnd. The true voltage drop across the resistor is the voltage difference between A and B: here, 12 - 0.4 = 11.6 volts.

A resistor is a basic component that can be joined with another component, say another resistor, either in parallel (where both leads of the two resistors are connected together) or in series (where only one lead is connected to the other's, with no other component connected in between).

In series, the same current flows through both resistors (following Ohm's law, with the resistor values constant) and the voltage divides between them. In parallel, the two resistors make up two branches: the voltage across both is the same, but the current divides between them.

The division of voltage follows the voltage divider rule, and the division of current follows the current divider rule (check Wikipedia); the sketch below shows both.
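A small sketch of both rules (the 12 V source and resistor values are just example numbers):

Code:
# Two divider rules for a pair of resistors on a 12 V source (example values).
V = 12.0
R1, R2 = 1000.0, 3000.0

# Series: same current through both; voltage divides in proportion to R.
I_series = V / (R1 + R2)
print(f"V across R1: {I_series * R1:.1f} V")   # -> 3.0 V
print(f"V across R2: {I_series * R2:.1f} V")   # -> 9.0 V

# Parallel: same voltage across both; current divides inversely with R.
print(f"I through R1: {V / R1 * 1000:.1f} mA")   # -> 12.0 mA
print(f"I through R2: {V / R2 * 1000:.1f} mA")   # -> 4.0 mA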

When current passes through the resistor while there is a voltage drop across it, the term "power" comes in. Power is the rate at which energy is dissipated by the device (here, the resistor); it simply equals the voltage drop across the resistor multiplied by the current passing through it, and it shows up as heat in the physical world.

Here comes our decision: what is the proper power rating to choose so that particular resistor works well without getting burned up or overloaded? Commercially we have 0.25, 0.5, 1, 2, 5 W and so on; pick one.
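One way to make that choice systematic, as a sketch (the 2x headroom factor here is a common rule of thumb, not a hard requirement):

Code:
# Pick the smallest standard power rating with some headroom over the
# actual dissipation. 2x headroom is a common rule of thumb (assumed here).
STANDARD_RATINGS = [0.25, 0.5, 1, 2, 5]   # watts, per the list above

def pick_rating(v_drop, r_ohms, headroom=2.0):
    p = v_drop ** 2 / r_ohms              # P = V^2 / R
    for rating in STANDARD_RATINGS:
        if rating >= p * headroom:
            return p, rating
    return p, None                        # nothing in the list is big enough

p, rating = pick_rating(11.6, 1000)
print(f"dissipates {p * 1000:.0f} mW -> use a {rating} W part")  # -> 0.5 W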

In the real world, many things behave like a resistor, so we use a resistor model to represent them.
 