
decreasing current from 3A to 20mA


Shohadawy

New Member
I know it's a very stupid question but I'm a newbie. I have a current supplier that has an output from 0 to 3A. I need an analog signal from 4 to 20mA to use it in my PLC Analog Module tests.

Please supply me with a schematic of a suitable circuit for this purpose.
 
...I have a current supplier that has an output from 0 to 3A.
I'm guessing that you are describing an adjustable, regulated laboratory power supply. If so, what is its voltage range (the current rating is not important)?

I need an analog signal from 4 to 20mA to use it in my PLC Analog Module tests.

Please supply me with a schematic of a suitable circuit for this purpose.

If your power supply is an adjustable constant-voltage supply, then simply put a 1000Ω resistor in series with the + output terminal of the supply. The other end of the resistor connects to the + input of your 4-20mA device. The - input of your device connects to the supply's - terminal.

To produce currents ranging from 4mA to 20mA, adjust the output voltage of the supply from 4V to 20V. By Ohm's Law, the current flowing through your device is Vsupply/(1000 + Rin), where Rin is the input resistance of your device. I am guessing that Rin << 1000.
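
As a quick numerical check of that formula, here is a minimal Python sketch. The Rin values are assumptions (250Ω is a typical figure for a PLC current input), so substitute your module's datasheet value:

```python
# Sanity check of the series-resistor approach described above:
# I = Vsupply / (R_series + Rin), per Ohm's law.
# The Rin values are assumptions; use your analog module's datasheet figure.
R_SERIES = 1000.0  # ohms, the suggested series resistor

for r_in in (0.0, 100.0, 250.0):          # assumed input resistances (ohms)
    for v_supply in (4.0, 20.0):          # supply voltages for the 4/20 mA endpoints
        current_ma = 1000.0 * v_supply / (R_SERIES + r_in)
        print(f"Rin={r_in:6.1f} ohm  Vsupply={v_supply:4.1f} V  ->  I={current_ma:5.2f} mA")
```

The printout shows exactly 4mA and 20mA when Rin is zero, and only a few percent low at Rin = 100Ω, which is why the Rin << 1000 assumption matters.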
 
I am not sure you are saying what you actually mean. There is a common misunderstanding about power supplies that newbies run into which is not an actual problem, but other parts of your post seem to describe another type of problem that is real. It is not clear to me which one you are talking about.

Are you talking about a current source or a voltage source? Do you actually need a fixed output current (not the same as current limiting) to produce a signal? Or are you just trying to power the PLC, which happens to consume between 4 and 20mA?

You can choose either the current or the voltage to be a fixed level, and the other will vary in order to achieve this. You cannot arbitrarily choose both to be any value for any load.

Voltage sources force a voltage across the circuit; the circuit then draws however much current it needs. The current rating listed for a voltage source is its CAPACITY. As long as the current drawn is less than what the source is rated for, you will be fine; the source will not try to force that amount of current into the circuit. This is what you use to power almost anything. It does not matter if you use a voltage source rated for 1000A to power a device that draws 1mA, because the 1000A is a capacity, not a current that will be forced into the circuit.
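
To put numbers on that, a minimal Python sketch (all values made up for illustration) showing that the load alone decides how much current flows, and the source's ampere rating only matters as a ceiling:

```python
# With a voltage source, the LOAD decides how much current flows (I = V / R_load);
# the supply's ampere rating is only a ceiling. Values below are illustrative.
V_SOURCE = 12.0            # volts, fixed by the source
RATED_CAPACITY_A = 1000.0  # the source could deliver up to 1000 A

for r_load in (12_000.0, 1_200.0, 12.0, 0.01):   # ohms
    i_drawn = V_SOURCE / r_load
    ok = i_drawn <= RATED_CAPACITY_A
    print(f"R_load={r_load:10.2f} ohm -> draws {i_drawn:10.4f} A "
          f"({'within' if ok else 'exceeds'} the rated capacity)")
```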

A current source WILL force a set amount of current through the circuit, and its voltage rises to whatever is needed to push that current through. Of course, the voltage can only get so high, and that limits how high a resistance you can connect to the current source and still get the correct output current. This maximum voltage (the compliance voltage) is analogous to the maximum current capacity of a voltage source: as long as the load requires less than the rated value, the output will be correct. Current sources have much more specialized uses.
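
And the corresponding sketch for a current source (again Python, with an assumed 24V compliance limit), showing the output voltage rising to I × R_load until the compliance limit is hit:

```python
# A current source forces a set current; its output voltage rises to
# I * R_load, up to a compliance limit. The 24 V figure is an assumption.
I_SET = 0.020          # amps (20 mA forced by the source)
V_COMPLIANCE = 24.0    # volts, maximum the source can produce (assumed)

for r_load in (250.0, 1_000.0, 2_000.0):   # ohms
    v_needed = I_SET * r_load
    if v_needed <= V_COMPLIANCE:
        print(f"R_load={r_load:7.1f} ohm -> needs {v_needed:5.1f} V, current stays at 20 mA")
    else:
        print(f"R_load={r_load:7.1f} ohm -> would need {v_needed:5.1f} V, "
              f"exceeds compliance; current will fall short")
```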
 