Electro Tech is an online community (with over 170,000 members) who enjoy talking about and building electronic circuits, projects and gadgets.


Converting 5V sinusoidal to 3.3V sinusoidal wave

Status
Not open for further replies.

twinkleBell

New Member
hi...

I need to know if there is any circuitry that allows me to convert a 5V sinusoidal wave into its equivalent 3.3V wave, i.e.

0V - 5V ------> 0V - 3.3V

I need to know this urgently because the DSP I am using can only take a maximum of 3.3V on its ADC input.

Moreover, is it possible to do the reverse too? From 3.3V to 5V?

I hope to involve minimal circuitry, as I am running short of time for my Engineering project and this is just one of my countless problems.

Any help will be appreciated!
 

ukeee

New Member
Is it not possible to simply use a potential divider to reduce the sinusoidal wave to a maximum of 3.3V? Getting it back to 5V might be a bit more complicated. Here are two ways of doing it off the top of my head: the first would be to use a transformer (you'll probably have to wind your own); the other is to use an amplifier, and an op-amp should do nicely. However, it depends on what you are trying to do: for example, what kinds of distortion can you tolerate, and what is the signal source?

I think it is pretty common to use a voltage divider on input signals to DSPs to reduce them to the correct level. If you don't think these ideas will suit, then post back with a few more details and I'll have another think about it.
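The potential-divider suggestion can be sanity-checked with a few lines of arithmetic. This is a minimal sketch; the 1.7k/3.3k values are illustrative assumptions (not from the thread), chosen because they give exactly the 3.3/5 ratio needed:

```python
# Sketch: picking divider resistors to scale a 0-5 V signal to 0-3.3 V.
# The 1.7k / 3.3k split is an assumed example, not from the thread.

def divider_output(v_in, r_top, r_bottom):
    """Unloaded output of a resistive divider: Vout = Vin * Rb / (Rtop + Rb)."""
    return v_in * r_bottom / (r_top + r_bottom)

# Target ratio is 3.3 / 5 = 0.66; Rtop = 1.7k, Rbottom = 3.3k gives exactly that.
print(divider_output(5.0, 1700, 3300))  # 3.3
```

Nearby E-series values would work just as well, since the ADC range only needs the peak to stay at or below 3.3V.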
 

twinkleBell

New Member
A potential divider has a problem of possible distortion of my signal, right? Actually, this signal will be going into my DSP's ADC channel, so I am hoping to get as good a signal as possible.

I thought of using an AC-AC converter, but I think it will require quite a fair amount of work and I am running short of time.

Other than a conventional potential divider that involves 2 resistors, are there any other ICs that I can include to get a better signal?

My signal source is a DC-AC converter; the signal is sensed by a sensing amplifier, then a 2.5V analog offset is added to obtain a 0V to 5V sine wave. This part of the circuitry is fixed and I am supposed to implement my DSP on top of it.

So, I need to scale down the magnitude of the signal, as my DSP's ADC can only take in a maximum of 3.3V.
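One convenient property of a plain divider here: because it is linear, it scales the 2.5V offset and the sine swing by the same factor, so the whole 0-5V waveform maps into the ADC range with the midpoint landing at 1.65V, which is exactly mid-scale of a 3.3V ADC. A quick check, assuming an exact 3.3/5 ratio:

```python
# Check: a divider with ratio 3.3/5 scales the whole 0-5 V waveform,
# offset included, into the 0-3.3 V ADC range.
ratio = 3.3 / 5.0           # 0.66, set by the resistor ratio
for v in (0.0, 2.5, 5.0):   # minimum, 2.5 V offset midpoint, maximum
    print(v, "->", round(v * ratio, 2))
```

So no separate offset adjustment is needed as long as the new ADC's reference is 3.3V.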
 

Russlk

New Member
What makes you think that a potential divider will distort the signal? I assume you did not generate the correct amplitude in the first place because you were not aware of the ADC limitation?
 

twinkleBell

New Member
The setup was initially for another DSP that has a maximum ADC value of 5V. But as another kind of DSP was purchased by my school, I have to use the new DSP, whose ADC maximum is 3.3V.

I thought the accuracy of the signal from a potential divider depends on the accuracy of the resistance values? The ratio of the resistor value to the total resistance will affect my signal, right?

Actually, I have not tried out the potential divider method yet, so I will try it out and see if it works. But thanks to all for the advice.
 

Nigel Goodwin

Super Moderator
Most Helpful Member
twinkleBell said:
I thought the accuracy of the signal from a potential divider depends on the accuracy of the resistance values? The ratio of the resistor value to the total resistance will affect my signal, right?

Actually, I have not tried out the potential divider method yet, so I will try it out and see if it works. But thanks to all for the advice.

It will work fine, obviously the absolute value will depend on the accuracy of the potential divider (bearing in mind the input and output impedances as well). But it's simple to adjust the values to set it to read as you wish.
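Nigel's point about input and output impedances can be made concrete: the divider's source resistance is Rtop in parallel with Rbottom, and a finite ADC input resistance shunts the bottom leg, pulling the ratio slightly below the unloaded design value. A sketch with assumed values (the same illustrative 1.7k/3.3k divider and a hypothetical 100k ADC input resistance):

```python
# Sketch: how a finite ADC input resistance loads the divider.
# All component values are illustrative assumptions.

def loaded_ratio(r_top, r_bottom, r_load):
    """Divider ratio when r_load shunts the bottom resistor."""
    r_b = r_bottom * r_load / (r_bottom + r_load)  # Rbottom || Rload
    return r_b / (r_top + r_b)

unloaded = 3300 / (1700 + 3300)              # 0.66 by design
loaded = loaded_ratio(1700, 3300, 100_000)   # ~0.653 with a 100k load
print(unloaded, loaded)
```

Lower-value divider resistors reduce this loading error (at the cost of drawing more current from the source), which is the trade-off behind adjusting the values "to read as you wish."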
 

twinkleBell

New Member
Convert from 3.3V to 5V

My DSP generates a PWM signal of 0V to 3.3V, but the IGBT gate drive that was set up beforehand needs a 5V PWM signal to work.

Other than using a boost converter, are there any other methods to convert my signal?
 

Nigel Goodwin

Super Moderator
Most Helpful Member
Re: Convert from 3.3V to 5V

twinkleBell said:
My DSP generates a PWM signal of 0V to 3.3V, but the IGBT gate drive that was set up beforehand needs a 5V PWM signal to work.

Other than using a boost converter, are there any other methods to convert my signal?

I've no idea what a 'boost converter' has to do with it, all you need is a small amplifier - any opamp would do, and you could easily set the gain.

Even simpler, you could use a transistor, with a pull-up resistor to the 5V rail - but this would invert the signal - if you can't change the software to compensate, you would need to arrange hardware inversion (another transistor, or something similar).
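The op-amp route needs a gain of 5/3.3, about 1.52. For a non-inverting stage the gain is 1 + Rf/Rg; the 5.1k/10k pair below is an assumed example that lands close to the target, not a value from the thread:

```python
# Sketch: sizing a non-inverting op-amp stage to lift 0-3.3 V PWM to 0-5 V.
# The 5.1k / 10k resistor pair is an assumed example.

def noninverting_gain(r_feedback, r_ground):
    """Gain of a non-inverting op-amp stage: 1 + Rf/Rg."""
    return 1 + r_feedback / r_ground

target = 5.0 / 3.3                      # ~1.515
gain = noninverting_gain(5100, 10000)   # 1 + 5.1k/10k = 1.51
print(target, gain)
```

For a fast PWM signal, a rail-to-rail op-amp with enough slew rate would be needed to keep the edges clean; the transistor-plus-pull-up option Nigel mentions sidesteps that.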
 

Styx

Active Member
For the digital PWM signal, use an opto-coupler - this will also give you signal isolation to your gate drive, which you really should have.

An opto is only a valid method if your DSP can source 10mA; if it can, just put a 240 Ohm resistor in series with the opto and the DSP. On the isolated side, just provide the 5V power to the output stage.

That 5V rail should also be isolated (again, if you are switching in a bridge fashion).

The isolation is very critical once you start getting some real volts on your DC link, but also, if your load is inductive and your freewheel diodes are slow to turn on w.r.t. the turn-off of the IGBT, you could get some nice overshoots - OK, off topic, and I don't know the details of the rest of the circuit, but old habits, sorry.
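The 240 Ohm figure can be checked with the usual LED series-resistor formula. The ~1.2V forward drop assumed below is typical for an opto-coupler's input LED, but the actual part's datasheet value should be used:

```python
# Sketch: series resistor for an opto-coupler LED driven from a 3.3 V DSP pin.
# The 1.2 V LED forward drop is an assumed typical value.

def led_series_resistor(v_drive, v_forward, i_led):
    """Series resistance for a target LED current: R = (Vdrive - Vf) / I."""
    return (v_drive - v_forward) / i_led

r = led_series_resistor(3.3, 1.2, 0.010)   # aiming for ~10 mA
print(round(r))  # 210
```

A 240 Ohm resistor gives roughly (3.3 - 1.2) / 240 = 8.7mA, which is in the same ballpark and slightly easier on the DSP pin.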
 

twinkleBell

New Member
Re: Convert from 3.3V to 5V

Nigel Goodwin said:
I've no idea what a 'boost converter' has to do with it, all you need is a small amplifier - any opamp would do, and you could easily set the gain.

Even simpler, you could use a transistor, with a pull-up resistor to the 5V rail - but this would invert the signal - if you can't change the software to compensate, you would need to arrange hardware inversion (another transistor, or something similar).

Hmmm... a friend of mine suggested I use a boost converter to boost up the voltage. I think I have been trying to complicate the whole matter.

But thanks a lot for the suggestions. I should have thought of the op-amp solution.
 