Welcome to our site!

Electro Tech is an online community (with over 170,000 members) who enjoy talking about and building electronic circuits, projects and gadgets. To participate you need to register. Registration is free. Click here to register now.

Anyone use LTC2414/2418?

Status
Not open for further replies.

Oznog

Active Member
Anyone use LTC2414 or 2418 ADC chip? I can't get this thing to read anything but 0's out of the conversion and I can't figure it out.
 
Is this the 24-bit ADC? I have used them on a couple of boards. On the serial interface, are you clocking them externally or from the internal oscillator?
 
I am clocking externally, so I've got SCK=0 at the fall of CS_n.

I need to do just one conversion per selected channel, then select another channel. I seem to get hung up waiting for EOC_n. I do understand just how slow the 2418 conversions are, but they never happen.

As long and thorough as it appears, I still find Linear's spec very confusing. I assume the way to detect EOC_n is to put a pullup resistor on SDO and wait for it to fall low?
Does the part need to be clocked for the EOC_n to assert, or does it fall on its own?
After CS_n=0, doesn't the part do an automatic conversion on CH0? So I wait for EOC_n so I can clock out this junk and clock in the desired conversion into SDI?
Does the 2418 HAVE to clock in the SDI at the same time as the data clocks out, or can you hold SDI high and use the 1->0 transition to indicate the start of the request?
 
Forget what I said about using a pullup to detect EOC_n. Now I recall that it's only tristated as long as CS_N=1, not with EOC_n.

Is it necessary to do a CS_N=1 between giving the conversion request and awaiting the result?

I think I'm 100% consistent with the spec's example... but it looks like sometimes it's reading back 0's, sometimes F's. Neither is valid.
 
I am clocking externally, so I've got SCK=0 at the fall of CS_n.
OK.
As long and thorough as it appears, I still find Linear's spec very confusing. I assume the way to detect EOC_n is to put a pullup resistor on SDO and wait for it to fall low?
Does the part need to be clocked for the EOC_n to assert, or does it fall on its own?
No need to clock SCK to check for EOC_n. However, you have to bring CS_n -> 0 for EOC_n to assert its state (while keeping SCK low, of course).
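That sequence can be sketched as a tiny poll loop. This is only an illustration: the pin access is simulated (read_sdo is a stand-in for reading the real SDO pin, and the names are made up) so the logic can run anywhere.

```c
/* Simulated SDO pin: reads high for a few polls, then falls low
   on its own once the conversion completes. On real hardware this
   would read the GPIO wired to SDO, with CS_n already low and
   SCK held low. */
static int sdo_polls_left = 5;

static int read_sdo(void)
{
    return sdo_polls_left-- > 0;
}

/* Wait for end of conversion: no SCK edges are needed, just watch
   SDO while CS_n = 0 and SCK = 0. Returns how many polls it took. */
static int wait_for_eoc(void)
{
    int polls = 0;
    while (read_sdo())
        polls++;
    return polls;
}
```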
After CS_n=0, doesn't the part do an automatic conversion on CH0? So I wait for EOC_n so I can clock out this junk and clock in the desired conversion into SDI?
Yes.
Does the 2418 HAVE to clock in the SDI at the same time as the data clocks out, or can you hold SDI high and use the 1->0 transition to indicate the start of the request?

I haven't tried this but the datasheet states that holding SDI high should be avoided. The only acceptable values for the first 3 bits shifted in are 101, 100 and 000. You may hold SDI low to maintain the same channel selection.
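For reference, here is a sketch of how that input word might be assembled for a single-ended channel. The bit order and field meanings (start bit, 0, EN, SGL, ODD, A2..A0, with ODD = channel low bit and A2..A0 = channel/2) are my reading of the datasheet, so treat them as an assumption; ltc2418_cmd is a made-up helper name.

```c
#include <stdint.h>

/* Assumed LTC2418 input word, MSB first:
   1  0  EN  SGL  ODD  A2  A1  A0
   This builds the "load new single-ended channel" form
   (first three bits 101, i.e. EN = 1). */
static uint8_t ltc2418_cmd(uint8_t ch)           /* ch = 0..15 */
{
    uint8_t odd  = ch & 1u;                      /* channel low bit  */
    uint8_t addr = ch >> 1;                      /* channel / 2      */
    return (uint8_t)(0x80u          /* start bit                    */
                   | 0x20u          /* EN: load new configuration   */
                   | 0x10u          /* SGL: single-ended            */
                   | (odd << 3)
                   | addr);
}
```

For CH7 this yields 0xBB, i.e. the bit sequence 1,0,1,1,1,0,1,1 that the code below clocks out.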

Finally, SCK should be a clean, rapidly transitioning signal. It should be less than 0.6V for low and as close to VCC as possible for high. If not, the data does not get shifted out correctly and you will get wildly erroneous results.
 
Oznog said:
Is it necessary to do a CS_N=1 between giving the conversion request and awaiting the result?

AFAIK, you may hold CS_n=0 after bringing it low, while keeping SCK low. I remember I tested it once, but my software normally deasserts CS_n for one clock period after shifting out the result, just to "remind" the chip that it is externally clocked. In my case there is no cost to doing this because it has to be done at least once.
 
Here's my current implementation:
Init:
CS_N=1;SCK=0;SDI=1;
delay;
CS_N=1;

ltcGetAndSet():
while(SDO); //loop until EOC
for(i=0;i<32;i++){
    //Request single-ended conv of ch7: bits 1,0,1,1,1,0,1,1 then 1's
    switch(i){
        case 0: SDI=1; break;
        case 1: SDI=0; break;
        case 2: SDI=1; break;
        case 3: SDI=1; break;
        case 4: SDI=1; break;
        case 5: SDI=0; break;
        case 6: SDI=1; break;
        case 7: SDI=1; break;
        default: SDI=1; break;
    }
    //shift in result (SDO sampled before the rising SCK edge)
    result=result<<1;
    result+=SDO;
    SCK=1;
    delay;
    SCK=0;
    delay;
}

It hangs on the wait for EOC after the first conversion. So I added a CS_N=1 at the end of the ltcGetAndSet routine, and a CS_N=0 at the beginning (SCK is guaranteed to be low at the time). Then I got some results, but it's junk:

It returns one of two things over and over. Most common has the last 16 bits==0xffff, which is not possible since the channel addr, 011, is in those bits. I'd see 1F, 3FFF, etc in the highest 16 bits.
Then there's another answer I'd get that has all the high & low bits as 0.

I applied 0V and 5V to the inputs but it doesn't change the behavior.
 
Oznog said:
Here's my current implementation:
Init:
CS_N=1;SCK=0;SDI=1;
delay;
CS_N=1; <-- Is this correct?
}

Shouldn't the last line above be CS_N=0, to enable reading the status of EOC?

IMHO, you don't have a clean SCK. Can you post the schematics?
 
Yes, that's a typo, the code sets CS_N=0.

SCK=LC7
SDO=LB4
CS_N=LB5

Vdd,VRef+=5V
Gnd, Fo, Vref-=0V

The signal wires are no longer than 1.5". There's a ceramic bypass cap from Vdd-Gnd right next to the chip.

Looking at SDO and SCK on a scope, they look consistent with the readout. There are a number of transitions in the first bits, but SDO always =1 after at most the first 16 bits.
 
Huh. Looking at the pin #defines in my C code, I did use the LB4, the latch value, for SDO. Now I've never been sure I fully understood the difference. I have an output value now.

I stripped the last 6 bits of channel & parity, and get these values:
Applying 5V gives 0xC00000. It's a totally static value with no noise. Since bit 23 is a sign bit indicating 'positive', stripping it leaves 0x400000, or 4194304.

Grounding the input gave 0x801AD1, the exact value changes every sample. Stripping the sign bit gives 0x001ad1, or 6865.

So the value from giving gnd is small and looks like it'd be zero if not for ground noise. But why is the value when applying 5V so far from the 0x7fffff I'd expect??? And why is it reading a totally static value? There should logically be some noise in it.
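A sketch of how the field-stripping above might look in code. The assumed layout of the 32-bit word (EOC, dummy, sign, 23 data bits, then 6 channel/parity bits) is my reading of the datasheet, and the helper names are mine:

```c
#include <stdint.h>

/* Assumed LTC2418 output word, MSB first:
   EOC  DMY  SIG  D22..D0  CH5..CH0
   Dropping the low 6 bits and masking to 24 bits leaves the
   sign (SIG) in bit 23 with the 23 data bits below it. */
static uint32_t strip_channel(uint32_t raw)
{
    return (raw >> 6) & 0xFFFFFFu;   /* SIG + 23 data bits */
}

static int sign_positive(uint32_t code24)
{
    return (code24 >> 23) & 1u;      /* 1 = input >= 0 */
}

static uint32_t magnitude(uint32_t code24)
{
    return code24 & 0x7FFFFFu;       /* drop the sign bit */
}
```

Applied to the readings above: 0xC00000 has the sign bit set and magnitude 0x400000, and 0x801AD1 has the sign bit set and magnitude 0x1AD1.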
 
Oznog said:
Huh. Looking at the pin #defines in my C code, I did use the LB4, the latch value, for SDO. Now I've never been sure I fully understood the difference. I have an output value now.

I stripped the last 6 bits of channel & parity, and get these values:
Applying 5V gives 0xC00000. It's a totally static value with no noise. Since bit 23 is a sign bit indicating 'positive', stripping it leaves 0x400000, or 4194304.

Grounding the input gave 0x801AD1, the exact value changes every sample. Stripping the sign bit gives 0x001ad1, or 6865.

So the value from giving gnd is small and looks like it'd be zero if not for ground noise. But why is the value when applying 5V so far from the 0x7fffff I'd expect??? And why is it reading a totally static value? There should logically be some noise in it.

The LTC2414/2418 is a bipolar converter: it can convert positive as well as negative voltages (with respect to the common or complement pin). If the voltage reference is 5V, the maximum voltage that will give a positive ADC reading is +2.5V (again, with respect to the common or complement pin), and the minimum voltage that will give a negative ADC reading is -2.5V. No input, however, should exceed the VCC and GND rails.

You may have connected the common pin to GND. This limits the range of conversion from +2.5V to 0V. Any voltage above +2.5V will give an output of 0xC00000.
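Under that reading (a ±0.5·VREF span around the common pin, with 23 magnitude bits plus a sign), converting a stripped 24-bit code to volts might look like the sketch below. The function name and the exact encoding of negative codes are my assumptions, not taken from the datasheet:

```c
#include <stdint.h>

/* Assumed scaling: with VREF = 5 V the span is -2.5 V..+2.5 V
   around COM, and the 23 data bits cover 0.5 * VREF (2^23 steps). */
static double ltc2418_code_to_volts(uint32_t code24, double vref)
{
    int positive = (code24 >> 23) & 1u;            /* SIG bit       */
    double mag   = (double)(code24 & 0x7FFFFFu);   /* 23-bit value  */
    double volts = mag / 8388608.0 * (vref / 2.0); /* 8388608 = 2^23 */
    return positive ? volts : volts - (vref / 2.0);
}
```

By this formula the grounded-input reading above (magnitude 0x1AD1) comes out to roughly 2 mV, consistent with ground noise.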
 
That was it. I didn't realize single-ended conversions couldn't go over 2.5V unless the common connection had a bias. This sucks for part of my project; I wish they'd put that somewhere at the top of the spec sheet instead of burying it in the middle.

So I have it working, but I can't reselect the channel. Whatever I select first persists despite repeated cycles. And yes, it's setting the EN bit in the request; in fact it's using the same routine as the initial selection. The subroutine shifts out the 8 channel-select bits on SDI while clocking in the 32 data bits; it just tosses the data in the first case, since it's based on the default power-up channel.

Any ideas??
 
Never mind. I looked at SDI vs SCK on the scope and the code is not doing what it is supposed to do. I think I'm learning to hate HTSOFT PIC18; I've reviewed and debugged the C side of it and I see nothing wrong. But then it wouldn't be the first time I accused a software package of what turns out to be a stupid user error, though I don't think so this time.
 