Needing more than a spark test?

The really important thing is where the return current for the 3.3V goes. What is the 3.3V LDO powering? If it is DVDD on pin 19, it should not be. The low end of DVDD is connected to the 0V GND plane used by the analog circuitry and the ADC's internal digital section. That plane had better be as clean as the driven snow and must not return to the Teensy, and as I read it, DVDD is never 3.3V. If the 3.3V does hang on a regulator fed from the analog 5V supply, and it supplies pin 19 while returning to a Teensy connected to pin 17, then I think we are likely to have trouble.

Taking power from the 5V supply to make 3.3V for pin 18 is almost worse, because the 0V end of the 5V supply then has to find its way to OGND on pin 17, defeating the whole point of having a pin 17. I get that the digital supply has to come from somewhere, and there is nothing wrong with regulating a computer's digital 5V down to 3.3V. The problem is contriving an analog supply that does not borrow from it. This is a problem for me as well, and I may resort to a battery for the analog supply.

[Edit: Checking the data sheet, DVDD on pin 19 is specified as 4.75V to 5.25V. It is never 3.3V.]

If the 3.3V choice had anything to do with interfacing with the Teensy, then it is not right, and 3.3V should not be applied to pin 19. That supply does need to be isolated from AVDD by 20Ω, and it is never 3.3V. If that is what you have, it might be the reason the internal digital circuits that return to DGND are not operating properly.

The Teensy interface is conveniently allowed to be 3.3V or 5V. Whichever it is, it is applied on pin 18, returns from pin 17, and must not have any common return path that finds its way to DGND. This is the only way the ADC can avoid adding the noise and offset of the Teensy digital return.
The 3.3V is exclusively for the digital outputs so they're compatible with the T4's max-input voltage limitation -- its pins are NOT 5V-safe.

I've been looking more closely at the ISR code and started wondering about the delays needed to access the data properly. There is a 45nS delay between CNV\ | RD\ going LOW and stable data-out levels, and the same delay applies between BYTESWAP and valid data. I've been generating it with a bunch of "mov rx,rx" instructions -- they should be harmless -- but a little benchmarking program I wrote suggests the compiler figures out that they aren't really doing anything and optimizes them out. If the delays aren't long enough for the data lines to settle, that could explain a lot.
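
For anyone following along, this is the general idea -- a minimal sketch, not the actual ISR code, of how the padding can be written so GCC can't throw it away (an "asm volatile" statement is treated as having side effects, so it survives optimization):

Code:
// Minimal sketch (not the real ISR code): a short settling delay that the
// compiler won't optimize out.  Each NOP is nominally ~1.67nS at 600MHz, but
// the Cortex-M7 can dual-issue, so the real delay may be shorter than a simple
// instruction count suggests -- worth checking against the cycle counter.
static inline void settleDelay(void) {
  asm volatile(
    "nop \n nop \n nop \n nop \n"
    "nop \n nop \n nop \n nop \n"
    ::: "memory");   // "memory" clobber keeps surrounding loads/stores from moving across it
}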

I've been experimenting with other approaches to obtaining delays, and about the best I've done so far is to use the processor's cycle counter, which runs at 600MHz, but there are still some odd results that suggest the compiler is doing clever stuff there, too. Or perhaps it's the processor's ability to execute multiple instructions in parallel when it can. Puzzling this one over, that's for sure.
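
This is the shape of the cycle-counter version I've been trying -- a sketch assuming the Teensy 4 core's ARM_DWT_CYCCNT name for the DWT cycle counter and the 600MHz clock (so 45nS is about 27 cycles):

Code:
// Sketch of a busy-wait on the DWT cycle counter (assumes the Teensy 4 core
// exposes it as ARM_DWT_CYCCNT and enables it at startup).
static inline void delayCycles(uint32_t n) {
  uint32_t start = ARM_DWT_CYCCNT;
  while ((uint32_t)(ARM_DWT_CYCCNT - start) < n) { }  // unsigned math handles counter wrap
}

// e.g. delayCycles(27);  // ~45nS at 600MHz, plus a few cycles of read/loop overhead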
 
Teensyduino does have a delayNanoseconds() function, but my benchmarking has shown that it isn't at all accurate in the time frame I'm interested in. delayNanoseconds(16) actually gives me a bit more than a 40nS delay. The delay is highly quantized as well, but that's not surprising since the CPU cycle counter is "only" running at 600MHz, about 1.67nS/cycle. The loop overhead clearly increases the delay time as well.
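
For reference, this is roughly how I've been measuring it -- timing a single call with the same cycle counter (again assuming ARM_DWT_CYCCNT and F_CPU_ACTUAL from the Teensy 4 core; the measurement itself adds a few cycles, so treat the number as an upper bound):

Code:
// Rough benchmark of delayNanoseconds(16) on a Teensy 4.x.
void setup() {
  Serial.begin(115200);
  while (!Serial) { }
  uint32_t t0 = ARM_DWT_CYCCNT;
  delayNanoseconds(16);
  uint32_t cycles = ARM_DWT_CYCCNT - t0;
  uint32_t ns = (uint32_t)((uint64_t)cycles * 1000000000ull / F_CPU_ACTUAL);
  Serial.print("delayNanoseconds(16) took ");
  Serial.print(cycles);
  Serial.print(" cycles (~");
  Serial.print(ns);
  Serial.println(" nS)");
}

void loop() { }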
 
Yes - I get that. That is why the 3.3V is given to pin 18, so the Teensy never sees anything higher. It stays safe!
The difficulty is that, via the 3.3V regulator, it is derived from the 5V that supplies the OVA, and that supply must not get connected, via its negative-end return, to pin 17, which is connected to the Teensy.
 
Of course, the LDO HAS to connect to the common ground at some point; but it looks like I connected the LDO's ground pin to AGND, not DGND. Same for its output bypass capacitors. Still, right now I'm fighting those timing issues to make the ADC happy, so it's hard to tell exactly what the largest contributor to the apparent noise/offset issue is. I do know that messing around with the timing really moves the noise level around.

I think I need to fire up my oscilloscope to do some troubleshooting. So far I've just been looking at the data coming out of the ADC... with the exception of a bit of checking with my DVM to see if there's something _really_ bad going on. So far so good there.
 
Like trying to follow a race car...

It doesn't seem that Mouser carries the 3.3V LDO under that SC662K-3.3V part number (or anything like it). I do have some XC6206 3.3V LDOs in SOT23-3 that look like they would fit; they share the same pinout.

Seems like I should wait a little, at least until you figure out the timing issues. Have plenty to do - have to learn how to make new parts in KiCAD.
 
You are way ahead of us in winkling out the stuff so that we don't have to trip over it as well.
When I did this at work, we had the luxury of isolated, independent power supplies. We never tried to make a digital interface supply push its return currents, common mode, through an analog ADC path. This whole thing has me revising my design to somehow separate the analog supply.

I take it that the LDO is supplying the digital interface at pin 18. The 3.3V digital-supply LDO does not have to connect to the analog ground unless it takes its input from the analog 5V, and that is something ADC designers strive to prevent. The entire top of the connection diagram is all about separating the analog supply (5V) on the left from the digital (interface) supply (5V or 3.3V) on the right. Where the ADC absolutely has to set the "common" point, it is carefully made at the DGND pins 20 and 30, and it does not share with the computer interface powering arrangements.

In implementing ADCs, there is no concept of a "common" ground between the analog and the digital (interface) sides. There is, of course, one carefully controlled common point internal to the chip, at DGND, but that is not the same thing as OGND. The way the numbers end up in registers internal to the ADC is designed to get them there with as little energy as possible, using DGND. The computer interface, whose return currents go to OGND at pin 17, can be quite rowdy and run at high speed without contaminating the analog GND.

To be able to read analog signals, especially the low-level parts of the waveform, the analog GND must not have to carry the common-mode clocking and other digital return currents that connect to a computer.

Yes - I also have to figure out a way to do this, involving a common-mode choke and filter.
 
UK Postal Cyber Attack!
Yep - it kinda messed me up! My apologies to Bruce. :(

https://www.express.co.uk/finance/city/1730466/cyber-attack-royal-mail-small-business-overseas-post

I have been soaking up much of what Analog Devices has published about interfacing to ADCs, and I believe my concerns are justified. The 0V end of the analog input is just as sensitive to noise as the main input. The PSRR (Power Supply Rejection Ratio) is impressive, but useless against bumping the low end of the input.

Working on some solutions.
Recognizing that we need the convenience of deriving one supply from another, I think I have a few solutions in mind. My ADC (AD7622) uses 2.5V, a step down for the analog side, whereas the AD7667 has a 5V analog supply. We need to power a 3.3V data interface without messing up the sensitive side that Linear Tech tried so hard to separate. Even so, both kinds can be accommodated.

One can prevent the data interface, and particularly the outside computer kit, from adding its return pulse currents and coupled noise into the sensitive analog plane. It is partly about track routing and split ground planes, partly about filtering. That is not as good as isolated supplies, but it goes a long way toward making the bottom 8 bits of the ADC meaningful. Alternatively, one can go for properly isolated supplies, which in our case might be a tiny, high-frequency switcher - high frequency because it makes the filtering components also tiny, and more effective.

Switchers
The switcher used in the Pocket Geiger was horrible, but the concept was not. It turns out that, provided the frequency is high enough and the required current low enough, they can be had very clean, particularly when followed by a low-noise, high-bandwidth regulator. In our case the current requirement is milliamps, not amps, so we can aspire to something low cost and effective. I am urgently working on this.

I am learning so much in attempting this, but of course, I want it done, and I want to catch up.
For this phase of testing, it may help to use a temporary bench supply for the 3.3V, returning only to pin 17, and get on with solving the issues with clocking out the data.
 


Something like this might help resolve some of the issues we're dealing with.
 
A USB isolator? Or are you thinking general optical isolation? Or did I miss the point?

I don't think it's necessary; if it were, wouldn't places like ADI recommend it?
 
They probably don't think anyone would be so foolish as to use a USB connection to provide power and signal transfer to/from a 16-bit ADC :rolleyes: .

Really, the only reason for having a USB connection is to transfer data and, in the case of developing Teensy code, to program the processor.
 