The 3.3V is exclusively for the digital outputs so they're compatible with the T4's max-input voltage limitation -- its pins are NOT 5V-safe. The critical question is where the return current for that 3.3V goes. What is the 3.3V LDO powering? If it is DVDD on pin 19, it should not be. The low end of DVDD is connected to the 0V GND ground plane used by the analog section and the ADC's internal digital circuitry. That plane had better be clean as the driven snow and must not return to the Teensy -- and as I read it, DVDD is never 3.3V. If the 3.3V comes from a regulator fed by the analog 5V supply, and it supplies pin 19 while returning to a Teensy connected to pin 17, then I think we are likely to have trouble.
Taking from the 5V supply to make a 3.3V for pin 18 is almost worse, because the 0V end of the 5V supply has to find its way to OGND on pin 17, defeating the whole point of having a separate pin 17. I get that the digital supply has to come from somewhere, and there is nothing wrong with regulating a computer's digital 5V down to 3.3V. The problem is contriving an analog supply that does not borrow from it. This is a problem for me too, and I may resort to a battery for the analog supply.
[Edit: Checking the data sheet, DVDD on pin 19 is specified as 4.75V to 5.25V. It is never 3.3V.]
If the 3.3V choice had anything to do with interfacing with the Teensy, then it's not right: 3.3V should not be applied to pin 19. That supply does need to be isolated from AVDD by 20Ω, and it is never 3.3V. If that is what you have, it might be why the internal digital circuits that return to DGND are not operating properly.
The Teensy interface is conveniently allowed to be 3.3V or 5V. Whichever it is, it is applied on pin 18, returns via pin 17, and must not share any common return path with DGND. This is the only way the ADC has of keeping the noise and offset of the Teensy's digital return out of the measurement.
I've been looking more closely at the ISR code and started wondering about the delays needed to access the data properly. There is a 45 ns delay between CNV\ | RD\ going LOW and stable data-out levels, and the same delay applies between BYTESWAP and valid data. I've been generating it with a string of "mov rx,rx" instructions -- they should be harmless -- but a little benchmarking program I wrote suggests the compiler figures out that they aren't really doing anything and optimizes them out. If the delays aren't long enough for the data lines to settle, that could explain a lot.
I've been experimenting with other approaches to generating delays, and about the best I've done so far is to use the processor's cycle counter, which runs at 600 MHz. But there still are some odd results which suggest the compiler is doing clever stuff there, too. Or perhaps it's the processor's ability to execute multiple instructions in parallel when it can. Puzzling this one over, that's for sure.