
Raw ADC data to volts conversion

PostPosted: Wed May 17, 2017 5:01 am
by lvdev
Hi,
I'm developing an application in NI LabVIEW which acquires raw ADC data from a VT DSO 2A20E device and stores it in a file; the raw data then need to be converted to volts for further processing.
I'm aware that it is possible to acquire calibrated data directly, but I need to acquire the raw data first due to performance and data-storage constraints.
I have two questions:
1. What formula should be used to convert raw ADC data values to volts, based on the (ADC range-specific) Offset, Gain, Gain (dB), bit depth (and maybe something else)?
2. Is there a VT API function call I can use to query the device for its calibration data (i.e. Offset*, Mag* and DB*)? The vtDAQ API document doesn't seem to mention anything like that...
Thank you!

Re: Raw ADC data to volts conversion

PostPosted: Wed May 17, 2017 3:18 pm
by VirtinsTech
Thank you for your questions.

For LabVIEW development, vtDAQLV.dll and vtDAOLV.dll should be used. These DLLs can be found in the software's installation directory, under DAQDAOAPIs\TestDAQLabViewAndCVI.zip and DAQDAOAPIs\TestDAOLabViewAndCVI.zip respectively. The sample code and the API document can also be found there.

The pointer to the raw data array and the pointer to the calibrated data array can be passed to vtDAQLV.dll through the DAQLV_SetDAQData API (see the figure below; refer also to Section 2.4.5 of the vtDAQ and vtDAO API document).
[Figure: API.png - DAQLV_SetDAQData]


If the calibrated data are not needed, set CalibrationMode=0 and pCalibratedData=NULL.

The raw data returned by the vtDAQ API are independent of the device's internal calibration parameters such as Offset, Gain, Gain (dB), etc. The raw data can be converted to calibrated data based on only two parameters:

(1) SamplingBitResolution
(2) SamplingParameters.HighLimit - SamplingParameters.LowLimit

In other words, the voltage range (HighLimit - LowLimit) exactly spans the full range allowed by the bit resolution.
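
For a single sample this boils down to the minimal sketch below (a standalone illustration with assumed variable names, not a vtDAQ API function); the full per-channel conversion used inside the DLL is shown further below:

#include <math.h>

/* Minimal sketch (assumed names, not part of the vtDAQ API):
 * rawCentered        - raw sample already re-centered so that 0 = mid-scale
 * highLimit/lowLimit - voltage range of the selected ADC range, e.g. +1 V / -1 V
 * bitResolution      - SamplingParameters.SamplingBitResolution (8, 16, 24 or 32) */
double RawToVolts(double rawCentered, double highLimit, double lowLimit, int bitResolution)
{
    return rawCentered * (highLimit - lowLimit) / pow(2, bitResolution);
}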

Note that 8-bit data are stored in the record buffer as unsigned values, while 16-bit, 24-bit, and 32-bit data are stored as signed values. This conforms to the data format of a wave file, so the raw data can be stored in a wave file without any format conversion (refer to Section 2.1.3 of the vtDAQ and vtDAO API document).
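
For illustration, here is a minimal sketch of writing such a raw buffer to a canonical PCM wave file (an assumed helper, not part of the vtDAQ API; a little-endian host is assumed, and for 24-/32-bit data some readers expect a WAVE_FORMAT_EXTENSIBLE header instead of the plain PCM format tag used here):

#include <stdio.h>
#include <stdint.h>

/* Minimal sketch (assumed helper, not part of the vtDAQ API):
 * write a raw DAQ buffer to a canonical PCM wave file.
 * Assumes the buffer already uses the wave-file sample format described above
 * (8-bit unsigned; 16/24/32-bit signed) and a little-endian host. */
int WriteWavFile(const char *path, const char *buffer, uint32_t bufferBytes,
                 uint16_t channels, uint32_t sampleRate, uint16_t bitsPerSample)
{
    uint16_t blockAlign = (uint16_t)(channels * ((bitsPerSample + 7) / 8));
    uint32_t byteRate   = sampleRate * blockAlign;
    uint32_t riffSize   = 36 + bufferBytes;  /* "WAVE" + "fmt " chunk + "data" chunk header + data */
    uint32_t fmtSize    = 16;                /* size of the "fmt " chunk body for plain PCM */
    uint16_t formatTag  = 1;                 /* WAVE_FORMAT_PCM */
    FILE *f = fopen(path, "wb");

    if (f == NULL) return -1;
    fwrite("RIFF", 1, 4, f);  fwrite(&riffSize, 4, 1, f);  fwrite("WAVE", 1, 4, f);
    fwrite("fmt ", 1, 4, f);  fwrite(&fmtSize,  4, 1, f);
    fwrite(&formatTag,  2, 1, f);  fwrite(&channels,      2, 1, f);
    fwrite(&sampleRate, 4, 1, f);  fwrite(&byteRate,      4, 1, f);
    fwrite(&blockAlign, 2, 1, f);  fwrite(&bitsPerSample, 2, 1, f);
    fwrite("data", 1, 4, f);  fwrite(&bufferBytes, 4, 1, f);
    fwrite(buffer, 1, bufferBytes, f);
    fclose(f);
    return 0;
}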

The following C code shows what is done inside vtDAQLV.dll to convert the raw data to calibrated data.

//pd: Pointer to the calibrated data array
//DAQBuffer: Pointer to the raw data array
//BufferLength: Size of the raw data array in bytes
//SamplingParameters and CalibrationMode are module-level variables set elsewhere in the DLL

void DAQBufferToChannelData(double *pd, char *DAQBuffer, unsigned long BufferLength)
{
    long i, count, start;
    union CH4WORD2DWORD1 {char ch[4]; WORD w[2]; DWORD dw;} chw1;
    unsigned long nBlockAlign;
    double scale1, scale2;

    // Transfer data from the DAQ buffer to the channel data
    count = 0;
    // Number of bytes per sampling frame (one sample from each channel)
    nBlockAlign = SamplingParameters.SamplingChannels * ((SamplingParameters.SamplingBitResolution + 7) / 8);

    if (CalibrationMode == 1)  // interleaved
    {
        start = 0;
    }
    else                       // channel by channel
    {
        start = BufferLength / nBlockAlign;  // channel 2 data start after all channel 1 data
    }
    // Volts per count: full voltage span (2 x HighLimit, i.e. HighLimit - LowLimit for a
    // symmetric range) divided by 2^SamplingBitResolution
    scale1 = SamplingParameters.HighLimit[0] * 2 / pow(2, SamplingParameters.SamplingBitResolution);
    scale2 = SamplingParameters.HighLimit[1] * 2 / pow(2, SamplingParameters.SamplingBitResolution);

    switch (SamplingParameters.SamplingChannels)
    {
    case 1:
        switch (SamplingParameters.SamplingBitResolution)
        {
        case 8:
            for (i = 0; i < BufferLength; i += nBlockAlign)
            {
                // 8-bit data are stored unsigned: shift the mid-scale value 0x80 to zero, then scale to volts
                pd[count] = (unsigned char)DAQBuffer[i];
                pd[count] -= 0x80;
                pd[count] *= scale1;
                count++;
            }
            break;
        case 16:
            for (i = 0; i < BufferLength; i += nBlockAlign)
            {
                // Assemble the 16-bit sample; the +0x8000/-0x8000 pair recovers the signed value
                // through the unsigned WORD, then scale to volts
                chw1.ch[0] = DAQBuffer[i];
                chw1.ch[1] = DAQBuffer[i+1];
                chw1.w[0] += 0x8000;
                pd[count] = chw1.w[0];
                pd[count] -= 0x8000;
                pd[count] *= scale1;
                count++;
            }
            break;
        case 24:
            for (i = 0; i < BufferLength; i += nBlockAlign)
            {
                // Assemble the 24-bit sample in the low three bytes; the +0x800000 / &0xFFFFFF /
                // -0x800000 steps recover the signed value, then scale to volts
                chw1.ch[0] = DAQBuffer[i];
                chw1.ch[1] = DAQBuffer[i+1];
                chw1.ch[2] = DAQBuffer[i+2];
                chw1.dw = (chw1.dw + 0x800000) & 0xFFFFFF;
                pd[count] = chw1.dw;
                pd[count] -= 0x800000;
                pd[count] *= scale1;
                count++;
            }
            break;
        case 32:
            for (i = 0; i < BufferLength; i += nBlockAlign)
            {
                // Assemble the 32-bit sample; the +0x80000000/-0x80000000 pair recovers the
                // signed value through the unsigned DWORD, then scale to volts
                chw1.ch[0] = DAQBuffer[i];
                chw1.ch[1] = DAQBuffer[i+1];
                chw1.ch[2] = DAQBuffer[i+2];
                chw1.ch[3] = DAQBuffer[i+3];
                chw1.dw += 0x80000000;
                pd[count] = chw1.dw;
                pd[count] -= 0x80000000;
                pd[count] *= scale1;
                count++;
            }
            break;
        }
        break;
    case 2:
        switch (SamplingParameters.SamplingBitResolution)
        {
        case 8:
            for (i = 0; i < BufferLength; i += nBlockAlign)
            {
                // Channel 1: 8-bit data are stored unsigned; shift 0x80 to zero, then scale to volts
                pd[count] = (unsigned char)DAQBuffer[i];
                pd[count] -= 0x80;
                pd[count] *= scale1;
                if (CalibrationMode == 1) count++;   // interleaved: channel 2 goes into the next element (start == 0)

                // Channel 2: same, scaled with scale2
                pd[start+count] = (unsigned char)DAQBuffer[i+1];
                pd[start+count] -= 0x80;
                pd[start+count] *= scale2;
                count++;
            }
            break;
        case 16:
            for (i = 0; i < BufferLength; i += nBlockAlign)
            {
                // Channel 1: assemble the 16-bit sample, recover the signed value, then scale to volts
                chw1.ch[0] = DAQBuffer[i];
                chw1.ch[1] = DAQBuffer[i+1];
                chw1.w[0] += 0x8000;
                pd[count] = chw1.w[0];
                pd[count] -= 0x8000;
                pd[count] *= scale1;
                if (CalibrationMode == 1) count++;

                // Channel 2: same, scaled with scale2
                chw1.ch[0] = DAQBuffer[i+2];
                chw1.ch[1] = DAQBuffer[i+3];
                chw1.w[0] += 0x8000;
                pd[start+count] = chw1.w[0];
                pd[start+count] -= 0x8000;
                pd[start+count] *= scale2;
                count++;
            }
            break;
        case 24:
            for (i = 0; i < BufferLength; i += nBlockAlign)
            {
                // Channel 1: assemble the 24-bit sample, recover the signed value, then scale to volts
                chw1.ch[0] = DAQBuffer[i];
                chw1.ch[1] = DAQBuffer[i+1];
                chw1.ch[2] = DAQBuffer[i+2];
                chw1.dw = (chw1.dw + 0x800000) & 0xFFFFFF;
                pd[count] = chw1.dw;
                pd[count] -= 0x800000;
                pd[count] *= scale1;
                if (CalibrationMode == 1) count++;

                // Channel 2: same, scaled with scale2
                chw1.ch[0] = DAQBuffer[i+3];
                chw1.ch[1] = DAQBuffer[i+4];
                chw1.ch[2] = DAQBuffer[i+5];
                chw1.dw = (chw1.dw + 0x800000) & 0xFFFFFF;
                pd[start+count] = chw1.dw;
                pd[start+count] -= 0x800000;
                pd[start+count] *= scale2;
                count++;
            }
            break;
        case 32:
            for (i = 0; i < BufferLength; i += nBlockAlign)
            {
                // Channel 1: assemble the 32-bit sample, recover the signed value, then scale to volts
                chw1.ch[0] = DAQBuffer[i];
                chw1.ch[1] = DAQBuffer[i+1];
                chw1.ch[2] = DAQBuffer[i+2];
                chw1.ch[3] = DAQBuffer[i+3];
                chw1.dw += 0x80000000;
                pd[count] = chw1.dw;
                pd[count] -= 0x80000000;
                pd[count] *= scale1;
                if (CalibrationMode == 1) count++;

                // Channel 2: same, scaled with scale2
                chw1.ch[0] = DAQBuffer[i+4];
                chw1.ch[1] = DAQBuffer[i+5];
                chw1.ch[2] = DAQBuffer[i+6];
                chw1.ch[3] = DAQBuffer[i+7];
                chw1.dw += 0x80000000;
                pd[start+count] = chw1.dw;
                pd[start+count] -= 0x80000000;
                pd[start+count] *= scale2;
                count++;
            }
            break;
        }
        break;
    }
}
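
As a quick sanity check of the scaling, here is a tiny standalone program (assumed values of a +/-1 V range and 16-bit resolution; not part of the vtDAQ API):

#include <stdio.h>
#include <math.h>

int main(void)
{
    double highLimit = 1.0, lowLimit = -1.0;                   /* assumed +/-1 V range */
    int    bits      = 16;                                     /* assumed bit resolution */
    double scale     = (highLimit - lowLimit) / pow(2, bits);  /* same as HighLimit*2/2^16 above */
    short  raw       = 16384;                                  /* signed 16-bit raw sample */

    printf("%.3f V\n", raw * scale);                           /* prints 0.500 */
    return 0;
}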

Re: Raw ADC data to volts conversion

PostPosted: Thu May 18, 2017 1:58 am
by lvdev
Thank you very much for such a detailed and quick response!

So, just to make sure I understand it right: the hardware calibration seems to be applied to the signal before it is digitized by the ADC, so as long as the hardware calibration parameters are set correctly in the device, every voltage range is mapped exactly onto the "SamplingParameters.SamplingBitResolution" number of bits; thus the only thing left to convert the raw data to volts is the conversion factor derived from the range value and the bit depth of the raw data, correct?

Re: Raw ADC data to volts conversion

PostPosted: Thu May 18, 2017 9:47 am
by VirtinsTech
Yes, you are right. The hardware calibration parameters are stored inside the hardware device (not the computer) and are applied in the signal conditioning circuit before the signal is digitized by the ADC, in order to fully utilize the dynamic range of the ADC chip.

Re: Raw ADC data to volts conversion

PostPosted: Sat May 20, 2017 3:31 am
by lvdev
Thank you very much!