Hi there! I'm currently working on a project that deals with temperature-controlled water and have been testing T-type, grounded-tip thermocouples on a V700 with V200-18-E4XB snap-in I/O. I've been getting some results I'm struggling to reconcile and was wondering if anyone had some insight...
I noticed there was an offset between the actual temperature and the thermocouple reading. I could add this offset in logic, but after testing across my measured range (15 deg C to 60 deg C), I found the offset is not constant. At the low end (20 deg C) the offset is about 5 deg C, and by the time you reach 40 deg C it has shrunk to about 2 deg C. A simple answer would be to program a different offset for different ranges, but that seems like a weak solution that I would not feel confident with. I understand there is roughly a 30 min cold-junction warm-up time, and all my data was collected well after that window. I also understand that thermocouples have a limit to their accuracy, but I don't think I'm asking too much of them yet.
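For what it's worth, rather than a handful of fixed per-range offsets, a piecewise-linear correction interpolated between a few calibration points handles a non-constant offset like this more gracefully. Here's a minimal sketch of the idea (not PLC ladder code, just the math). The 20 deg C and 40 deg C offsets come from the numbers above; the 60 deg C point is an assumed placeholder you'd replace with your own measurement:

```python
from bisect import bisect_right

# (raw reading in deg C, offset in deg C to ADD to that reading).
# First two pairs are from the measurements described above; the
# 60 deg C pair is hypothetical -- calibrate your own table.
CAL_POINTS = [(20.0, 5.0), (40.0, 2.0), (60.0, 2.0)]

def corrected(raw: float) -> float:
    """Apply an offset linearly interpolated between calibration points."""
    xs = [p[0] for p in CAL_POINTS]
    if raw <= xs[0]:          # below the table: clamp to first offset
        return raw + CAL_POINTS[0][1]
    if raw >= xs[-1]:         # above the table: clamp to last offset
        return raw + CAL_POINTS[-1][1]
    i = bisect_right(xs, raw) - 1      # segment containing `raw`
    x0, y0 = CAL_POINTS[i]
    x1, y1 = CAL_POINTS[i + 1]
    offset = y0 + (y1 - y0) * (raw - x0) / (x1 - x0)
    return raw + offset

print(corrected(30.0))  # midway between the 20 and 40 points -> 33.5
```

The same interpolation is easy to express in ladder/structured text with a small lookup table, and adding a calibration point is just one more row rather than another branch of logic.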
I've heard of people using thermocouple transmitters, which can be tuned and then fed to an analog input, but that seems wasteful since the PLC already has direct T/C reading capability. Maybe that's the only way to get the accuracy I want, though?
Lastly, I read in the instruction manual that every attached thermocouple adds 100 ms to my read time, and I just want to make sure I'm understanding this correctly: if I were to use 4 T/Cs, my read time would be 400 ms? And if that's the case, I'm wondering if there's any way to improve it.
These are relatively new waters for me, so any help with the above issues would be greatly appreciated!