I'm still not completely sure how to interpret this. 3% of 20 mA is 0.6 mA, which in raw ADC counts is about 30 (out of 1024). When I look at the measured analogue input, its value is quite constant. So, and perhaps this conclusion is wrong, the 30 counts are not an error that varies from measurement to measurement. Is it an error between devices instead (unit-to-unit variation)?
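To make the arithmetic explicit, here is a minimal sketch of the conversion, assuming a 10-bit ADC (1024 counts) whose full scale corresponds to 20 mA; both values are taken from the numbers in the post, not from any datasheet:

```python
# Map a 3% full-scale tolerance on a 20 mA loop to raw ADC counts.
# Assumptions: full scale = 20 mA, 10-bit ADC = 1024 counts.
FULL_SCALE_MA = 20.0   # assumed full-scale current
ADC_COUNTS = 1024      # assumed 10-bit resolution
TOLERANCE = 0.03       # 3% of full scale

tolerance_ma = TOLERANCE * FULL_SCALE_MA         # 0.6 mA
counts_per_ma = ADC_COUNTS / FULL_SCALE_MA       # 51.2 counts per mA
tolerance_counts = tolerance_ma * counts_per_ma  # about 30.7 counts

print(f"{tolerance_ma} mA is about {tolerance_counts:.1f} counts")
```

So a 3% full-scale error band is roughly 30 counts wide, regardless of whether the error shows up as noise (varying readings) or as a fixed offset/gain error (constant readings that are simply off).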