Interact Posted June 15, 2012

I was reading the specifications for a Jazz (this one: http://www.unitronic...SPEC_ 06-06.pdf). On page 2 it states that the precision is 3%. 3% of what?
AlexUT Posted June 17, 2012

Hi,

It means ±3% of full scale (20 mA or 10 V).

B.R.
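A quick numeric sketch (not from the datasheet, just arithmetic) of what a ±3% full-scale error band works out to for the ranges mentioned above:

```python
# Hypothetical illustration: convert a percent-of-full-scale spec into an
# absolute error for the ranges discussed in this thread.

def error_band(full_scale, percent=3.0):
    """Return the absolute error corresponding to percent of full scale."""
    return full_scale * percent / 100.0

# 20 mA current input: 3% of full scale is 0.6 mA
print(error_band(20.0))          # 0.6
# 10 V voltage input: 3% of full scale is 0.3 V
print(error_band(10.0))          # 0.3
# In raw 10-bit counts (0..1023): roughly 31 counts
print(round(error_band(1023)))   # 31
```

So the "3%" is relative to the whole input span, not to the current reading.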
Interact Posted June 20, 2012

I'm still not completely sure how to interpret this. 3% of 20 mA is 0.6 mA, or about 30 raw counts (out of 1024). If I look at the measured analog input, its value is quite constant. So, and perhaps this conclusion is wrong, the 30 counts is not an error that varies from measurement to measurement. Is it an error between units?
Emil Posted June 20, 2012

Hi,

This means that if the absolutely precise value, measured with the most precise instruments, is, let's say, 500, the controller can show 500 ±30. Or, if you know for sure that your value is 500 but you read, for example, 471 or 529, the reading is still legal and within the limits of the error.

Usually analog inputs are calibrated individually in the factory and give higher precision than stated.

Small tip: if you need higher precision, you can use Linearization in Ladder to correct/calibrate the reading.
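The Linearization idea above is a two-point linear mapping. Here is a sketch of the math in Python (not Unitronics code; the calibration points are assumed values measured against a reference instrument):

```python
# Two-point linearization: map a raw reading onto a corrected value using
# two calibration points (x1 -> y1, x2 -> y2). This is the same idea as the
# Ladder "Linearization" function, expressed as plain arithmetic.

def linearize(raw, x1, y1, x2, y2):
    """Map raw value linearly from span (x1..x2) to (y1..y2)."""
    return y1 + (raw - x1) * (y2 - y1) / (x2 - x1)

# Assumed calibration: the PLC reads 480 when the true value is 500,
# and 960 when the true value is 1000.
print(linearize(480, 480, 500, 960, 1000))  # 500.0
print(linearize(720, 480, 500, 960, 1000))  # 750.0
```

Note this only corrects a systematic (repeatable) offset and gain error; it cannot remove random reading-to-reading noise.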
Interact Posted June 20, 2012

But is it a device-specific deviation? If it reads 480 (for a 500 signal), does that mean it will read 480 every time? Because, as said, that is simply a matter of calibration. However, if the same device gives 480 one time and 520 the next, then I have a problem...
Joe Tauser Posted June 21, 2012

Here is a nice web page by an instrumentation engineer that explains the actual meaning of the terms: http://www.tutelman....e/precision.php

The terms used to define an instrumentation input usually start with accuracy, and then sometimes throw in linearity and repeatability for good measure. By definition, ±3% precision means the reading can vary 30 counts from instance to instance of the same electrical stimulus. I know from experience that Unitronics analog inputs are much better than this. It is really a question for the Creators to explain how good the analog inputs are using more specific terminology.

I think what Emil was explaining was accuracy, which means you may be off from the mark, but you need to know the repeatability to know whether you will be off by the same amount every time.

Rebuttal?

Joe T.
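The accuracy/repeatability distinction can be made concrete with a few invented sample readings (the numbers below are hypothetical, not measured on a Jazz):

```python
# Illustrative only: accuracy is how far the AVERAGE reading sits from the
# true value (systematic error); repeatability is how much readings scatter
# around their own average (random error).

from statistics import mean, pstdev

true_value = 500
readings = [478, 480, 481, 479, 482, 480]  # assumed samples, same stimulus

avg = mean(readings)
offset = avg - true_value   # accuracy error (systematic, calibratable)
spread = pstdev(readings)   # repeatability (random, not calibratable)

print(round(avg, 1))     # 480.0
print(round(offset, 1))  # -20.0
print(round(spread, 2))  # 1.29
```

In this invented case the device is 20 counts inaccurate but highly repeatable, so a one-time calibration (like the Linearization tip above) would fix it, which is exactly the distinction Interact is asking about.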
Transfo Posted September 19, 2012

Hi,

New problem with the last two deliveries of JZ10-11-T17, SN: MG1100411 and MG1100155 (new hardware?). I apply the same signal to AI2 and AI3: 4 VDC with ±7 mV ripple at 50 Hz. Analog input 2 is unstable at ±1% or more, while AI3 is stable to ±1 digit. That makes the reading unusable. It is a bit better with the filter set to High, but just slower; the amplitude is about the same. Is this a sign of a fragile converter? (I have to ship to China.) If this is normal, can you supply a ladder program example to filter it?

Thanks for the help,
Daniel
AlexUT Posted September 19, 2012

Hi Daniel,

To make the discussion more productive, I suggest using the "shortest path first" method.

The ADC (Analog to Digital Converter) uses successive approximation. The conversion time is 20 ms, which means the input voltage should stay constant during the conversion.

I used a V350 to generate an output voltage, which I supplied to JZ10-11-T17 AI2 and AI3.

Input of 5.0 V measured with a Fluke:
AI2 shows 511/512/513 (no filter)
AI3 shows 512/513 (no filter)

Input of 10.0 V measured with a Fluke:
AI2 shows 1020 (sometimes), 1022/1023 (no filter)
AI3 shows 1020/1022 (no filter)

It seems that your problem is caused by the ripple. If you have no other PLC to generate Vout, use a battery for the test. You can use the Low/Medium/High filter in the Hardware Configuration, or the Special Function "Find Mean, Maximum, and Minimum Values".

* You did not mention your application. Knowing it will help to give you more specific recommendations. You can send your request directly to support@unitronics.com

BR
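As a sketch of what filtering along the lines of the "Find Mean, Maximum, and Minimum Values" function does (the sample values below are invented, and this is plain Python, not Ladder):

```python
# Software filtering sketch: sample the analog input N times, then report
# the minimum, maximum, and mean. Averaging suppresses symmetric ripple
# noise, which is why a mean filter helps with a 50 Hz ripple problem.

def min_max_mean(samples):
    return min(samples), max(samples), sum(samples) / len(samples)

samples = [511, 512, 513, 512, 511, 512, 513, 512]  # assumed raw counts
lo, hi, avg = min_max_mean(samples)
print(lo, hi, avg)  # 511 513 512.0
```

In Ladder the equivalent would be to store successive readings in a vector and run the function block over it each scan or on a timer.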
Transfo Posted September 19, 2012

With the inputs short-circuited to 0 V:
AI2 shows 0/2/4 (no filter)
AI3 shows 0 (no filter)
AI2 shows 0/2 (high filter)

What is the resolution of the converter? Sometimes it seems to be 1/1024 and sometimes 2/1024.

Where do I find the Special Functions? My application is an HV strength tester, 5000 V / 500 mA.
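For reference on the resolution question, the nominal step size of a 10-bit converter over a 10 V span works out as follows (a sketch; the exact span-to-count mapping of the Jazz input is an assumption here):

```python
# Nominal resolution of a 10-bit ADC over a 10 V span: about 9.8 mV per
# count, so only a few millivolts of ripple is enough to toggle the last
# bit and make the reading jump by one or two counts.

def volts_per_count(span_v=10.0, counts=1024):
    return span_v / counts

step = volts_per_count()
print(round(step * 1000, 2))  # 9.77  (mV per count)
```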