The timing period has to be exact, whatever you decide on. That is the entire basis of the concept. Basing it on scan time is not sufficient.
This is why the Vision version uses an interrupt: it fires at the exact interval, cutting into the scan if necessary, so the timing base never drifts.
Someone else can perhaps comment on your logic, as I don't use UniStream. But your deviation observations match what the basic timing error will produce: it grows proportionally as the shaft spins faster.
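To put rough numbers on it: if the counter gains one tick per scan but the logic treats each tick as exactly 1 ms, the reported RPM is just the true RPM scaled by the real scan time, so the absolute error grows with speed. A quick Python sketch (the 1.3 ms scan time is only an illustrative guess, not a measured value):

```python
def measured_rpm(true_rpm, actual_scan_ms, assumed_tick_ms=1.0):
    """RPM reported by a scan-counting approach.

    The counter gains one tick per scan; the logic assumes each tick
    is assumed_tick_ms long, but a scan really takes actual_scan_ms.
    """
    true_period_ms = 60000.0 / true_rpm       # real time between pulses
    ticks = true_period_ms / actual_scan_ms   # ticks accumulated per revolution
    assumed_period_s = ticks * assumed_tick_ms / 1000.0
    return 60.0 / assumed_period_s            # = true_rpm * (scan / assumed tick)

# The error scales with speed: 31 -> ~40.3, 62 -> ~80.6, 124 -> ~161.2
for rpm in (31, 62, 124):
    print(rpm, round(measured_rpm(rpm, actual_scan_ms=1.3), 1))
```

Note the reported/true ratio is constant, which is exactly why the deviation looks small at low speed and large at high speed.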
I did something similar in UniLogic but I am not getting accurate results. Attached is the program itself.
On the first rung I am accumulating an MI that I am assuming increments at a 1 ms rate, based on the scan time.
On the second rung my proximity pulse clears the first MI and stores its value into a second MI.
Then I convert the accumulated MI value into milliseconds to get the period, and then into seconds for the final RPM calculation (RPM = 60/T).
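For anyone who wants to check the math outside the ladder, those rungs boil down to this (a Python sketch; the function name and the 1 ms-per-tick assumption are mine):

```python
def rpm_from_count(tick_count, ms_per_tick=1.0):
    """Turn the latched counter value into RPM.

    tick_count : value captured from the accumulating MI at each pulse
    ms_per_tick: assumed length of one increment (1 ms in my program)
    """
    period_s = tick_count * ms_per_tick / 1000.0  # period T in seconds
    return 60.0 / period_s                        # RPM = 60 / T

# e.g. a latched count of 1500 assumed-1ms ticks reads as 40 RPM
```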
My test rig is a motor rotating a shaft at a steady speed, with a proximity sensor giving me pulses off the shaft. Testing with various instruments I know I have 31 RPM, but my program outputs 40 RPM. I also noticed that the deviation is larger at higher speeds and smaller at lower speeds.
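Working backwards from my two readings, and assuming the MI really does increment once per scan rather than once per millisecond, the numbers imply a scan time of roughly 1.3 ms (Python just to show the arithmetic):

```python
actual_rpm = 31.0    # what the instruments say
reported_rpm = 40.0  # what my program says

# A 40 RPM reading means the latched count was 60/40 s worth of "1 ms" ticks:
reported_count = 60.0 / reported_rpm * 1000.0        # 1500 ticks
# But the true time between pulses at 31 RPM is:
actual_period_ms = 60.0 / actual_rpm * 1000.0        # ~1935 ms
# So each tick (i.e. each scan) really took about:
implied_scan_ms = actual_period_ms / reported_count  # ~1.29 ms, not 1 ms
# That ratio is the same at every speed, so the absolute deviation
# grows as the shaft spins faster (at 310 RPM I'd read ~400):
error_ratio = reported_rpm / actual_rpm              # ~1.29
```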