To measure a 4-20mA current signal, wire a resistor in series on the low side of the power to the device (power which may be supplied by the measuring device itself), and wire the analog input to the high side of the resistor (see diagram). Choosing the resistance value is a trade-off between measurement precision and the input voltage range of the device being powered (as well as the input range of the measuring device).
Most of our devices have (automatically selected) dual ranges of 0-5V and 0-30V. The 0-5V range is preferred, so to get the most precision, the resistance required would be 5V / 20mA = 250 ohms. The downside is that when the sensor is outputting 20mA (some sensors may also output a slightly higher current to signal a fault), there will be a 5V drop across the resistor, subtracted from the sensor's supply. So if the sensor was powered from 12V, it would now see 12V - 5V = 7V at its input, which may be outside its operating range. To remedy this, either the supply voltage can be increased (pay attention to the maximum input voltage rating of the sensor), or the resistance can be decreased.
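The sizing arithmetic above can be sketched as a short script. The supply voltage and full-scale current are the values from the example; treat them as illustrative, not as fixed recommendations.

```python
# Shunt sizing for a 4-20mA loop, per the reasoning above.
I_FULL_SCALE = 0.020  # A, full-scale loop current (20mA)
ADC_RANGE = 5.0       # V, preferred 0-5V analog input range

# Largest shunt that keeps the full-scale drop within the ADC range:
r_shunt = ADC_RANGE / I_FULL_SCALE  # 5V / 20mA = 250 ohms

# Voltage remaining at the sensor when powered from 12V at full scale:
supply = 12.0
v_sensor = supply - I_FULL_SCALE * r_shunt  # 12V - 5V = 7V

# Converting a measured shunt voltage back to loop current (Ohm's law):
def loop_current(v_measured, r=r_shunt):
    return v_measured / r

print(r_shunt)            # 250.0 ohms
print(v_sensor)           # 7.0 V left for the sensor
print(loop_current(1.0))  # 0.004 A, i.e. 4mA = bottom of scale
```

If 7V is too low for the sensor, re-run the same arithmetic with a smaller `r_shunt` (e.g. 100 ohms gives a 2V full-scale drop), accepting that only part of the 0-5V range is then used.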