Wave-union-based time-to-digital converters are one of the few emerging technologies that allow event time tagging in the range of a few picoseconds and below. So far, most of the effort in implementing this technology has gone into Field Programmable Gate Arrays (FPGAs) because of their logic reconfigurability and cost efficiency. The main component of a wave-union time-to-digital converter (TDC) is a tapped delay line (TDL), which consists of logic elements that act as fixed delays. Because FPGAs are highly sensitive to factors such as process-voltage-temperature (PVT) variations, clock-network jitter, and device-to-device discrepancies, the delays of these logic elements exhibit some degree of nonlinearity over the course of a measurement campaign. This work focuses on the quantization stability of the time delays and the effects of a nonlinear delay distribution for a wave-union core developed for Microchip's IGLOO2, a general-purpose FPGA. It is demonstrated that, even when the core supply voltage and external factors such as electrostatic discharge (ESD) and temperature were held constant at low levels, nonlinearity was still present in the processed data in the form of large random spikes, although the instability of the estimated bin delays decreased.
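As a rough illustration only (the text above does not state the calibration method actually used), per-bin delays of a TDL are commonly estimated with a code-density (statistical) calibration: hits uncorrelated with the reference clock populate each bin in proportion to its width. The following Python sketch, with hypothetical variable names, shows that idea under the assumption that the delay line spans one known clock period.

```python
import numpy as np

def estimate_bin_delays(hit_codes, num_bins, clock_period_ps):
    """Estimate per-bin delays (ps) from the relative hit frequency of each TDL bin.

    Assumes `hit_codes` are raw TDL codes from hits uncorrelated with the clock,
    and that the full delay line covers exactly one clock period.
    """
    counts = np.bincount(hit_codes, minlength=num_bins).astype(float)
    # A bin that collects more uniformly distributed hits is proportionally wider.
    bin_widths_ps = counts / counts.sum() * clock_period_ps
    # The cumulative sum gives each bin's position along the line (transfer function).
    bin_edges_ps = np.cumsum(bin_widths_ps)
    return bin_widths_ps, bin_edges_ps

# Synthetic example: 256 bins spanning a hypothetical 4 ns clock period.
rng = np.random.default_rng(0)
codes = rng.integers(0, 256, size=100_000)   # stand-in for measured TDL codes
widths, edges = estimate_bin_delays(codes, 256, 4000.0)
print(widths[:5], edges[-1])                 # edges[-1] equals 4000 ps by construction
```

Tracking how `bin_widths_ps` drifts between repeated calibration runs is one way to quantify the bin-delay instability discussed above.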