The degradation of ultrathin oxides subjected to constant-current stress is analyzed using two independent procedures. First, the charge injected to breakdown is estimated from the evolution of the stress-induced leakage current (SILC) during the stress. Second, the degradation that leads to breakdown is measured directly using a two-step stress test. The SILC evolves during constant-current stress at a rate that decreases with time. Moreover, under low-current-density stress conditions the normalized SILC at breakdown is no longer constant. However, our two-step test methodology shows that the oxide degradation evolves roughly linearly until breakdown. These apparently contradictory results can be reconciled by assuming that the degradation at breakdown is independent of the stress conditions and by using the initial SILC generation rate to calculate the charge to breakdown by linear extrapolation. The implications of using SILC data as a degradation monitor are discussed.
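The extrapolation described above can be sketched as follows. This is a minimal illustration, not the authors' procedure: the function name, the rate value, and the critical degradation level are all hypothetical, and it only assumes (as the abstract states) that degradation grows roughly linearly with injected charge and that breakdown occurs at a fixed, stress-independent degradation level.

```python
def charge_to_breakdown(initial_rate, degradation_at_breakdown):
    """Linear extrapolation of the charge to breakdown: Q_BD = D_BD / r0.

    initial_rate             -- initial SILC generation rate, dD/dQ near Q = 0
                                (degradation units per C/cm^2); hypothetical
    degradation_at_breakdown -- critical degradation level D_BD at breakdown,
                                assumed independent of stress conditions
    """
    return degradation_at_breakdown / initial_rate

# Hypothetical numbers for illustration only:
# r0 = 0.05 (per C/cm^2), D_BD = 1.0 (normalized)
q_bd = charge_to_breakdown(0.05, 1.0)
print(q_bd)  # 20.0, i.e. Q_BD = 20 C/cm^2 under these assumed values
```

Because the measured SILC generation rate decreases with time, extrapolating with the initial rate (rather than a later, lower rate) is what makes the estimate consistent with the roughly linear degradation seen in the two-step test.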