The influence of seismic history on the liquefaction resistance of saturated sand is a complex process that remains incompletely understood. Large earthquakes often consist of foreshocks, mainshocks, and aftershocks with varying magnitudes and irregular time intervals. In this context, sandy soils undergo two interdependent processes: (i) partial excess pore water pressure (EPWP) generation during foreshocks or moderate mainshocks, where seismic loading elevates EPWP without causing full liquefaction, and (ii) incomplete EPWP dissipation between seismic events due to restricted drainage. These processes leave behind persistent residual EPWP, reducing liquefaction resistance during subsequent shaking. A series of cyclic triaxial tests simulating these mechanisms revealed that liquefaction resistance increases when the EPWP ratio r_u < 0.6-0.8 (peaking at r_u ≈ 0.4) but decreases sharply at higher r_u. Crucially, EPWP generation during seismic loading plays a dominant role in resistance evolution compared with reconsolidation effects. Threshold lines (TLs) mapping r_u, the reconsolidation ratio (RR), and the peak resistance interval (the range of r_u within which the peak liquefaction resistance is located) indicate that resistance decreases above the TLs and increases below them, with higher cyclic stress ratios (CSR) weakening these effects. These findings provide a unified framework for assessing liquefaction risks under realistic multi-stage seismic scenarios.
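The qualitative trend reported above can be sketched as a small classifier. This is a hypothetical illustration only: the function name, the single representative threshold of 0.7 (midpoint of the reported 0.6-0.8 range), and the +/-0.1 window around the reported peak at r_u ≈ 0.4 are assumptions for demonstration, not a published correlation.

```python
def liquefaction_resistance_trend(r_u: float, threshold: float = 0.7) -> str:
    """Qualitative trend of liquefaction resistance vs. residual EPWP ratio r_u.

    Illustrative sketch of the pattern described in the text: resistance
    increases below the threshold band (0.6-0.8), peaks near r_u ~ 0.4,
    and decreases sharply above the band. Threshold and peak window are
    hypothetical choices, not fitted values.
    """
    if not 0.0 <= r_u <= 1.0:
        raise ValueError("r_u must lie in [0, 1]")
    if r_u > threshold:
        return "decrease"   # sharp drop in resistance above the threshold band
    if abs(r_u - 0.4) < 0.1:
        return "near peak"  # peak resistance reported around r_u ~ 0.4
    return "increase"       # resistance elevated relative to the virgin state
```

For example, a residual ratio of 0.2 falls in the "increase" regime, while 0.9 falls in the "decrease" regime; the crossover location would in practice depend on the cyclic stress ratio, which the text notes weakens these effects.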