Non-Gaussian noise in gravitational-wave detectors, known as "glitches," can bias the inferred parameters of transient signals when it occurs near the signal in time and frequency. These biases are addressed with a variety of methods that remove or otherwise mitigate the impact of the glitch. Given the computational cost and human effort required for glitch mitigation, we study the conditions under which it is strictly necessary. We consider simulated glitches and gravitational-wave signals in various configurations that probe their proximity both in time and in frequency. We determine that glitches located outside the time-frequency space spanned by the gravitational-wave model prior, and with a signal-to-noise ratio below a conservative threshold of 50, do not impact estimation of the signal parameters.
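The abstract's conclusion amounts to a simple decision rule for whether a glitch warrants mitigation. The sketch below is an illustrative Python rendering of that rule, not the authors' code: the function name, the window representation, and the example numbers are assumptions; only the two conditions (overlap with the time-frequency span of the signal model prior, and an SNR threshold near 50) come from the abstract.

```python
# Illustrative sketch of the mitigation criterion described in the abstract.
# All names, signatures, and example values are hypothetical.
from dataclasses import dataclass


@dataclass
class TimeFrequencyWindow:
    """Time-frequency region spanned by the signal model prior (assumed rectangular)."""
    t_min: float  # start time [s]
    t_max: float  # end time [s]
    f_min: float  # lower frequency bound [Hz]
    f_max: float  # upper frequency bound [Hz]


def glitch_requires_mitigation(glitch_time: float,
                               glitch_frequency: float,
                               glitch_snr: float,
                               prior_window: TimeFrequencyWindow,
                               snr_threshold: float = 50.0) -> bool:
    """Return True if the glitch may bias parameter estimation and should be mitigated.

    Per the abstract: a glitch lying outside the time-frequency space spanned by the
    signal model prior, with SNR below a conservative threshold (~50), does not impact
    parameter estimation and can be left untreated.
    """
    overlaps_prior = (prior_window.t_min <= glitch_time <= prior_window.t_max
                      and prior_window.f_min <= glitch_frequency <= prior_window.f_max)
    is_loud = glitch_snr >= snr_threshold
    return overlaps_prior or is_loud


# Usage example (hypothetical numbers): a quiet glitch outside the prior window.
window = TimeFrequencyWindow(t_min=0.0, t_max=4.0, f_min=20.0, f_max=1024.0)
print(glitch_requires_mitigation(glitch_time=-6.0, glitch_frequency=60.0,
                                 glitch_snr=20.0, prior_window=window))  # False
```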