I understand the methodological purpose of inserting simulated transits into the data set. However, I think it is counterproductive to insert them into #VARIABLE or #GLITCH lightcurves. In those cases, the simulated transit is easily lost within the star's natural variability or amongst the noise from a glitch. Under those conditions it is nearly impossible to discern the supposed transit, which can produce false negatives for the simulated-transit program. It would be more useful if simulated transits were added only to the lightcurves of constant stars, or at least if a method were developed to ensure that transits injected into variable stars dip clearly below the star's own lightcurve. If possible, it would be best to keep simulated transits out of glitches entirely.
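As a rough sketch of the screening suggested above, an injection routine could check a lightcurve's scatter and only inject into quiet targets. All names, thresholds, and the box-transit model here are my own illustrative assumptions, not part of any existing pipeline:

```python
import statistics

def is_quiet(flux, max_rel_scatter=0.01):
    """Heuristic 'constant star' check: relative scatter below a threshold.
    The 1% threshold is a placeholder assumption, not a project-defined value."""
    mean = statistics.fmean(flux)
    return statistics.pstdev(flux) / mean < max_rel_scatter

def inject_transit(flux, start, duration, depth=0.005):
    """Inject a simple box-shaped transit (hypothetical model):
    multiply the flux by (1 - depth) over the transit window."""
    return [f * (1.0 - depth) if start <= i < start + duration else f
            for i, f in enumerate(flux)]

# Toy lightcurves: one near-constant, one with large-amplitude variability.
quiet = [1.0 + 0.001 * ((i % 5) - 2) for i in range(100)]
variable = [1.0 + 0.04 * ((i % 10) - 5) for i in range(100)]

candidates = [("quiet", quiet), ("variable", variable)]
# Only the quiet star receives a simulated transit; the variable star is skipped.
injected = {name: inject_transit(f, 40, 10) for name, f in candidates if is_quiet(f)}
```

A real implementation would of course use a proper variability classification rather than a single scatter statistic, and would also exclude segments flagged as glitches before choosing an injection window.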