The problem of sequential change diagnosis is considered, in which a sequence of independent random elements is observed sequentially. At some unknown time there is an abrupt change in their distribution, and there are two main operational goals: to detect the change quickly and to accurately identify the post-change distribution among a finite set of alternatives. We consider a standard change-point detection algorithm that raises an alarm as soon as the CuSum statistic corresponding to one of the post-change alternatives exceeds a certain threshold. This procedure is shown, under certain conditions, to control the worst-case conditional probability of false isolation and, at the same time, to minimize Lorden's criterion for the detection delay, for every possible post-change distribution, to a first-order asymptotic approximation as both the worst-case probability of false isolation and the false alarm rate go to zero, with the false alarm rate going to zero at the faster rate. Specifically, these properties are shown to hold under some conditions when the post-change distributions are more distant from one another (in a Kullback-Leibler divergence sense) than from the pre-change distribution.
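The parallel-CuSum rule described above can be sketched as follows. This is a minimal illustration, not the paper's exact procedure: the Gaussian pre-change distribution N(0,1) and the two mean-shift alternatives N(1,1), N(-1,1) are illustrative assumptions, and the threshold is chosen arbitrarily. One CuSum statistic is maintained per post-change alternative; an alarm is raised at the first time any of them crosses the threshold, and the crossing statistic identifies (isolates) the post-change distribution.

```python
import numpy as np

def cusum_diagnose(xs, log_lrs, threshold):
    """Parallel CuSum for joint change detection and isolation.

    xs: sequence of observations.
    log_lrs: list of functions; log_lrs[i](x) = log f_i(x) / f_0(x),
             the log-likelihood ratio of post-change alternative i
             against the pre-change density f_0.
    threshold: alarm level; an alarm is raised when any statistic
               first exceeds it.
    Returns (alarm_time, isolated_index), or (None, None) if no alarm.
    """
    W = np.zeros(len(log_lrs))
    for t, x in enumerate(xs, start=1):
        for i, llr in enumerate(log_lrs):
            # Standard CuSum recursion: reflected random walk at zero.
            W[i] = max(W[i] + llr(x), 0.0)
        if W.max() >= threshold:
            return t, int(np.argmax(W))
    return None, None

# Illustrative setup: pre-change N(0,1); alternatives N(1,1) and N(-1,1).
# For N(mu,1) vs N(0,1), log f_i(x)/f_0(x) = mu * x - mu**2 / 2.
log_lrs = [lambda x: 1.0 * x - 0.5,
           lambda x: -1.0 * x - 0.5]

rng = np.random.default_rng(0)
pre = rng.normal(0.0, 1.0, size=50)     # pre-change regime
post = rng.normal(1.0, 1.0, size=200)   # change to N(1,1) at time 51
xs = np.concatenate([pre, post])

t_alarm, i_hat = cusum_diagnose(xs, log_lrs, threshold=8.0)
```

With this setup the alarm fires shortly after the change point at time 50, and the first alternative (index 0) is isolated, since the observations drift in its favor at rate equal to the Kullback-Leibler divergence between N(1,1) and N(0,1).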
Nyström methods are widely used to accelerate computations involving kernel matrices, and their performance is largely determined by the choice of sketching matrix. We first propose a new framework that unifies the common sub-sampling sketch and the Gaussian sketch. We then explore how Nyström methods can be extended to popular neural network architectures such as transformers.
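The two sketch choices mentioned above can be compared in a short sketch. This is a minimal illustration under assumed settings (an RBF kernel on random data, arbitrary sketch size), not the paper's framework: the generic Nyström approximation K ≈ KS (SᵀKS)⁺ SᵀK is formed once with a column sub-sampling matrix S and once with a Gaussian S.

```python
import numpy as np

def nystrom(K, S):
    """Generic Nystrom approximation K ~ K S (S^T K S)^+ S^T K."""
    C = K @ S                      # n x s sketch of the columns
    W = S.T @ K @ S                # s x s core matrix
    return C @ np.linalg.pinv(W) @ C.T

rng = np.random.default_rng(1)
n, s = 200, 40
X = rng.normal(size=(n, 2))

# RBF kernel matrix (bandwidth chosen so K is approximately low-rank).
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-sq / 8.0)

# Sub-sampling sketch: s randomly chosen columns of the identity.
idx = rng.choice(n, size=s, replace=False)
S_sub = np.eye(n)[:, idx]

# Gaussian sketch: i.i.d. N(0, 1/s) entries.
S_gauss = rng.normal(scale=1.0 / np.sqrt(s), size=(n, s))

err_sub = np.linalg.norm(K - nystrom(K, S_sub)) / np.linalg.norm(K)
err_gauss = np.linalg.norm(K - nystrom(K, S_gauss)) / np.linalg.norm(K)
```

Both sketches are instances of the same formula; only the distribution of S changes, which is what makes a unified analysis of the two natural.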