Every language recognized by a non-deterministic finite automaton can be recognized by a deterministic automaton, at the cost of a potential increase in the number of states, which in the worst case can go from $n$ states to $2^n$ states. In this article, we investigate this classical result in a probabilistic setting where we take a deterministic automaton with $n$ states uniformly at random and add just one random transition. These automata are almost deterministic in the sense that only one state has a non-deterministic choice when reading an input letter. In our model, each state has a fixed probability of being final. We prove that for any $d\geq 1$, with non-negligible probability the minimal (deterministic) automaton of the language recognized by such an automaton has more than $n^d$ states; as a byproduct, the expected size of its minimal automaton grows faster than any polynomial. Our result also holds when each state is final with some probability that depends on $n$, as long as it is not too close to $0$ or $1$ (at distance at least $\Omega(\frac{1}{\sqrt{n}})$ from both, to be precise), therefore allowing models with a sublinear expected number of final states.
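The following Python sketch is not part of the paper; the function names, the two-letter alphabet, the final-state probability \texttt{p\_final} and the value of $n$ are illustrative assumptions. It simulates the random model described above: it draws a uniformly random complete deterministic automaton on $n$ states, adds a single random extra transition, marks each state final independently, and then measures the size of the minimal deterministic automaton of the recognized language via the subset construction followed by partition refinement.

\begin{verbatim}
import random


def random_almost_deterministic_nfa(n, alphabet=("a", "b"), p_final=0.5, rng=random):
    """Uniformly random complete DFA on states 0..n-1, plus one extra random transition."""
    delta = {(q, c): {rng.randrange(n)} for q in range(n) for c in alphabet}
    # The single non-deterministic choice: one extra target for a random (state, letter) pair.
    q, c = rng.randrange(n), rng.choice(alphabet)
    delta[(q, c)].add(rng.randrange(n))
    finals = {q for q in range(n) if rng.random() < p_final}
    return delta, finals, alphabet


def minimal_dfa_size(delta, finals, alphabet, initial=0):
    """Subset construction, then Moore-style partition refinement; returns the
    number of states of the minimal DFA recognizing the same language."""
    start = frozenset({initial})
    states, todo, trans = {start}, [start], {}
    while todo:
        s = todo.pop()
        for c in alphabet:
            t = frozenset(q2 for q in s for q2 in delta[(q, c)])
            trans[(s, c)] = t
            if t not in states:
                states.add(t)
                todo.append(t)
    order = sorted(states, key=sorted)                 # fixed order used to label classes canonically
    part = {s: int(bool(s & finals)) for s in states}  # initial partition: final vs non-final subsets
    while True:
        sig = {s: (part[s], tuple(part[trans[(s, c)]] for c in alphabet)) for s in states}
        classes = {}
        for s in order:                                # relabel classes by first occurrence
            classes.setdefault(sig[s], len(classes))
        new_part = {s: classes[sig[s]] for s in states}
        if new_part == part:
            return len(set(part.values()))
        part = new_part


if __name__ == "__main__":
    random.seed(0)
    n = 12
    delta, finals, alphabet = random_almost_deterministic_nfa(n)
    print("states of the minimal DFA:", minimal_dfa_size(delta, finals, alphabet))
\end{verbatim}

For small $n$ such a simulation only illustrates the model; the blow-up proved in the paper is an asymptotic statement, so repeated runs for growing $n$ would be needed to observe the super-polynomial growth of the expected minimal-automaton size.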