
I was reading this paper (pages 5 and 6), which finds critical DFAs (DFAs whose shortest synchronising word has length exactly (n-1)^2) for n <= 7.

https://arxiv.org/pdf/1801.10436

However, I am having trouble following the branch-and-bound algorithm they propose to search for and prune the DFAs. I summarise the algorithm here.

They begin from the fact that finding a synchronising word for a DFA is equivalent to finding a path from Q (the full state set) to a singleton set in the power automaton of the DFA.
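For concreteness, that path search can be sketched as a BFS on the power automaton (a minimal Python sketch; the encoding delta[a][q] and the Černý example are my own, not the paper's notation):

```python
from collections import deque

def shortest_sync_word(n, delta):
    """BFS on the power automaton from Q down to a singleton.

    delta[a] is a tuple of length n giving the image of each state under
    letter a (a toy encoding I am assuming, not the paper's notation).
    Returns a shortest synchronising word as a list of letter indices,
    or None if the DFA is not synchronising.
    """
    start = frozenset(range(n))
    queue = deque([(start, [])])
    seen = {start}
    while queue:
        s, word = queue.popleft()
        if len(s) == 1:
            return word
        for a, f in enumerate(delta):
            t = frozenset(f[q] for q in s)
            if t not in seen:
                seen.add(t)
                queue.append((t, word + [a]))
    return None

# Černý automaton C_4: letter 0 cyclically shifts, letter 1 merges states 0 and 1.
cerny4 = [(1, 2, 3, 0), (1, 1, 2, 3)]
print(len(shortest_sync_word(4, cerny4)))  # 9 = (4 - 1)^2, so C_4 is critical
```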

They develop a branch-and-bound algorithm in which a node of the search tree represents a DFA with n states, and its children represent extensions of that DFA by one extra letter.

At each node, they derive an upper bound on the length of the shortest synchronising word that any extension of the current DFA can have. If that upper bound is less than (n-1)^2, the branch is not explored further. Quoting the paper:

If A = (Q, Σ, δ), then S ⊆ Q will be called reachable if there exists a word w ∈ Σ* such that Qw = S. We say that S is reducible if there exists a word w such that |Sw| < |S|, and we call w a reduction word for S.
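These definitions can be checked directly by a BFS that stays among equal-sized sets (a sketch under a toy encoding delta[a][q] of my own, not the paper's notation):

```python
from collections import deque

def shortest_reduction_word(S, delta):
    """Length of a shortest reduction word for S, i.e. a shortest w with
    |Sw| < |S|; returns None if S is irreducible.

    delta[a][q] is the image of state q under letter a (my own toy
    encoding, not the paper's notation).
    """
    start = frozenset(S)
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        s, d = queue.popleft()
        for f in delta:
            t = frozenset(f[q] for q in s)
            if len(t) < len(start):
                return d + 1
            if t not in seen:  # every set still in the queue has size |S|
                seen.add(t)
                queue.append((t, d + 1))
    return None
```

For example, in the Černý automaton C_4 (a cyclic shift plus a letter merging states 0 and 1), the set {0, 1} is reduced in one step by the merging letter, while {1, 2} must first be rotated onto {0, 1}, giving a shortest reduction word of length 4.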

The algorithm roughly runs as follows. We search for (super)critical DFAs on n states, so a DFA is discarded if it synchronizes faster, or if it does not synchronize at all. For a given DFA A = (Q, Σ, δ) which is not yet discarded or investigated, the algorithm does the following:

(1) If A is synchronizing and (super)critical, we have identified an example we are searching for.

(2) If A is synchronizing and subcritical, it is discarded, together with all its possible extensions (justified by Property 1).

(3) If A is not synchronizing, then find an upper bound L for how fast any synchronizing extension of A will synchronize (see below). If L < (n − 1)^2, then discard A and all its extensions. Otherwise, discard only A itself.

They first calculate all pairwise distances in the power automaton. Then, quoting again:

A possible upper bound L is as follows:

(1) Determine the size |S| of a smallest reachable set S. Let m be the minimal distance from Q to a set of size |S|.

(2) For each k ≤ |S|, partition the collection of irreducible sets of size k into strongly connected components. Let m_k be the number of components plus the sum of their diameters.

(3) For each reducible set R of size k ≤ |S|, find the length l_R of its shortest reduction word. Let l_k be the maximum of these lengths.

(4) Now note that a synchronizing extension of A will have a synchronizing word of length at most

$$ L = m + \sum_{k = 2}^{|S|} (m_k + l_k) $$
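To make my reading concrete, here is how I would compute the ingredients of the bound for steps (1) and (3) (a Python sketch with my own encoding delta[a][q]; I omit step (2)'s m_k, which needs strongly connected components and their diameters). The l[k] = None case below is exactly the gap in my understanding:

```python
from collections import deque
from itertools import combinations

def _image(s, f):
    return frozenset(f[q] for q in s)

def _shortest_reduction(S, delta):
    """Length of a shortest w with |Sw| < |S|, or None if S is irreducible."""
    start = frozenset(S)
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        s, d = queue.popleft()
        for f in delta:
            t = _image(s, f)
            if len(t) < len(start):
                return d + 1
            if t not in seen:
                seen.add(t)
                queue.append((t, d + 1))
    return None

def bound_ingredients(n, delta):
    # Step (1): BFS the power automaton from Q to find a smallest
    # reachable set size |S| and the minimal distance m to that size.
    start = frozenset(range(n))
    dist, queue = {start: 0}, deque([start])
    while queue:
        s = queue.popleft()
        for f in delta:
            t = _image(s, f)
            if t not in dist:
                dist[t] = dist[s] + 1
                queue.append(t)
    size_S = min(len(s) for s in dist)
    m = min(d for s, d in dist.items() if len(s) == size_S)

    # Step (3): l_k over *all* k-subsets of Q (an extension may reach
    # any of them), for 2 <= k <= |S|.
    l = {}
    for k in range(2, size_S + 1):
        lengths = [lr for R in combinations(range(n), k)
                   if (lr := _shortest_reduction(R, delta)) is not None]
        l[k] = max(lengths) if lengths else None  # None is my open question
    return m, size_S, l
```

A single cyclic shift on 4 states, for instance, gives size_S = 4 with m = 0, and l[k] = None for every k, since a permutation never reduces any set.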

I do understand the general idea of the algorithm: it iteratively jumps from subsets of larger size to smaller size, bounding the length of each stage to obtain an overall upper bound.

In step 1, it goes from the full set Q to a smallest reachable subset S. If the current DFA is not synchronising, then |S| ≥ 2.

In step 2, it bounds how long a word can keep hopping among the irreducible sets of size k (the current "layer") before it must reach a reducible set.

In step 3, they bound the worst case over the reducible sets of size k: the length of the longest shortest reduction word.

What I don't understand is what should be done with l_k when no set of a given size k is reducible. This is my main question.

Thank you for your help.
