The inherent difficulty of many problems of combinatorial optimization and graph theory stems from the discrete nature of the domains in which these problems are posed. Controlled Markov Chains, Graphs & Hamiltonicity summarizes a line of research that maps such problems into convex domains where continuum, dynamic and perturbation analyses can be carried out more easily. The convexification of domains is achieved by assigning a probabilistic interpretation to key elements of the original problems, even though these problems are deterministic.
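To make the idea of convexification by probabilistic interpretation concrete, the following is a minimal sketch, not the book's construction verbatim: the deterministic choice of a single outgoing arc at each node is relaxed to a probability distribution over outgoing arcs, so the space of decisions becomes a product of simplices, a convex set. The example graph, node labels and policies below are illustrative assumptions.

```python
import numpy as np

# Adjacency lists of a small directed graph on 4 nodes (assumed example).
graph = {0: [1, 3], 1: [2, 0], 2: [3, 1], 3: [0, 2]}
N = len(graph)

def transition_matrix(policy):
    """Build the Markov chain induced by a randomized policy.

    policy[v] maps each out-neighbour of v to the probability of choosing
    the arc (v, w); each such map sums to 1, so the set of all policies
    is a product of simplices -- a convex domain.
    """
    P = np.zeros((N, N))
    for v, choices in policy.items():
        for w, p in choices.items():
            P[v, w] = p
    return P

# A deterministic policy tracing the Hamiltonian cycle 0 -> 1 -> 2 -> 3 -> 0:
# its induced chain is a single N-cycle (an extreme point of the policy set).
ham_policy = {0: {1: 1.0}, 1: {2: 1.0}, 2: {3: 1.0}, 3: {0: 1.0}}

# A strictly randomized policy lies in the interior of the convex domain.
mixed_policy = {v: {w: 1.0 / len(ws) for w in ws} for v, ws in graph.items()}

print(transition_matrix(ham_policy))
print(transition_matrix(mixed_policy))
```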

The dynamics are introduced via a controller whose choices select points, or trajectories, in these domains. Singular perturbations are introduced as tools to simplify the structure of certain Markov processes. The above approach is illustrated by its application to one famous problem of discrete mathematics: the Hamiltonian Cycle Problem (HCP). The essence of the HCP is contained in the following deceptively simple, single-sentence statement: given a graph on N nodes, find a simple cycle that contains all vertices of the graph (a Hamiltonian cycle), or prove that no such cycle exists.
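The problem statement can be illustrated by a brute-force check (exponential-time, shown only for clarity and unrelated to the methods of the book): try every ordering of the vertices and test whether it closes into a cycle of the graph. The helper name and example graphs are assumptions made here for illustration.

```python
from itertools import permutations

def has_hamiltonian_cycle(graph):
    """graph: dict mapping each vertex to the set of its neighbours."""
    vertices = list(graph)
    start = vertices[0]                       # cycles can be rotated, so fix a start
    for perm in permutations(vertices[1:]):
        cycle = (start,) + perm + (start,)
        if all(v in graph[u] for u, v in zip(cycle, cycle[1:])):
            return cycle                      # a Hamiltonian cycle, e.g. (0, 1, 2, 3, 0)
    return None                               # exhaustion proves no such cycle exists

# The 4-cycle 0-1-2-3-0 has a Hamiltonian cycle; the star K_{1,3} does not.
print(has_hamiltonian_cycle({0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}))
print(has_hamiltonian_cycle({0: {1, 2, 3}, 1: {0}, 2: {0}, 3: {0}}))
```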

The HCP is known to be NP-hard and has become a challenge that attracts mathematical minds, both in its own right and because of its close relationship to the equally famous Traveling Salesman Problem (TSP). An efficient solution of the latter would have an enormous impact on operations research, optimization and computer science. However, from a mathematical perspective, the underlying difficulty of the TSP is perhaps hidden in the Hamiltonian Cycle Problem, which is the main focus here.
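One standard way to see the close relationship mentioned above (a textbook reduction, not taken from this book) is to give every arc of the graph cost 1 and every missing arc cost 2; the graph is then Hamiltonian exactly when the resulting TSP instance admits a tour of total cost N. The function name and example graph below are illustrative assumptions.

```python
def hcp_to_tsp_costs(graph):
    """graph: dict mapping each vertex to the set of its neighbours."""
    vertices = list(graph)
    return {
        (u, v): (1 if v in graph[u] else 2)
        for u in vertices for v in vertices if u != v
    }

costs = hcp_to_tsp_costs({0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}})
# A tour of cost 4 (= N) exists iff the original graph has a Hamiltonian cycle.
print(sum(costs[(u, v)] for u, v in [(0, 1), (1, 2), (2, 3), (3, 0)]))  # -> 4
```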