In this paper, we propose the Whiplash Inertial Gradient dynamics, a
closed-loop optimization method that uses gradient information to find the
minima of a cost function in finite-dimensional settings. We introduce a
symplectic asymptotic convergence analysis of the Whiplash system for convex
functions. We also introduce relaxation sequences to explain the non-classical
nature of the algorithm, as well as an exploratory heuristic variant of the
Whiplash algorithm that escapes saddle points deterministically. We study the algorithm’s
performance for various costs and provide a practical methodology for analyzing
convergence rates using integral constraint bounds and a novel Lyapunov rate
method. Our results demonstrate polynomial and exponential rates of convergence
for quadratic cost functions.
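
As a point of reference for the exponential rate stated above, one can compare
against the classical constant-damping inertial gradient flow (a standard
benchmark system, not the Whiplash dynamics itself, whose closed-loop damping
term is defined in the body of the paper):
\[
\ddot{x}(t) + c\,\dot{x}(t) + \nabla f(x(t)) = 0, \qquad
f(x) = \tfrac{1}{2}\,x^{\top}Ax, \quad A \succ 0,\ c > 0.
\]
Along each eigenmode of $A$ with eigenvalue $\mu > 0$, the characteristic
equation $s^{2} + cs + \mu = 0$ has roots
$s = \tfrac{1}{2}\bigl(-c \pm \sqrt{c^{2} - 4\mu}\bigr)$ with
$\operatorname{Re}(s) < 0$, so the trajectory satisfies
$\|x(t)\| \le C e^{-rt}$ for some constants $C, r > 0$, which illustrates the
type of exponential rate referenced above for quadratic costs.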