Humans walk with deceptive ease, navigating everything from everyday environments to uneven and uncertain terrain with efficiency and robustness. With the goal of achieving human-like capabilities on robotic systems, this talk presents the process of formally achieving bipedal robotic walking through controller synthesis inspired by human locomotion, and it demonstrates these methods through experimental realization on numerous bipedal robots and robotic assistive devices. Motivated by the hierarchical control present in humans, human-inspired virtual constraints are utilized to synthesize a novel class of control Lyapunov functions (CLFs); when coupled with hybrid system models of locomotion, this class of CLFs yields provably stable robotic walking. Going beyond explicit feedback control strategies, these CLFs can be used to formulate an optimization-based control methodology that dynamically accounts for torque and contact constraints while remaining implementable in real time. This sets the stage for unifying control objectives with safety-critical constraints through a new class of control barrier functions that provably enforce these constraints. The end result is bipedal robotic walking that is remarkably human-like and experimentally realizable, together with a novel control framework for highly dynamic behaviors on bipedal robots. Furthermore, these methods form the basis for a variety of advanced walking behaviors, including multi-domain locomotion such as human-like heel-toe gaits, and therefore have application to the control of robotic assistive devices, as evidenced by demonstrations of the resulting controllers on multiple robotic walking platforms, humanoid robots, and prostheses.
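The optimization-based control methodology mentioned above chooses, at each instant, an input that enforces the CLF decrease condition subject to input limits. As a minimal illustrative sketch only, and not the specific rapidly exponentially stabilizing CLF quadratic program used in this line of work, the following hypothetical function implements the classical pointwise min-norm CLF controller: given the Lie derivatives L_fV and L_gV of a CLF V along the dynamics, it returns the smallest-norm input u satisfying L_fV + L_gV·u ≤ -γV, with an optional crude torque saturation (the function name, gain γ, and clipping scheme are assumptions for illustration).

```python
import numpy as np

def min_norm_clf_controller(LfV, LgV, V, gamma=1.0, u_max=None):
    """Smallest-norm u satisfying the CLF decrease condition
    LfV + LgV @ u <= -gamma * V (illustrative sketch)."""
    LgV = np.atleast_1d(LgV).astype(float)
    psi = LfV + gamma * V  # constraint violation if u = 0
    if psi <= 0.0:
        # The drift alone already decays V fast enough; no input needed.
        u = np.zeros_like(LgV)
    else:
        # Min-norm solution of the single linear inequality in u.
        u = -psi * LgV / (LgV @ LgV)
    if u_max is not None:
        # Crude elementwise torque saturation (stand-in for QP bounds).
        u = np.clip(u, -u_max, u_max)
    return u

# Scalar example: dynamics x' = x + u with CLF V = x^2 / 2 at x = 1,
# so V = 0.5, LfV = x^2 = 1, LgV = x = 1.
u = min_norm_clf_controller(LfV=1.0, LgV=1.0, V=0.5, gamma=1.0)
# Check the decrease condition holds with equality at the min-norm solution:
# LfV + LgV * u = 1 - 1.5 = -0.5 = -gamma * V.
```

The actual CLF-QP framework additionally embeds contact force constraints and relaxes the CLF condition when torque limits bind, which a closed-form min-norm law like this cannot capture.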