The essence of the work that I am doing can be encapsulated under the umbrella term *categorical cybernetics*.

Most of these papers assume familiarity with category theory, in particular with lenses and optics.

## Categorical Foundations of Gradient-Based Learning

We propose a categorical semantics for gradient-based machine learning algorithms in terms of lenses, parametrised maps, and reverse derivative categories. This foundation provides a powerful explanatory and unifying framework: it encompasses a variety of neural network architectures, loss functions (including mean squared error and softmax cross-entropy), and gradient update algorithms (including Nesterov momentum, Adagrad, and Adam). It also generalises beyond the familiar continuous domains (modelled by categories of smooth maps) to the discrete setting of Boolean circuits.
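As a minimal illustration of the lens viewpoint (a sketch of the idea, not the paper's actual construction), a lens packages a forward map with a backward map, and lens composition recovers the chain rule of reverse-mode differentiation:

```python
from dataclasses import dataclass
from typing import Any, Callable


@dataclass
class Lens:
    get: Callable[[Any], Any]        # forward map: A -> B
    put: Callable[[Any, Any], Any]   # backward map: (A, dB) -> dA

    def __rshift__(self, other: "Lens") -> "Lens":
        # Composition: forward passes chain left to right,
        # backward passes chain right to left (the chain rule).
        return Lens(
            get=lambda a: other.get(self.get(a)),
            put=lambda a, db: self.put(a, other.put(self.get(a), db)),
        )


# Example: f(x) = x^2 and g(y) = 3y as lenses over the reals.
square = Lens(get=lambda x: x * x, put=lambda x, dy: 2 * x * dy)
triple = Lens(get=lambda y: 3 * y, put=lambda y, dz: 3 * dz)

composed = square >> triple    # z = 3x^2, so dz/dx = 6x
print(composed.get(2.0))       # 12.0
print(composed.put(2.0, 1.0))  # 12.0  (= 6 * 2)
```

Here the `put` component plays the role of the reverse derivative: composing lenses automatically composes the backward passes in the opposite order, which is exactly how backpropagation threads gradients through a network.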

## Towards foundations of categorical cybernetics

We propose a categorical framework for ‘cybernetics’: processes which interact bidirectionally with both an environment and a ‘controller’. Examples include open learners, in which the controller is an optimiser such as gradient descent, and open games, in which the controller is a composite of game-theoretic agents.
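To give a flavour of the open-learner example (hypothetical function names; a toy sketch of the idea, not the paper's definitions): a parametrised map exposes a backward pass, and the controller is an optimiser that consumes the parameter gradient.

```python
def model(p, x):
    # Forward pass of a parametrised map: (p, x) -> y
    return p * x


def backward(p, x, dy):
    # Reverse derivative with respect to parameter and input
    return dy * x, dy * p  # (dp, dx)


def gradient_descent(p, dp, lr=0.1):
    # The 'controller': update the parameter against the gradient
    return p - lr * dp


# One learning step toward target y* = 6 at x = 2, starting from p = 1.
p, x, target = 1.0, 2.0, 6.0
y = model(p, x)
dy = y - target            # gradient of the loss 0.5 * (y - target)^2
dp, _ = backward(p, x, dy)
p = gradient_descent(p, dp)
print(p)  # 1.8
```

The point of the framework is that the model, the loss, and the optimiser above are all morphisms of the same bidirectional shape, so they compose uniformly; swapping the controller for game-theoretic agents yields open games instead.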

## Category Theory in Machine Learning: a Survey

Over the past two decades machine learning has permeated almost every realm of technology. At the same time, many researchers have begun using category theory as a unifying scientific language, facilitating communication between different disciplines. It is therefore unsurprising that there is a burgeoning interest in applying category theory to machine learning. We document the motivations, goals and common themes across these applications, touching on gradient-based learning, probability, and equivariant learning.

## Compositional Game Theory, Compositionally

We present a new compositional approach to compositional game theory based upon Arrows, a concept closely related to Tambara modules. We use this compositional approach to show how known and previously unknown variants of open games can be proven to form symmetric monoidal categories.

## Learning Functors using Gradient Descent

We build a category-theoretic formalism around a neural network system called CycleGAN, an approach to unpaired image-to-image translation. We relate it to categorical databases, and show that a special class of functors can be learned using gradient descent. We design a novel neural network capable of inserting and deleting objects from images without paired data and evaluate it on the CelebA dataset.