Caitlin Lienkaemper

I am a fifth-year graduate student in mathematics at Penn State, advised by Carina Curto.
I am primarily interested in geometry, combinatorics, and dynamical systems,
especially as applied to problems in mathematical biology and mathematical
neuroscience.
Email: cul434@psu.edu
Office: 402 McAllister Building
Papers and Preprints
- "Order-forcing in Neural Codes." With Amzi Jeffs and Nora Youngs (2020) arXiv.
- "Oriented Matroids and Combinatorial Neural Codes." With Alex Kunin and Zvi Rosen (2020) arXiv.
- "The geometry of partial fitness orders and an efficient method for detecting genetic interactions." With Lisa Lamberti, Dawn Drain, Niko Beerenwinkel, and Alex Gavryushkin. Journal of Mathematical Biology (2018). bioRxiv.
- "Obstructions to convexity in neural codes." With Anne Shiu and Zev Woodstock. Advances in Applied Mathematics (2017). arXiv.
Research
Convex Neural Codes

Combinatorial neural codes describe the joint activity
of a collection of neurons in terms of which neurons fire together
and which do not. Convex neural codes model the activity of
neurons with convex receptive fields, such as
place cells
in the hippocampus. Characterizing convex neural codes is mathematically
difficult, and work in this area is ongoing.
In [4], Anne Shiu, Zev Woodstock, and I provided the first example of a
non-convex code with no topological local obstructions.
In [2], Alex Kunin, Zvi Rosen, and I connect the theory of convex neural
codes to the theory of oriented matroids. Using this connection, we show
that it is computationally difficult to check whether a code is convex. A
recording of a talk I gave based on this paper is available here.
In [1], Amzi Jeffs, Nora Youngs, and I introduce the idea of order-forcing,
and use it to construct some new, very simple examples of non-convex codes.
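To make this concrete, here is a small sketch (my own illustration, assuming NumPy; it is not code from any of these papers) that computes the combinatorial code of a hypothetical set of convex receptive fields, taking intervals on the real line as the receptive fields and sampling stimulus points to collect codewords.

    import numpy as np

    # Hypothetical convex receptive fields: neuron i fires exactly when the
    # stimulus lies in intervals[i].
    intervals = [(0.0, 2.0), (1.0, 3.0), (2.5, 4.0)]

    def code_of_intervals(intervals, num_samples=1000):
        """Collect the codewords realized by sampling stimulus points."""
        lo = min(a for a, _ in intervals)
        hi = max(b for _, b in intervals)
        codewords = set()
        for x in np.linspace(lo, hi, num_samples):
            # Codeword: which neurons are active at stimulus x.
            codewords.add(tuple(int(a <= x <= b) for a, b in intervals))
        return codewords

    print(sorted(code_of_intervals(intervals)))
    # e.g. (1,0,0), (1,1,0), (0,1,0), (0,1,1), (0,0,1)

Because the receptive fields here are intervals, and hence convex, the resulting code is convex by construction; the papers above ask when a given code admits such a realization.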
Underlying Rank

Scientists frequently aim to measure the intrinsic dimensionality of a
dataset. Sometimes this is straightforward: we just compute the rank of
the matrix containing our data. Often, however, we do not have access to
accurate measurements of the quantity we are really interested in, and
instead observe a proxy measurement which has a monotone, nonlinear
relationship with it.
In upcoming work, Carina Curto, Juliana Londono Alvarez, Hannah
Rocio Santa Cruz, and I define the underlying rank of a matrix A: the
minimum rank r such that there is a rank-r matrix B whose entries are
in the same order as the entries of A. By associating matrices to point configurations,
we are able to use results about random polytopes, oriented matroids, and
allowable sequences to estimate underlying rank. A recording of a talk
I've given on this work is available
here.
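As a toy illustration of why ordinary rank is the wrong measure here (my own sketch, assuming NumPy; it is not code from the paper), note that a strictly increasing entrywise nonlinearity preserves the order of a matrix's entries, so the underlying rank stays small even though the ordinary rank of the distorted matrix becomes large.

    import numpy as np

    rng = np.random.default_rng(0)

    # A rank-2 "ground truth" matrix.
    A = rng.standard_normal((20, 2)) @ rng.standard_normal((2, 30))

    # A proxy measurement: a strictly increasing, nonlinear distortion of A.
    M = np.tanh(A) + 0.1 * A**3

    print(np.linalg.matrix_rank(A))  # 2
    print(np.linalg.matrix_rank(M))  # typically full rank (20)

    # The entries of M are in the same order as those of A, so the
    # underlying rank of M is at most 2.
    print(np.array_equal(np.argsort(A, axis=None), np.argsort(M, axis=None)))  # True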
Threshold Linear Networks

Threshold linear networks (TLNs) are models of neural networks that are
simple enough to be mathematically tractable, yet complex enough to
display the key features of nonlinear dynamics found in real neural
networks, such as multistability, limit cycles, and chaos.
Combinatorial threshold linear networks (CTLNs) are a subclass of TLNs
whose behavior is determined by a directed graph.
In upcoming work with Carina Curto and Katie Morrison, we relate the
structure of a network to its dynamics. For instance, we show that if
the underlying graph of a CTLN is a directed acyclic graph, then all
trajectories of the dynamical system approach a stable fixed point.
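As a concrete illustration, here is a minimal simulation sketch (using the standard CTLN conventions and commonly used parameters eps = 0.25, delta = 0.5, theta = 1; it is not code from the upcoming paper). It builds the weight matrix from a directed graph and integrates the threshold-linear dynamics; for a 3-cycle, which is not acyclic, the trajectory is known to approach a limit cycle in which the neurons fire in sequence, rather than a stable fixed point.

    import numpy as np

    def ctln_matrix(G, eps=0.25, delta=0.5):
        """CTLN weight matrix from a directed graph.

        G[i, j] = 1 means there is an edge j -> i.  Then W[i, j] = -1 + eps
        if j -> i, W[i, j] = -1 - delta otherwise, and W[i, i] = 0.
        """
        W = np.where(G == 1, -1.0 + eps, -1.0 - delta)
        np.fill_diagonal(W, 0.0)
        return W

    def simulate(W, theta=1.0, x0=None, dt=1e-3, T=40.0):
        """Forward-Euler integration of dx/dt = -x + [W x + theta]_+."""
        x = np.zeros(W.shape[0]) if x0 is None else np.array(x0, dtype=float)
        traj = [x.copy()]
        for _ in range(int(T / dt)):
            x = x + dt * (-x + np.maximum(W @ x + theta, 0.0))
            traj.append(x.copy())
        return np.array(traj)

    # The 3-cycle 0 -> 1 -> 2 -> 0 (G[i, j] = 1 records the edge j -> i).
    G = np.array([[0, 0, 1],
                  [1, 0, 0],
                  [0, 1, 0]])
    traj = simulate(ctln_matrix(G), x0=[0.1, 0.0, 0.0])
    # traj settles into a periodic solution with sequential firing.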