I’m pretty excited about Afonso’s latest research developments (namely, this and that), and I’ve been thinking with Jesse Peterson about various extensions, but we first wanted to sort out the basics of linear and semidefinite programming. Jesse typed up some notes and I’m posting them here for easy reference:
Many optimization problems can be viewed as special cases of cone programming. Given a closed convex cone $K$ in $\mathbb{R}^n$, we define the dual cone as

$$K^* := \{ y \in \mathbb{R}^n : \langle x, y \rangle \geq 0 \text{ for every } x \in K \}.$$
Examples: The positive orthant and the positive semidefinite cone are both self-dual, and the dual of a subspace is its orthogonal complement. Throughout, we assume we have $b \in \mathbb{R}^m$, $c \in \mathbb{R}^n$, closed convex cones $K \subseteq \mathbb{R}^n$ and $L \subseteq \mathbb{R}^m$, and a linear operator $A \colon \mathbb{R}^n \rightarrow \mathbb{R}^m$. We then have the primal and dual programs

$$\text{(P)} \quad \min \ \langle c, x \rangle \quad \text{s.t.} \quad Ax - b \in L, \ x \in K$$

$$\text{(D)} \quad \max \ \langle b, y \rangle \quad \text{s.t.} \quad c - A^* y \in K^*, \ y \in L^*$$
Notice that when $K$ and $L$ are both positive orthants, this is a standard linear program. Further, considering the space of real symmetric $n \times n$ matrices as $\mathbb{R}^{n(n+1)/2}$, when $L$ is the zero subspace and $K$ is the positive semidefinite cone, this is a standard semidefinite program. We consider several standard results (weak duality, strong duality, complementary slackness) in terms of the general cone program.
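As a quick sanity check of duality in the linear-program case, here is a small numerical sketch (a toy example of my own, not from Jesse’s notes) using `scipy.optimize.linprog` to solve a primal LP and its dual and confirm that the optimal values agree:

```python
# Toy LP duality check (illustrative example; the data below is made up).
import numpy as np
from scipy.optimize import linprog

# Primal: minimize c^T x  subject to  A x <= b,  x >= 0
c = np.array([1.0, 2.0])
A = np.array([[-1.0, -1.0]])   # encodes the constraint x1 + x2 >= 1
b = np.array([-1.0])

primal = linprog(c, A_ub=A, b_ub=b, bounds=(0, None))

# Dual: maximize b^T y  subject to  A^T y <= c,  y <= 0
# (linprog minimizes, so we negate the objective and flip the sign back)
dual = linprog(-b, A_ub=A.T, b_ub=c, bounds=(None, 0))

print(primal.fun)   # optimal primal value: 1.0
print(-dual.fun)    # optimal dual value:   1.0 (strong duality holds here)
```

Since both cones here are polyhedral, strong duality holds and the two optimal values coincide.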
Continue reading Cone Programming Cheat Sheet
I’ve been on the job market full-time for the last 6 weeks or so, and I’ve finally settled on my destination: Starting this August, I’ll be a tenure-track assistant professor at The Ohio State University. My wife and I are very excited to move to Columbus!
I wanted to document my process for applying, interviewing and negotiating. I’ll probably refer to this blog post later when I give advice to a future PhD student or postdoc.
1. More offers make a better selection
My goal was to get the most attractive offer possible. Of course, different people have different notions of attractiveness, but there’s still an objective function to optimize. The main point is that you will be more satisfied if there are more options on the table: not only are you maximizing over a larger set, but the offers also compete with each other, so you can leverage them against one another for better terms. If you have more than two offers, it might not be obvious how to keep this auction going (more on that later).
Continue reading Interviews and offers
Readers of this blog are probably already aware that Alexander Grothendieck died on Thursday. He is widely regarded as one of the most influential mathematicians of the twentieth century. Since his field is not my own, I felt that now was a good time to learn a little about why he is so well regarded; I took the day to read a couple of articles from 10 years ago that provide an overview of his life, research, personality, and philosophies. I highly recommend the read: here and here.
Continue reading Alexander Grothendieck
I have a recent paper on the arXiv with Afonso Bandeira and Joel Moreira that provides a deterministic RIP matrix which breaks the square-root bottleneck, conditional on a folklore conjecture in number theory.
Here’s the construction (essentially): Let $p$ be a prime with $p \equiv 1 \pmod{4}$. Consider the $p \times p$ DFT matrix, and grab the rows corresponding to the quadratic residues (i.e., perfect squares) modulo $p$. This construction was initially suggested in this paper. We already know that partial Fourier matrices break the square-root bottleneck if the rows are drawn at random, so our result corresponds to the intuition that quadratic residues exhibit some notion of pseudorandomness.
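To make the construction concrete, here is a short Python sketch (my own illustration; the function name and the unit-norm-column normalization are my choices, not the paper’s code):

```python
# Build the partial DFT matrix whose rows are indexed by quadratic residues mod p.
import numpy as np

def quadratic_residue_rows(p):
    """Rows of the p x p DFT matrix indexed by the nonzero quadratic residues mod p."""
    residues = sorted({(k * k) % p for k in range(1, p)})   # (p-1)/2 residues for odd prime p
    omega = np.exp(-2j * np.pi / p)                         # primitive p-th root of unity
    dft = omega ** np.outer(np.arange(p), np.arange(p))     # full p x p DFT matrix
    # Normalize so the columns have unit norm (a common convention for RIP matrices):
    return dft[residues, :] / np.sqrt(len(residues))

# Example with p = 13 (note 13 = 1 mod 4): a 6 x 13 matrix.
M = quadratic_residue_rows(13)
print(M.shape)   # (6, 13)
```

The resulting matrix has $(p-1)/2$ rows and $p$ columns, so it is a "flat" partial Fourier matrix of the kind discussed above.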
There are actually two folklore conjectures at play in our paper. Conjecture A implies that this matrix breaks the square-root bottleneck, which in turn implies Conjecture B:
Continue reading A conditional construction of restricted isometries
A couple of weeks ago, I attended the “Sparse Representations, Numerical Linear Algebra, and Optimization Workshop.” It was my first time at Banff, and I was thoroughly impressed by the weather, the facility, and the workshop organization. A few of the talks were recorded and are available here. Check out this good-looking group of participants:
I wanted to briefly outline some of the problems that were identified throughout the workshop.
Continue reading Sparse Representations, Numerical Linear Algebra, and Optimization Workshop