Systems | Information | Learning | Optimization
 

How much is lost in pairwise correlations: the case of phylogenetics

Among the many popular techniques for reconstructing evolutionary trees from molecular sequences, so-called distance-matrix methods are typically the fastest. This speed stems from a straightforward, intuitive approach: repeated merging of the closest clusters of sequences. However, unlike more elaborate techniques such as maximum likelihood, distance-matrix methods only exploit empirical correlations …
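The merge loop at the heart of a distance-matrix method is short. Below is a minimal UPGMA-style sketch in Python; the function name, the average-linkage criterion, and the toy distance matrix are illustrative assumptions, not code from the talk:

```python
from itertools import combinations

def upgma(names, d):
    """names: leaf labels; d[(a, b)]: pairwise distance with a < b.
    Returns a nested-tuple tree built by repeatedly merging the two
    closest clusters (average linkage)."""
    # Each cluster is a pair (subtree, frozenset of its leaves).
    clusters = [(name, frozenset([name])) for name in names]

    def cluster_dist(la, lb):
        # Average pairwise distance between the leaves of two clusters.
        pairs = [(a, b) for a in la for b in lb]
        return sum(d[tuple(sorted(p))] for p in pairs) / len(pairs)

    while len(clusters) > 1:
        # Find the closest pair of clusters ...
        i, j = min(combinations(range(len(clusters)), 2),
                   key=lambda ij: cluster_dist(clusters[ij[0]][1],
                                               clusters[ij[1]][1]))
        (ti, li), (tj, lj) = clusters[i], clusters[j]
        # ... and merge them into a new internal node.
        merged = ((ti, tj), li | lj)
        clusters = [c for k, c in enumerate(clusters) if k not in (i, j)]
        clusters.append(merged)
    return clusters[0][0]

# Toy example with invented distances; the closest pair (A, B) merges first.
d = {("A", "B"): 2, ("A", "C"): 6, ("B", "C"): 6,
     ("A", "D"): 10, ("B", "D"): 10, ("C", "D"): 10}
tree = upgma(["A", "B", "C", "D"], d)
```

Neighbor joining, the most widely used distance-matrix method, replaces the average-linkage criterion with a rate-corrected one but keeps the same merge-loop structure.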

Hardware Optimizations for Optimization | Mirror Descent for Metric Learning

Victor’s Abstract: How we used systems programming techniques to tune convex optimization solvers. I demonstrate how understanding the hardware can influence the runtime of Nonnegative Matrix Factorization (NMF).

Gautam’s Abstract: Most metric learning methods are characterized by diverse loss functions and projection methods, which naturally begs the question: is …
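For reference, the standard Lee–Seung multiplicative updates for NMF reduce each iteration to a few dense matrix products; this generic NumPy sketch (not Victor's tuned solver) shows the kernels that hardware-aware tuning would target:

```python
import numpy as np

def nmf(V, rank, iters=200, eps=1e-9, seed=0):
    """Lee-Seung multiplicative updates for V ~ W @ H with W, H >= 0."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, rank))
    H = rng.random((rank, n))
    for _ in range(iters):
        # Each update is a handful of dense matrix products: exactly the
        # kind of kernel whose runtime depends on the memory hierarchy.
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Toy run on a random nonnegative matrix (sizes are invented).
V = np.abs(np.random.default_rng(1).random((20, 15)))
W, H = nmf(V, rank=4)
rel_err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

The updates preserve nonnegativity by construction, since every factor is multiplied by a ratio of nonnegative matrices.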

Perturbation, Optimization and Statistics for Effective Machine Learning

Predictions in modern statistical inference problems can increasingly be understood in terms of discrete structures such as arrangements of objects in computer vision, phonemes in speech recognition, parses in natural language processing, or molecular structures in computational biology. For example, in image scene understanding one needs to jointly predict discrete …

Sparse Signal Recovery in Unions of Subspaces | Nuclear proliferation and convex relaxations: Experimental results of just-in-time research

Nikhil’s Abstract: In applications ranging from communications and image processing to genetics, signals can be modeled as lying in a union of subspaces. Under this model, signal coefficients that lie in certain subspaces are active or inactive together. The potential subspaces are known in advance, but the particular set …
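When the subspaces correspond to known groups of coordinates, the "active or inactive together" structure is commonly enforced with a group-sparsity penalty. A minimal proximal-gradient (group-lasso) sketch under that assumption; the sizes, names, and formulation are illustrative, not necessarily the talk's method:

```python
import numpy as np

def group_ista(A, y, groups, lam=0.5, iters=500):
    """Proximal gradient for min 0.5*||Ax - y||^2 + lam * sum_g ||x_g||_2.
    groups: list of index arrays; a whole group is zeroed or kept together."""
    x = np.zeros(A.shape[1])
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L for the smooth part
    for _ in range(iters):
        x = x - step * (A.T @ (A @ x - y))   # gradient step on the loss
        for g in groups:                      # group soft-threshold (prox)
            norm = np.linalg.norm(x[g])
            if norm > 0:
                x[g] *= max(0.0, 1 - step * lam / norm)
    return x

# Synthetic instance: 10 known 6-dimensional groups, one active.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 60))
groups = [np.arange(i, i + 6) for i in range(0, 60, 6)]
x_true = np.zeros(60)
x_true[groups[2]] = rng.standard_normal(6)   # only group 2 is active
y = A @ x_true
x_hat = group_ista(A, y, groups)
energies = [float(np.linalg.norm(x_hat[g])) for g in groups]
```

The prox step zeroes any group whose norm falls below the threshold, which is what makes entire subspaces switch off together.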

Chance-constrained Packing Problems | Suppressing pseudocodewords by penalizing the objective of LP decoding

Song’s Abstract: We consider a probabilistic version of the classical 0-1 packing problem, where we have a set of items with random weights and a random knapsack capacity. The objective is to choose a set of items that maximizes profit while ensuring the probability that the knapsack constraint is satisfied …
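One common way to make such a chance constraint tractable is a sampled (scenario) approximation: require the knapsack inequality to hold in at least a fraction alpha of equally likely sampled scenarios. A brute-force sketch on invented toy data, not the talk's algorithm:

```python
from itertools import combinations

def chance_knapsack(profits, weight_scenarios, capacity_scenarios, alpha=0.9):
    """Maximize profit over item sets S such that the knapsack constraint
    holds in at least a fraction alpha of the sampled scenarios.
    Brute force over all subsets, so only suitable for tiny instances."""
    n = len(profits)
    scenarios = list(zip(weight_scenarios, capacity_scenarios))
    best, best_profit = frozenset(), 0.0
    for r in range(n + 1):
        for S in combinations(range(n), r):
            # Empirical probability that the chosen items fit.
            feasible = sum(sum(w[i] for i in S) <= c for w, c in scenarios)
            if feasible / len(scenarios) >= alpha:
                p = sum(profits[i] for i in S)
                if p > best_profit:
                    best, best_profit = frozenset(S), p
    return best, best_profit

# Invented toy data: 4 items, 5 equally likely weight/capacity scenarios.
profits = [10, 7, 5, 3]
weights = [[4, 3, 2, 1], [5, 3, 2, 1], [4, 4, 2, 2], [6, 3, 3, 1], [4, 3, 2, 2]]
caps = [8, 8, 7, 8, 8]
S, p = chance_knapsack(profits, weights, caps, alpha=0.8)
```

A practical solver would of course replace the subset enumeration with an integer program or a specialized branch-and-bound, but the scenario-counting feasibility check is the same.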

Going off the grid with limited data | DWQP: A solver for large-scale box-constrained quadratic programs

Gongguo’s Abstract: Most of the recent activity on sparse signal recovery has focused on signals with sparse representations in finite discrete dictionaries. However, signals encountered in applications such as imaging, radar, sonar, sensor arrays, communication, seismology, and remote sensing are usually specified …

Local Convergence of GROUSE | Multiplicative-forest continuous-time disease prediction from Electronic Health Records

Professor Wright’s Abstract: GROUSE is an incremental algorithm for subspace identification based on incomplete information, proposed and studied by Laura Balzano, Rob Nowak, and Ben Recht at Madison. This talk discusses recent results on the local convergence behavior of GROUSE, showing an expected linear convergence rate. Stronger results are possible …
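The GROUSE update itself is compact: project the partially observed vector onto the current subspace, form the zero-filled residual, and rotate the subspace along the induced geodesic. A NumPy sketch of one step and a toy run; the step size, problem sizes, and synthetic data are assumptions for illustration:

```python
import numpy as np

def grouse_step(U, v, omega, eta):
    """One GROUSE update. U (n x k, orthonormal columns) is the current
    subspace estimate; v is a data vector observed only on indices omega."""
    w, *_ = np.linalg.lstsq(U[omega], v[omega], rcond=None)
    p = U @ w                        # projection onto the current subspace
    r = np.zeros(len(v))
    r[omega] = v[omega] - p[omega]   # zero-filled residual, orthogonal to U
    rn, pn, wn = np.linalg.norm(r), np.linalg.norm(p), np.linalg.norm(w)
    if rn < 1e-12 or wn < 1e-12:
        return U
    t = eta * rn * pn                # rotation angle along the geodesic
    return U + np.outer((np.cos(t) - 1) * p / pn + np.sin(t) * r / rn, w / wn)

# Toy run: track a planted 2-d subspace from half-observed vectors.
rng = np.random.default_rng(0)
n, k = 30, 2
Utrue, _ = np.linalg.qr(rng.standard_normal((n, k)))
U, _ = np.linalg.qr(rng.standard_normal((n, k)))
for _ in range(2000):
    v = Utrue @ rng.standard_normal(k)
    omega = rng.choice(n, size=15, replace=False)
    U = grouse_step(U, v, omega, eta=0.3)
# Distance between the two subspaces via their projection matrices.
err = np.linalg.norm(Utrue @ Utrue.T - U @ U.T)
```

Because the update is an exact rotation in the plane spanned by the projection and the residual, the columns of U stay orthonormal without any explicit re-orthogonalization.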

Quantum Compressed Sensing

Quantum computation and quantum information are of great current interest in computer science, mathematics, physical sciences and engineering. They will likely lead to a new wave of technological innovations in communication, computation and cryptography. As the theory of quantum physics is fundamentally stochastic, randomness and uncertainty are deeply rooted in …