Any task that requires selecting multiple dissimilar items leverages the concept of negative dependence. Negatively dependent measures and submodularity are powerful, theoretically grounded tools that aid in this selection. Determinantal point processes (DPPs) are arguably the most popular negatively dependent measure in machine learning, with applications to recommender systems, neural network pruning, ensemble learning, summarization, Monte Carlo integration, and kernel reconstruction, among many others.
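For concreteness, a DPP with positive semidefinite kernel L assigns each subset S a probability proportional to det(L_S), the determinant of the kernel submatrix indexed by S; near-identical items make this determinant small, which is exactly the repulsion that negative dependence captures. The minimal NumPy sketch below, using a hypothetical three-item kernel, illustrates the effect.

```python
import numpy as np

# Hypothetical 3-item kernel: items 0 and 1 are nearly identical,
# item 2 is dissimilar to both. L must be positive semidefinite.
L = np.array([
    [1.0, 0.9, 0.1],
    [0.9, 1.0, 0.1],
    [0.1, 0.1, 1.0],
])

def dpp_unnormalized_prob(L, S):
    """Unnormalized DPP probability of subset S: det(L_S)."""
    idx = np.array(sorted(S))
    return np.linalg.det(L[np.ix_(idx, idx)])

print(dpp_unnormalized_prob(L, {0, 1}))  # ~0.19: redundant pair is unlikely
print(dpp_unnormalized_prob(L, {0, 2}))  # ~0.99: diverse pair is favored
```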
However, the spectrum of negatively dependent measures is much broader, spanning a wide range of theory and applications to fundamental problems in machine learning: whether selecting training data, constructing an optimal experimental design, exploring in reinforcement learning, or making suggestions with a recommender system, choosing a high-quality yet diverse set of items is a core challenge.
This workshop will bring to the ICML audience the rich mathematical tools associated with negative dependence and submodularity, delving into the key theoretical concepts that underlie negatively dependent measures and investigating their fundamental applications.
Call for papers
We invite submissions of papers on any topic related to negative dependence and submodularity in machine learning, including (but not limited to):
- Submodular optimization
- Determinantal point processes
- Volume sampling
- Recommender systems
- Experimental design
- Variance-reduction methods
- Exploration/exploitation trade-offs (reinforcement learning, Bayesian optimization, etc.)
- Batched active learning
- Strongly Rayleigh measures
- Monte Carlo integration
- Biological sequence design
- Log-concave polynomials
- Randomized numerical linear algebra