The Technion Theory Lunch is a seminar run by the theory group at the Department of Computer Science, Technion. The seminar holds weekly meetings in which graduate students and faculty from the Technion and other universities in Israel and abroad present recent results or other interesting topics in theoretical computer science. The seminar is open to faculty and to graduate and undergraduate students interested in the theory of computing.
Wednesdays, 13:00-14:00 (food starts at 12:45)
Room 201 (note return to traditional room), Taub Building
If you are interested in giving a talk at the seminar, contact Omri Ben-Eliezer and/or David Wajc.
The seminar’s mailing list is used to announce our weekly meetings and to publicize events related to CS theory, such as workshops and seminars, in Israel and abroad.
To join or leave our mailing list.
Many practical prediction algorithms represent inputs in Euclidean space and replace the discrete 0/1 classification loss with a real-valued surrogate loss, effectively reducing classification tasks to stochastic optimization. In this talk, I will explore the expressiveness of such reductions in terms of key resources. I will formally define a general notion of reductions between learning problems and relate it to some well-known examples, such as representations by half-spaces.
I will then establish bounds on the minimum Euclidean dimension D required to reduce a concept class with VC dimension d to a Stochastic Convex Optimization (SCO) problem in D-dimensional Euclidean space. This result provides a formal framework for understanding the intuitive interpretation of the VC dimension as the number of parameters needed to learn a given class.
The proof leverages a clever application of the Borsuk-Ulam theorem, illustrating an interesting connection between topology and learning theory.
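As a concrete illustration of the surrogate-loss reduction described above, here is a minimal Python sketch (not code from the talk; the function names, the hinge-loss choice, and all parameters are illustrative assumptions). It represents hypotheses as half-spaces in R^D and replaces the 0/1 loss with a convex surrogate, so learning reduces to a stochastic convex optimization problem solvable by SGD:

import numpy as np

def hinge_loss_grad(w, x, y):
    # Subgradient of the convex surrogate max(0, 1 - y*<w, x>) at w.
    return -y * x if y * np.dot(w, x) < 1 else np.zeros_like(w)

def sgd_halfspace(samples, D, steps=1000, lr=0.1):
    # Stochastic gradient descent on the surrogate loss; samples is a
    # list of (x, y) pairs with x in R^D and label y in {-1, +1}.
    rng = np.random.default_rng(0)
    w = np.zeros(D)
    for t in range(steps):
        x, y = samples[rng.integers(len(samples))]
        w -= lr / np.sqrt(t + 1) * hinge_loss_grad(w, x, y)
    return w

def zero_one_error(w, samples):
    # The discrete loss we actually care about, evaluated after the fact.
    return np.mean([np.sign(np.dot(w, x)) != y for x, y in samples])

The point of the reduction is that the optimization step only ever sees the convex surrogate; the dimension D in which the half-space lives is exactly the resource whose minimum value the talk relates to the VC dimension.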
Niv Buchbinder (Tel-Aviv University)
Maximization of submodular functions under various constraints is a fundamental problem that has been studied extensively.
In this talk, I will discuss submodular functions and interesting research questions in the field.
I will present several new techniques that lead to both deterministic algorithms and faster randomized algorithms for maximizing submodular functions.
In particular, for monotone submodular functions subject to a matroid constraint we design a deterministic non-oblivious local search algorithm that has an approximation guarantee of 1 - 1/e - \eps (for any \eps > 0), vastly improving over the previous state-of-the-art 0.5008-approximation.
For general (non-monotone) submodular functions we introduce a new tool, which we refer to as the extended multilinear extension, designed to derandomize submodular maximization algorithms that are based on the successful “solve fractionally and then round” approach.
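For background, here is a minimal Python sketch of the classic greedy algorithm of Nemhauser, Wolsey, and Fisher, which achieves a (1 - 1/e)-approximation for monotone submodular maximization under a cardinality constraint, a special case of a matroid constraint. It is included only as a baseline; it is not the non-oblivious local search or the extended multilinear extension from the talk, and the coverage example below is an illustrative assumption:

def greedy(f, ground_set, k):
    # Repeatedly add the element with the largest marginal gain f(S+e) - f(S).
    S = set()
    for _ in range(k):
        best = max((e for e in ground_set if e not in S),
                   key=lambda e: f(S | {e}) - f(S))
        S.add(best)
    return S

# Coverage functions are a standard example of monotone submodular functions.
sets = {1: {"a", "b"}, 2: {"b", "c"}, 3: {"c", "d"}, 4: {"a"}}
coverage = lambda S: len(set().union(*(sets[e] for e in S)))
print(greedy(coverage, sets.keys(), 2))  # picks {1, 3}, covering all four items

Submodularity is what makes chasing marginal gains work: each element's marginal contribution can only shrink as S grows, which is what yields the (1 - 1/e) guarantee.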
Yaroslav Alekseev (Technion)
Noa Schiller (Tel-Aviv University)
Amit Levi (University of Haifa)