Upcoming Seminars

The Technion Theory Lunch is a seminar run by the theory group at the Department of Computer Science, Technion. The seminar holds weekly meetings in which graduate students and faculty from the Technion and other universities in Israel and abroad present recent results or other interesting topics in theoretical computer science. The seminar is open to faculty and to graduate and undergraduate students interested in the theory of computing.

Time: Wednesdays, 13:00-14:00 (food starts at 12:45)
Place: Room 201 (note return to traditional room), Taub Building

If you are interested in giving a talk at the seminar, contact Omri Ben-Eliezer and/or David Wajc.

The seminar’s mailing list is used to announce our weekly meetings, and also to publicize events related to CS theory, such as workshops and seminars, in Israel and abroad.

To join or leave our mailing list.

On Reductions of Learning Problems and the Borsuk-Ulam theorem
Wed, May 7
Tom Waknine (Technion)

Many practical prediction algorithms represent inputs in Euclidean space and replace the discrete 0/1 classification loss with a real-valued surrogate loss, effectively reducing classification tasks to stochastic optimization. In this talk, I will explore the expressiveness of such reductions in terms of key resources. I will formally define a general notion of reductions between learning problems, and relate it to some well-known examples, such as representations by half-spaces.
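For concreteness, the replacement of the 0/1 loss by a real-valued surrogate can be illustrated with the standard textbook surrogates (the hinge and logistic losses); this is a minimal sketch of the general phenomenon, not the specific reductions defined in the talk:

```python
import math

def zero_one_loss(y, score):
    """Discrete classification loss: 1 if the sign of the real-valued
    score disagrees with the label y in {-1, +1}, else 0."""
    return 0.0 if y * score > 0 else 1.0

def hinge_loss(y, score):
    """Convex surrogate used by SVMs; upper-bounds the 0/1 loss pointwise,
    so driving it down via stochastic convex optimization controls
    classification error."""
    return max(0.0, 1.0 - y * score)

def logistic_loss(y, score):
    """Another smooth convex surrogate (upper-bounds 0/1 after rescaling)."""
    return math.log(1.0 + math.exp(-y * score))

# The hinge surrogate dominates the 0/1 loss at every point:
for y in (-1, 1):
    for score in (-2.0, -0.5, 0.0, 0.5, 2.0):
        assert hinge_loss(y, score) >= zero_one_loss(y, score)
```

Minimizing such a surrogate over a Euclidean parameter space is exactly the kind of stochastic (convex) optimization problem that the abstract's notion of reduction targets.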

I will then establish bounds on the minimum Euclidean dimension D required to reduce a concept class with VC dimension d to a Stochastic Convex Optimization (SCO) problem in a D-dimensional Euclidean space. This result provides a formal framework for understanding the intuitive interpretation of the VC dimension as the number of parameters needed to learn a given class.

The proof leverages a clever application of the Borsuk-Ulam theorem, illustrating an interesting connection between topology and learning theory.
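For reference, the Borsuk-Ulam theorem in its standard form (the statement below is the classical one, not a formulation specific to this talk) says that every continuous map from the sphere into Euclidean space of the same dimension identifies some antipodal pair:

```latex
% Borsuk-Ulam theorem (classical statement):
% every continuous f : S^n -> R^n sends some pair of antipodal points
% of the n-sphere to the same value.
\text{For every continuous } f : S^n \to \mathbb{R}^n,\ 
\exists\, x \in S^n \ \text{such that}\ f(x) = f(-x).
```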

Deterministic Algorithms and Faster Algorithms for Submodular Maximization subject to a Matroid Constraint
Wed, May 14
Niv Buchbinder (Tel-Aviv University)

Maximization of submodular functions under various constraints is a fundamental problem that has been studied extensively.

In this talk I will discuss submodular functions and interesting research questions in the field.

I will present several new techniques that lead to both deterministic algorithms and faster randomized algorithms for maximizing submodular functions.

In particular, for monotone submodular functions subject to a matroid constraint we design a deterministic non-oblivious local search algorithm that has an approximation guarantee of 1 - 1/e - ε (for any ε > 0), vastly improving over the previous state-of-the-art 0.5008-approximation.
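As background for the 1 - 1/e benchmark (this is the classical greedy of Nemhauser, Wolsey, and Fisher, not the talk's new local search algorithm), greedy selection already achieves a (1 - 1/e)-approximation for monotone submodular maximization under a cardinality constraint, i.e., the special case of a uniform matroid; a minimal sketch with a coverage objective:

```python
def greedy_max_coverage(sets, k):
    """Greedy maximization of the monotone submodular coverage function
    f(S) = |union of the chosen sets| subject to choosing at most k sets
    (a uniform-matroid constraint). Classical (1 - 1/e)-approximation."""
    chosen, covered = [], set()
    for _ in range(k):
        # Pick the set with the largest marginal gain in coverage.
        best = max(range(len(sets)), key=lambda i: len(sets[i] - covered))
        if not sets[best] - covered:
            break  # no set adds new elements; stop early
        chosen.append(best)
        covered |= sets[best]
    return chosen, covered

sets = [{1, 2, 3}, {3, 4}, {4, 5, 6}, {1, 6}]
chosen, covered = greedy_max_coverage(sets, 2)
# picks set 0 (gain 3), then set 2 (gain 3), covering {1, 2, 3, 4, 5, 6}
```

The diminishing-returns property of coverage is what makes the greedy guarantee go through; handling a general matroid constraint, and doing so deterministically, is the harder setting the talk addresses.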

For general (non-monotone) submodular functions we introduce a new tool, that we refer to as the extended multilinear extension, designed to derandomize submodular maximization algorithms that are based on the successful “solve fractionally and then round” approach.

Yaroslav Alekseev (Technion)
Wed, May 21
Yaroslav Alekseev (Technion)
Noa Schiller (Tel-Aviv University)
Wed, May 28
Noa Schiller (Tel-Aviv University)
Tomer Even (Technion)
Wed, June 11
Tomer Even (Technion)
Amit Levi (University of Haifa)
Wed, June 18
Amit Levi (University of Haifa)