Submodular Optimization with Continuous Optimization
Time: 2024-06-18
Published By: Fei Tao
Speaker(s): Min Cui (BICMR)
Time: 15:00-16:00 June 24, 2024
Venue: Room 29, Quan Zhai, BICMR
Abstract: This report explores the integration of submodular optimization and continuous optimization techniques. Submodular functions, which exhibit diminishing returns properties, are pivotal in various discrete optimization problems, including sensor placement, influence maximization, and machine learning. Traditional submodular optimization focuses on discrete domains, but recent advancements have extended these methods to continuous domains, opening new avenues for applications in machine learning, economics, and engineering.
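The diminishing-returns property mentioned above says that for a submodular function f and sets A ⊆ B, the marginal gain of adding an element to A is at least the gain of adding it to B. As an illustrative sketch (not taken from the talk), this can be checked on a simple set-coverage function, where the sets below are assumed toy data:

```python
# Diminishing returns: f(A ∪ {e}) - f(A) >= f(B ∪ {e}) - f(B) for A ⊆ B,
# illustrated on a coverage function f(S) = |union of the chosen sets|.

def coverage(chosen, universe_sets):
    """f(S): number of elements covered by the union of the chosen sets."""
    covered = set()
    for i in chosen:
        covered |= universe_sets[i]
    return len(covered)

# Assumed toy instance for illustration.
universe_sets = {0: {1, 2, 3}, 1: {3, 4}, 2: {4, 5, 6}}

A = {0}      # A is a subset of B
B = {0, 1}
e = 2        # candidate set to add

gain_A = coverage(A | {e}, universe_sets) - coverage(A, universe_sets)
gain_B = coverage(B | {e}, universe_sets) - coverage(B, universe_sets)
assert gain_A >= gain_B  # the submodularity (diminishing returns) inequality
```

Here adding set 2 to the smaller collection A gains 3 newly covered elements, while adding it to the larger collection B gains only 2, matching the inequality.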
The report delves into the theoretical foundations of submodular functions, highlighting their unique properties and the challenges involved in optimizing them. It reviews key algorithms and methods developed for submodular optimization, including greedy algorithms, convex relaxation, and gradient-based techniques. Emphasis is placed on the transition from discrete to continuous optimization, discussing how continuous relaxation and smooth approximations enable the application of powerful continuous optimization tools.
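Among the discrete methods mentioned, the greedy algorithm is the canonical baseline: for maximizing a monotone submodular function under a cardinality constraint it attains a (1 - 1/e)-approximation (Nemhauser, Wolsey, and Fisher). The following is a hedged sketch, not the speaker's code, using an assumed toy coverage instance:

```python
# Classic greedy for monotone submodular maximization under |S| <= k:
# repeatedly add the element with the largest marginal gain.

def greedy_max(f, ground, k):
    """Return a set S of size k built by greedy marginal-gain selection."""
    S = set()
    for _ in range(k):
        best = max(ground - S, key=lambda e: f(S | {e}) - f(S))
        S.add(best)
    return S

# Assumed toy coverage function: f(S) = size of the union of chosen sets.
sets = {0: {1, 2}, 1: {2, 3, 4}, 2: {4, 5, 6, 7}, 3: {7, 8}}
def f(S):
    return len(set().union(*(sets[i] for i in S)))

S = greedy_max(f, set(sets), k=2)
# The largest set (index 2) is picked first; the second pick ties at
# marginal gain 2, so the final value is f(S) = 6 either way.
```

Continuous approaches replace this discrete search with a relaxation such as the multilinear extension, whose (stochastic) gradients enable the gradient-based techniques the report discusses.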