Submodular with Continuous Optimization
Published: June 18, 2024
Publisher: Fei Tao
Speaker: Min Cui (BICMR)
Time: 2024-06-24, 15:00 to 16:00
Venue: Room 29, Quan Zhai, BICMR
Abstract: This talk explores the integration of submodular optimization with continuous optimization techniques. Submodular functions, which exhibit a diminishing-returns property, are pivotal in various discrete optimization problems, including sensor placement, influence maximization, and machine learning. Traditional submodular optimization focuses on discrete domains, but recent advances have extended these methods to continuous domains, opening new avenues for applications in machine learning, economics, and engineering.
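To make the diminishing-returns property concrete, here is a minimal sketch (not from the talk) using a set-coverage function, a standard example of a monotone submodular function; the function names and data are illustrative.

```python
def coverage(sets, S):
    """f(S): number of ground-set elements covered by the subsets indexed by S."""
    covered = set()
    for i in S:
        covered |= sets[i]
    return len(covered)

def marginal_gain(sets, S, i):
    """f(S ∪ {i}) - f(S): the marginal value of adding index i to S."""
    return coverage(sets, S | {i}) - coverage(sets, S)

# Three overlapping subsets of the ground set {1, ..., 6} (illustrative data).
sets = {0: {1, 2, 3}, 1: {3, 4}, 2: {4, 5, 6}}

# Submodularity (diminishing returns): for A ⊆ B, the gain of adding i to A
# is at least the gain of adding i to B.
print(marginal_gain(sets, set(), 2))   # 3: elements 4, 5, 6 are all new
print(marginal_gain(sets, {0, 1}, 2))  # 2: only 5 and 6 are new; 4 is already covered
```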
The talk then delves into the theoretical foundations of submodular functions, highlighting their distinctive properties and the challenges of optimizing them. It reviews key algorithms and methods developed for submodular optimization, including greedy algorithms, convex relaxations, and gradient-based techniques. Emphasis is placed on the transition from discrete to continuous optimization, discussing how continuous relaxations and smooth approximations enable the application of powerful continuous optimization tools.
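As an illustration of the greedy approach mentioned above, the sketch below maximizes a monotone submodular function under a cardinality constraint; for such functions the greedy rule is known to achieve a (1 - 1/e) approximation. The objective and data here are illustrative assumptions, not taken from the talk.

```python
def greedy_maximize(f, ground, k):
    """Greedily pick k elements, each round adding the element with the
    largest marginal gain f(S ∪ {e}) - f(S).  For monotone submodular f
    this attains a (1 - 1/e)-approximation to the optimum."""
    S = set()
    for _ in range(k):
        gains = {e: f(S | {e}) - f(S) for e in ground - S}
        S.add(max(gains, key=gains.get))
    return S

# Illustrative set-coverage objective: f(S) = |union of subsets indexed by S|.
sets = {0: {1, 2, 3}, 1: {3, 4}, 2: {4, 5, 6}}
f = lambda S: len(set().union(*(sets[i] for i in S)))

S = greedy_maximize(f, set(sets), k=2)
print(S, f(S))  # two sets suffice to cover all six ground-set elements
```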