Our paper on Learning Compound Tasks is accepted to ICML 2020
Posted by: Administrator | 2021-10-07 20:24
The following paper is accepted to the International Conference on Machine Learning (ICML 2020):
Learning Compound Tasks without Task-specific Knowledge via Imitation and Self-supervised Learning
by Sang-Hyun Lee, Seung-Woo Seo
Abstract
Most real-world tasks are compound tasks that consist of multiple simpler sub-tasks. The main challenge of learning compound tasks is that we have no explicit supervision to learn the hierarchical structure of compound tasks. To address this challenge, previous imitation learning methods exploit task-specific knowledge, e.g., labeling demonstrations manually or specifying termination conditions for each sub-task. However, the need for task-specific knowledge makes it difficult to scale imitation learning to real-world tasks. In this paper, we propose an imitation learning method that can learn compound tasks without task-specific knowledge. The key idea behind our method is to leverage a self-supervised learning framework to learn the hierarchical structure of compound tasks. Our work also proposes a task-agnostic regularization technique to prevent unstable switching between sub-tasks, which has been a common degenerate case in previous works. We evaluate our method against several baselines on compound tasks. The results show that our method achieves state-of-the-art performance on compound tasks, outperforming prior imitation learning methods.
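To make the stable-switching idea concrete, here is a toy illustration of a regularizer that penalizes frequent transitions between consecutively selected sub-tasks. This is a sketch of the general concept only; the function name, the penalty form, and the weighting are my own illustrative choices, not the paper's actual task-agnostic regularization technique.

```python
import numpy as np

def switching_penalty(subtask_ids, weight=1.0):
    """Toy regularizer discouraging unstable switching between sub-tasks.

    Counts the fraction of consecutive timesteps at which the selected
    sub-task changes, scaled by `weight`. A degenerate policy that flips
    between sub-tasks at every step incurs the maximum penalty (= weight),
    while committing to one sub-task per segment incurs a small one.
    (Illustrative only; not the regularizer proposed in the paper.)
    """
    ids = np.asarray(subtask_ids)
    if ids.size < 2:
        return 0.0
    switches = np.count_nonzero(ids[1:] != ids[:-1])
    return weight * switches / (ids.size - 1)

# Example: two segment boundaries over six steps -> 2/5 = 0.4
print(switching_penalty([0, 0, 1, 1, 1, 2]))
```

In this toy form, the penalty would be added to the imitation objective so that the high-level selector is rewarded for committing to a sub-task over contiguous segments rather than oscillating.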