New Framework for Deep Mutual Learning (DML) to Improve Multi-task Recommender Systems | HackerNoon
Briefly

This article examines recommender systems built with multi-task learning (MTL), which optimizes multiple objectives simultaneously. Traditional MTL architectures rely on standalone task towers, which hinder knowledge sharing among related tasks. The authors propose Deep Mutual Learning (DML), a framework that strengthens collaboration between task towers through two components: Cross Task Feature Mining (CTFM) and Global Knowledge Distillation (GKD). Together, these let the towers exchange learned features and exploit the relationships between tasks, with the goal of improving prediction accuracy by ensuring that interconnected tasks draw essential information from each other.
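The digest stops at the component names, so purely as orientation, here is a minimal sketch of how per-task losses and an auxiliary coupling term (such as a distillation or joint-distribution penalty) might be combined during training. The function name, loss weights, and binary-label assumption are illustrative assumptions, not details from the paper.

```python
# Hypothetical assembly of a multi-task objective with an auxiliary
# coupling term; names and weights are illustrative, not from the paper.
import torch
import torch.nn.functional as F


def multitask_loss(task_logits: dict[str, torch.Tensor],
                   task_labels: dict[str, torch.Tensor],
                   aux_loss: torch.Tensor,
                   aux_weight: float = 0.1) -> torch.Tensor:
    # Standard per-task binary cross-entropy, e.g. for 'like' and 'buy' heads.
    base = sum(
        F.binary_cross_entropy_with_logits(task_logits[name], task_labels[name])
        for name in task_logits
    )
    # An auxiliary term that couples the task towers (e.g. a distillation or
    # joint-distribution penalty) instead of training each tower in isolation.
    return base + aux_weight * aux_loss
```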
Standalone task towers in multi-task learning are a suboptimal architecture; a better model incorporates Cross Task Feature Mining so that related tasks can share knowledge with one another.
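The digest does not describe how Cross Task Feature Mining is implemented. One plausible form, sketched below under my own assumptions, is a gated module in which each task tower blends its own hidden features with a projection of all towers' features; the class name, gating scheme, and residual blend are illustrative, not the paper's design.

```python
# Hypothetical sketch of cross-task feature mining: each task tower absorbs a
# gated projection of every tower's hidden features before its prediction head.
# Module names and the gating scheme are illustrative, not from the paper.
import torch
import torch.nn as nn


class CrossTaskFeatureMining(nn.Module):
    def __init__(self, num_tasks: int, hidden_dim: int):
        super().__init__()
        # Per-task projection of the concatenated tower features.
        self.mixers = nn.ModuleList(
            [nn.Linear(hidden_dim * num_tasks, hidden_dim) for _ in range(num_tasks)]
        )
        # Per-task gate deciding how much cross-task signal to let in.
        self.gates = nn.ModuleList(
            [nn.Sequential(nn.Linear(hidden_dim * num_tasks, hidden_dim), nn.Sigmoid())
             for _ in range(num_tasks)]
        )

    def forward(self, tower_hidden: list[torch.Tensor]) -> list[torch.Tensor]:
        # tower_hidden[t]: hidden representation of task t, shape (batch, hidden_dim).
        concat = torch.cat(tower_hidden, dim=-1)
        enriched = []
        for t, own in enumerate(tower_hidden):
            shared = torch.tanh(self.mixers[t](concat))
            gate = self.gates[t](concat)
            # Residual, gated injection of cross-task features into each tower.
            enriched.append(own + gate * shared)
        return enriched
```

Each enriched representation would then feed its task's prediction head (for example, the 'like' and 'buy' heads), so one tower's learned signal can inform another's prediction.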
By enabling related tasks, such as 'like' and 'buy', to share insights with each other, the Deep Mutual Learning framework improves the predictive accuracy of recommender systems.
Predictions across tasks should respect the joint distribution of the training labels; keeping them out of low-density regions of that distribution improves overall task performance.
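The digest does not say how this constraint is enforced. One way it could be expressed as a training penalty, sketched here under my own assumptions (binary task labels, task heads treated as conditionally independent), is to estimate the empirical joint label distribution and charge predictions for mass placed on rare label combinations.

```python
# Hedged sketch of a joint-label-density penalty: estimate the empirical joint
# distribution of the binary task labels (e.g. 'like' and 'buy') and penalize
# predictions whose implied joint outcome falls in low-density regions.
# The penalty form and function name are illustrative, not from the paper.
import itertools
import torch


def joint_density_penalty(preds: torch.Tensor,
                          labels: torch.Tensor,
                          eps: float = 1e-6) -> torch.Tensor:
    """preds: (batch, num_tasks) probabilities; labels: (batch, num_tasks) in {0, 1}."""
    num_tasks = labels.shape[1]
    penalty = preds.new_zeros(())
    # Enumerate every joint label combination, e.g. (like=1, buy=0).
    for combo in itertools.product((0.0, 1.0), repeat=num_tasks):
        combo_t = preds.new_tensor(combo)
        # Empirical probability of this combination in the training labels.
        density = (labels == combo_t).all(dim=1).float().mean().clamp_min(eps)
        # Probability the model assigns to this combination, assuming the
        # task heads are conditionally independent given the input.
        pred_prob = torch.prod(torch.where(combo_t.bool(), preds, 1.0 - preds), dim=1)
        # Mass placed on rare combinations pays a -log density cost.
        penalty = penalty + (pred_prob * (-density.log())).mean()
    return penalty
```

Under this sketch, the penalty shrinks as predictions concentrate on label combinations that actually co-occur in training, for instance discouraging a high 'buy' score paired with a low 'like' score if that pattern is rare in the data.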
The proposed Deep Mutual Learning framework enables multi-task networks to exploit the relationships among tasks, yielding more effective recommender systems.
Read at HackerNoon