Multi-task learning (MTL) is a machine learning approach in which a single model learns multiple tasks simultaneously, leveraging information shared between related tasks to improve generalization. MTL can be motivated by how humans learn and is considered a form of inductive transfer. Two common methods for MTL in deep learning are hard and soft parameter sharing. In hard parameter sharing, the tasks share hidden layers while keeping task-specific output layers; in soft parameter sharing, each task has its own model, and the distance between the models' parameters is regularized to keep them similar. MTL works through mechanisms such as implicit data augmentation, attention focusing, eavesdropping, representation bias, and regularization. In addition, training on auxiliary tasks can improve performance on the main task.
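To make the two sharing schemes concrete, here is a minimal NumPy sketch. It assumes a tiny two-task network; all layer sizes, weight names, and the regularization strength are illustrative choices, not prescribed by the text. Hard sharing routes every task through the same hidden layer with separate output heads; soft sharing keeps two separate weight matrices and adds an L2 penalty on their distance to the training loss.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Hard parameter sharing: one shared hidden layer, per-task heads ---
W_shared = rng.normal(size=(4, 8))   # input dim 4 -> hidden dim 8 (shared)
W_head_a = rng.normal(size=(8, 1))   # task A head: e.g. a regression output
W_head_b = rng.normal(size=(8, 3))   # task B head: e.g. a 3-class output

def forward(x, W_head):
    h = np.maximum(0, x @ W_shared)  # shared representation (ReLU)
    return h @ W_head                # task-specific output

x = rng.normal(size=(2, 4))          # batch of 2 examples
out_a = forward(x, W_head_a)         # shape (2, 1)
out_b = forward(x, W_head_b)         # shape (2, 3)

# --- Soft parameter sharing: separate models, regularized to stay close ---
W_model_a = rng.normal(size=(4, 8))  # task A's own first layer
W_model_b = rng.normal(size=(4, 8))  # task B's own first layer

def soft_sharing_penalty(Wa, Wb, lam=0.1):
    # L2 distance between the two models' parameters; adding this term to
    # the combined task loss encourages the models to stay similar
    # without forcing their weights to be identical.
    return lam * np.sum((Wa - Wb) ** 2)

penalty = soft_sharing_penalty(W_model_a, W_model_b)
```

In the hard-sharing half, gradients from both task losses would flow into `W_shared`, which is the source of the implicit data augmentation and regularization effects the text describes; in the soft-sharing half, each model trains on its own loss plus the shared penalty term.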