Useful Notes
Contents
- While the group-sparsity constraint forces our model to only consider a few features, these features are largely used across all tasks. All of the previous approaches thus assume that the tasks used in multi-task learning are closely related. However, each task might not be closely related to all of the available tasks. In those cases, sharing information with an unrelated task might actually hurt performance, a phenomenon known as negative transfer. (See the penalty sketch after this list.)
- rsync -avz --delete ./FederatedLearning/ @.ibex.kaust.edu.sa:~/rsync_files
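As an illustration of the group-sparsity constraint mentioned in the first note, here is a minimal NumPy sketch of an l1/l2 block penalty on a feature-by-task weight matrix; the matrix W, its shape, and the toy values are assumptions for illustration, not something taken from the notes.

import numpy as np

def group_sparsity_penalty(W: np.ndarray) -> float:
    # l1/l2 block norm: sum over features of the l2 norm of that feature's
    # weights across tasks, so a feature is either kept for (roughly) all
    # tasks or driven to zero for all of them.
    return float(np.sum(np.linalg.norm(W, ord=2, axis=1)))

# Toy (num_features=4, num_tasks=3) weight matrix -- assumed values.
W = np.array([
    [0.9, 1.1, 0.8],  # feature shared by every task
    [0.0, 0.0, 0.0],  # feature dropped for every task
    [0.5, 0.4, 0.6],
    [0.0, 0.0, 0.0],
])
print(group_sparsity_penalty(W))  # only the two nonzero rows contribute

Minimizing a task loss plus this penalty is what drives whole feature rows to zero; it does nothing to prevent the negative transfer issue raised above, since the surviving features are still shared by every task.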