Today is September 12, 2024, and the group is in slack-off mode. It's been a long time since I've read a paper; I've been slacking off and just poking at code. IJCAI 2024 recently came out, so I picked a few papers to read. Here's the first one.
Paper: Denoising-Aware Contrastive Learning for Noisy Time Series
GitHub: https://github.com/betterzhou/DECL
A paper from IJCAI 2024.
Just a slacker catching up on papers.
(Note: I read the paper in the original English, but the blog is written by running whole paragraphs through a translator, with minor fixes where the translation doesn't fit. It's mostly unaltered, so if something reads wrong, blame the translator. Hahaha.)
Abstract
Self-supervised learning (SSL) of time series aims to utilize unlabeled data for pre-training in order to alleviate the reliance on labels. Despite the great success in recent years, there has been limited discussion on the potential noise in time series, which can severely affect the performance of existing SSL methods. To mitigate noise, the de facto strategy is to apply traditional denoising methods before model training. However, such preprocessing methods may not be able to completely eliminate the effect of noise in SSL for two reasons: (i) there are various types of noise in the time series, and it is difficult to automatically determine a suitable denoising method; and (ii) the noise may be amplified after mapping the original data into the latent space. In this paper, we propose Denoising-aware Contrastive Learning (DECL), which uses a contrastive learning objective to mitigate noise in representations and automatically selects the appropriate denoising method for each sample. Extensive experiments on various datasets validate the effectiveness of our approach. The code is open-sourced.
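To make the abstract's idea a bit more concrete, here is a minimal PyTorch sketch of what "contrastive learning with per-sample denoiser selection" could look like. Everything in it is an illustrative assumption, not the authors' DECL implementation: the moving-average and median denoisers, the tiny CNN encoder, the InfoNCE loss, and the "pick the denoiser whose view gives the lowest loss" rule are all stand-ins I chose for the sketch.

```python
# Hypothetical sketch of the abstract's idea: encode a raw series and several
# denoised views, use an InfoNCE-style contrastive loss to pull each raw
# representation toward its denoised counterpart, and pick per sample the
# denoiser whose view agrees best in the latent space. Not the authors' code.
import torch
import torch.nn as nn
import torch.nn.functional as F

def moving_average(x, k=5):
    # x: (batch, 1, length); simple box filter as one candidate denoiser
    return F.avg_pool1d(x, kernel_size=k, stride=1, padding=k // 2)

def median_filter(x, k=5):
    # unfold sliding windows and take the median; another candidate denoiser
    pad = k // 2
    xp = F.pad(x, (pad, pad), mode="replicate")
    return xp.unfold(-1, k, 1).median(dim=-1).values

DENOISERS = [moving_average, median_filter]  # assumed candidate pool

class Encoder(nn.Module):
    # tiny 1D-CNN encoder standing in for whatever backbone the paper uses
    def __init__(self, dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(1, 32, 7, padding=3), nn.ReLU(),
            nn.Conv1d(32, dim, 7, padding=3), nn.AdaptiveAvgPool1d(1),
        )

    def forward(self, x):
        return F.normalize(self.net(x).squeeze(-1), dim=-1)  # (batch, dim)

def info_nce(z_raw, z_view, tau=0.1):
    # per-sample InfoNCE: positives on the diagonal, other samples as negatives
    logits = z_raw @ z_view.t() / tau                # (batch, batch)
    targets = torch.arange(z_raw.size(0))
    return F.cross_entropy(logits, targets, reduction="none")  # (batch,)

encoder = Encoder()
x = torch.randn(8, 1, 128)                           # toy noisy batch
z_raw = encoder(x)
losses = torch.stack([info_nce(z_raw, encoder(d(x))) for d in DENOISERS])
best = losses.argmin(dim=0)                          # per-sample denoiser id
loss = losses.gather(0, best.unsqueeze(0)).mean()    # train on selected views
print(best.tolist(), float(loss))
```

The actual paper's objective and selection mechanism are more involved than this "lowest-loss wins" rule; see the GitHub repo above for the real implementation.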
(It's 21:42 and I'm ready to slip away; I'll pick this up tomorrow...)