
AI (17)

[NIPS-2020] Denoising Diffusion Probabilistic Models Full paper: https://arxiv.org/abs/2006.11239 [Source] https://doi.org/10.48550/arXiv.2006.11239 ※ The picture and content of this article are from the original paper. This article is more of an intuitive understanding than academic analysis. [Paper Summary] Denoising Diffusion Probabilistic Models If you had to pick the hottest AI model among papers published since 2020, it would without question be the Diffusion M.. 2024. 1. 29.
[NIPS-2014] Generative Adversarial Nets Full paper: https://papers.nips.cc/paper_files/paper/2014/hash/5ca3e9b122f61f8f06494c97b1afccf3-Abstract.html [Source] https://doi.org/10.48550/arXiv.1406.2661 ※ The picture and content of this article are from the original paper. This article is more of an intuitive understanding than academic analysis. [Paper Summary] Generative Adversarial Nets A landmark paper with over 68,000 citations. Just looking at the author list, every single one of them is now regarded as a leading figure in the field... 2024. 1. 25.
[KDD-2019] Time-Series Anomaly Detection Service at Microsoft Full paper: https://arxiv.org/abs/1906.03821 [Source] KDD '19: Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, July 2019, Pages 3009–3017, https://doi.org/10.1145/3292500.3330680 ※ The picture and content of this article are from the original paper. This article is more of an intuitive understanding than academic analysis. [Paper Summary] Time-Series Anomaly Detection.. 2024. 1. 22.
[AAAI-2021] TabNet: Attentive Interpretable Tabular Learning Full paper: https://arxiv.org/abs/1908.07442 [Source] https://doi.org/10.1609/aaai.v35i8.16826 ※ The picture and content of this article are from the original paper. This article is more of an intuitive understanding than academic analysis. [Paper Summary] TabNet: Attentive Interpretable Tabular Learning This is Google's TabNet paper, quite well known (over 800 citations) among people working with tabular data. At the time it was SOTA on tabular data, but Tab.. 2024. 1. 18.
[ICML-2021] Tabular Data: Deep Learning is Not All You Need Full paper: https://arxiv.org/abs/2106.03253 [Source] Ravid Shwartz-Ziv, Amitai Armon, Tabular data: Deep learning is not all you need, Information Fusion, Volume 81, 2022, Pages 84-90, ISSN 1566-2535, https://doi.org/10.1016/j.inffus.2021.11.011. ※ The picture and content of this article are from the original paper. This article is more of an intuitive understanding than academic analysis. [Paper Summary] Tabular Dat.. 2024. 1. 11.