All posts (302)

[2025-1] 주서영 - Flow Matching for Generative Modeling
Flow Matching, ICLR 2023, 850+ citations
1. Introduction: This paper presents Flow Matching (FM), a simulation-free training method for efficiently training Continuous Normalizing Flows (CNFs).
2. Preliminaries: Continuous Normalizing Flows
- Normalizing Flow: a model that learns an invertible flow from the data distribution $x$ to $z$.
- Continuous Normalizing Flow (CNF): a generative model that transforms probability distributions through an ODE by learning a time-dependent vector field.
A data point $x=(x^1,\cdots,x^d)\in\mathbb{R}^d$ .. 2025. 2. 20.

[25-1] 박지원 - Deep-Emotion: Facial Expression Recognition Using Attentional Convolutional Network
Original paper: https://arxiv.org/abs/1902.01019
"Facial expression recognition has been an active research area over the past few decades, and it is still challenging due to the high intra-class variation. Traditional approaches for this problem rely on hand-crafted features such as SIFT, HOG and LBP, .." 2025. 2. 19.

[2025-1] 김학선 - Code Security Vulnerability Repair Using Reinforcement Learning with Large Language Models
https://arxiv.org/abs/2401.07031
"With the recent advancement of Large Language Models (LLMs), generating functionally correct code has become less complicated for a wide array of developers. While using LLMs has sped up the functional development process, it poses a heavy risk to code sec.." 2025. 2. 18.

[2025-1] 차승우 - Titans: Learning to Memorize at Test Time
https://arxiv.org/abs/2501.00663
0. Abstract: Recurrent models aim to compress the data into a fixed-size memory (called hidden state), while attention allows .. 2025. 2. 17.
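The Flow Matching preview at the top describes a CNF as transforming a probability distribution by integrating a time-dependent vector field through an ODE. A minimal sketch of that sampling step, using simple Euler integration and a hypothetical hand-coded vector field `v` in place of a learned one (names and parameters here are illustrative assumptions, not from the paper):

```python
import numpy as np

def euler_flow(x0, vector_field, t0=0.0, t1=1.0, steps=100):
    """Push samples x0 through the ODE dx/dt = v(t, x) with Euler steps."""
    x = np.asarray(x0, dtype=float)
    dt = (t1 - t0) / steps
    t = t0
    for _ in range(steps):
        x = x + dt * vector_field(t, x)  # one Euler update of the flow
        t += dt
    return x

# Hypothetical vector field: pulls every sample toward a target mean.
target_mean = np.array([2.0, -1.0])

def v(t, x):
    return target_mean - x

rng = np.random.default_rng(0)
x0 = rng.standard_normal((1000, 2))  # samples from the base (noise) distribution
x1 = euler_flow(x0, v)               # samples after flowing from t=0 to t=1
```

In an actual CNF, `v` would be a neural network conditioned on `t`, and a higher-order ODE solver would typically replace the Euler loop; the point of Flow Matching is that this network can be trained without simulating the ODE at training time.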