
Natural Language Processing (78)

[2025-1] 김지원 - Mamba: Linear-Time Sequence Modeling with Selective State Spaces
Mamba: Linear-Time Sequence Modeling with Selective State Spaces (2023). Citations: 2,256 (as of 2025-02-23). Paper link: https://arxiv.org/pdf/2312.00752 / https://blog.outta.ai/169 [2025-1] 김지원 - Efficiently Modeling Long Sequences with Structured State Spaces. Paper link: Efficiently Modeling Long Sequences with Structured State Spaces. Notes: ICLR 2022 Outstanding Paper, 1,578 citations (as of 2025-01-25). Code: https://github.com/state-.. 2025. 2. 23.
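For readers skimming this entry, a minimal sketch of the linear state-space recurrence that S4 and Mamba build on may help. Everything below (the names ssm_scan, d_state, dt, the crude Euler discretization, the toy sizes) is an illustrative assumption, not code from either paper:

```python
import numpy as np

# Sketch of a discretized linear state-space recurrence (the S4/Mamba backbone):
#   h_k = A_bar @ h_{k-1} + B_bar * x_k,   y_k = C @ h_k
def ssm_scan(x, A, B, C, dt):
    """Run a single-channel SSM over a 1-D input sequence x."""
    d_state = A.shape[0]
    A_bar = np.eye(d_state) + dt * A      # crude Euler discretization (illustrative)
    B_bar = dt * B
    h = np.zeros(d_state)
    ys = []
    for x_k in x:                         # one state update per step: linear in length
        h = A_bar @ h + B_bar * x_k       # compress history into a fixed-size state
        ys.append(C @ h)                  # readout
    return np.array(ys)

x = np.sin(np.linspace(0, 10, 100))
A = -np.diag(np.arange(1.0, 5.0))         # stable diagonal dynamics (toy choice)
B = np.ones(4)
C = np.ones(4) / 4
y = ssm_scan(x, A, B, C, dt=0.1)
print(y.shape)  # (100,)
```

Mamba's "selective" variant, as the title suggests, makes B, C, and the step size dt functions of each input x_k, so the recurrence can decide per token what to keep or forget; the paper also computes this scan with hardware-aware kernels rather than a Python loop.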
[2025-1] 김학선 - Code Security Vulnerability Repair Using Reinforcement Learning with Large Language Models
https://arxiv.org/abs/2401.07031 Code Security Vulnerability Repair Using Reinforcement Learning with Large Language Models: With the recent advancement of Large Language Models (LLMs), generating functionally correct code has become less complicated for a wide array of developers. While using LLMs has sped up the functional development process, it poses a heavy risk to code sec.. (arxiv.org) Introducti.. 2025. 2. 18.
[2025-1] 차승우 - Titans: Learning to Memorize at Test Time
https://arxiv.org/abs/2501.00663 Titans: Learning to Memorize at Test Time: Over more than a decade there has been an extensive research effort on how to effectively utilize recurrent models and attention. While recurrent models aim to compress the data into a fixed-size memory (called hidden state), attention allows attending to.. (arxiv.org) 0. Abstract: Recurrent models compress data into a fixed-size memory (hidden state).. 2025. 2. 17.
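The abstract's contrast between the two memory models can be made concrete with a toy sketch. All names and sizes here are hypothetical; the point is only that recurrent memory stays O(d) regardless of context length, while attention keeps every past token around:

```python
import numpy as np

def recurrent_summary(xs, W_h, W_x):
    h = np.zeros(W_h.shape[0])            # fixed-size memory, independent of length
    for x in xs:
        h = np.tanh(W_h @ h + W_x @ x)    # fold each token into the same state
    return h                              # O(d) memory

def attention_readout(q, keys, values):
    scores = keys @ q / np.sqrt(q.size)   # one score per stored token
    w = np.exp(scores - scores.max())
    w /= w.sum()
    return w @ values                     # cache of keys/values grows with context

rng = np.random.default_rng(0)
xs = rng.normal(size=(16, 3))             # a toy sequence of 16 tokens
h = recurrent_summary(xs, rng.normal(size=(8, 8)) * 0.1, rng.normal(size=(8, 3)))
out = attention_readout(rng.normal(size=4), rng.normal(size=(16, 4)), rng.normal(size=(16, 8)))
print(h.shape, out.shape)  # (8,) (8,)
```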
[2025-1] 차승우 - Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling
https://arxiv.org/abs/1412.3555 Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling: In this paper we compare different types of recurrent units in recurrent neural networks (RNNs). Especially, we focus on more sophisticated units that implement a gating mechanism, such as a long short-term memory (LSTM) unit and a recently proposed gated.. (arxiv.org) 0. Abstract - Compared with tanh RNNs.. 2025. 2. 15.
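Since this entry hinges on what a "gating mechanism" adds over a plain tanh RNN, here is a minimal GRU cell in the convention used in Chung et al. (2014); biases are omitted and all weight names are illustrative:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gru_cell(x, h, Wz, Uz, Wr, Ur, Wh, Uh):
    """One GRU step: gates decide how much old state to keep vs overwrite."""
    z = sigmoid(Wz @ x + Uz @ h)              # update gate
    r = sigmoid(Wr @ x + Ur @ h)              # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h))  # candidate state
    return (1 - z) * h + z * h_tilde          # interpolate old and new state

d, d_in = 8, 3
rng = np.random.default_rng(0)
shapes = [(d, d_in), (d, d), (d, d_in), (d, d), (d, d_in), (d, d)]
mats = [rng.normal(size=s) * 0.1 for s in shapes]
h = np.zeros(d)
for x in rng.normal(size=(20, d_in)):         # run the cell over a toy sequence
    h = gru_cell(x, h, *mats)
print(h.shape)  # (8,)
```

A plain tanh RNN is the special case with no gates (h = tanh(Wh @ x + Uh @ h)); the gates are what let LSTM and GRU units keep information over long spans, which is the property the paper evaluates empirically.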