  • We provide a Playground for experiencing the world beyond the desk, and propose a new lifestyle for the shift from passive learning to a life of creation.

All posts: 75

[2024-1] 박태호 - Large Language Models Are Human-Level Prompt Engineers
Paper: https://arxiv.org/abs/2211.01910 ("By conditioning on natural language instructions, large language models (LLMs) have displayed impressive capabilities as general-purpose computers. However, task performance depends significantly on the quality of the prompt used to steer the model, and mo..")
Post excerpt: Abstract. LLMs show high performance in many areas, but to steer the model.. (2024. 4. 12.)
[2024-1] 양소정 - Generative Adversarial Networks
Paper: https://arxiv.org/abs/1406.2661 ("We propose a new framework for estimating generative models via an adversarial process, in which we simultaneously train two models: a generative model G that captures the data distribution, and a discriminative model D that estimates the probability that..")
Post excerpt: Abstract. Proposes a framework for estimating generative models via an adversarial process; this framework 'm.. (2024. 4. 10.) See the minimal training-loop sketch after this list.
[2024-1] 백승우 - You Only Watch Once: A Unified CNN Architecture for Real-Time Spatiotemporal Action Localization
Paper (arxiv.org): "Spatiotemporal action localization requires the incorporation of two sources of information into the designed architecture: (1) temporal information from the previous frames and (2) spatial information from the key frame. Current state-of-the-art approache.."
Post excerpt: 0. Abstract Spatiotemporal action .. (2024. 4. 4.)
[2024-1] 김경훈 - MUNIT (Multimodal Unsupervised Image-to-Image Translation)
Original paper link: https://arxiv.org/abs/1804.04732 ("Unsupervised image-to-image translation is an important and challenging problem in computer vision. Given an image in the source domain, the goal is to learn the conditional distribution of corresponding images in the target domain, without seeing any pair..")
Post excerpt: GitHub: https://github.com/NVlabs/MUNIT .. (2024. 3. 26.)
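
The GAN entry above summarizes the adversarial setup: a generator G and a discriminator D trained simultaneously against each other. Below is a minimal PyTorch sketch of that training loop, assuming a toy 1-D Gaussian as the "real" data; the network sizes, learning rates, and step count are illustrative assumptions, not settings from the paper.

# Minimal sketch of the adversarial training loop described in the GAN entry above.
# The 1-D Gaussian "real" data, network sizes, and hyperparameters are illustrative
# assumptions for demonstration only, not the paper's experimental setup.
import torch
import torch.nn as nn

latent_dim = 8
batch_size = 64

# Generator G: noise z -> 1-D sample; Discriminator D: sample -> P(sample is real)
G = nn.Sequential(nn.Linear(latent_dim, 32), nn.ReLU(), nn.Linear(32, 1))
D = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1), nn.Sigmoid())

opt_G = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_D = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

for step in range(2000):
    real = 2.0 + 0.5 * torch.randn(batch_size, 1)   # toy "real" data ~ N(2, 0.5^2)
    fake = G(torch.randn(batch_size, latent_dim))   # generated samples

    # Discriminator step: push D(real) toward 1 and D(fake) toward 0.
    d_loss = bce(D(real), torch.ones(batch_size, 1)) + \
             bce(D(fake.detach()), torch.zeros(batch_size, 1))
    opt_D.zero_grad()
    d_loss.backward()
    opt_D.step()

    # Generator step (non-saturating form): push D(G(z)) toward 1 to fool D.
    g_loss = bce(D(fake), torch.ones(batch_size, 1))
    opt_G.zero_grad()
    g_loss.backward()
    opt_G.step()

# After training, samples from G should roughly match the toy real distribution.
print(G(torch.randn(1000, latent_dim)).mean().item())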