
Paper Reviews

"If I have seen further, it is by standing on the shoulders of giants." - Isaac Newton
Keyword
Year
Conference
Title
Full Title
One-line Summary
Author
Affiliation
Importance
LLM
Tabular Task
2023
ICLR
LANGUAGE MODELS ARE REALISTIC TABULAR DATA GENERATORS
Vadim Borisov
University of Tübingen, Tübingen, Germany
⭐️⭐️⭐️⭐️⭐️
LLM
Prompt Engineering
2023
ICLR
SELF-CONSISTENCY IMPROVES CHAIN OF THOUGHT REASONING IN LANGUAGE MODELS
With CoT, sampling several reasoning paths and picking the most common answer improves performance further.
Xuezhi Wang
Google Research
⭐️⭐️⭐️⭐️⭐️
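The majority-vote idea above can be sketched in a few lines of pure Python; `sample_answer` is a hypothetical stand-in for one stochastic chain-of-thought generation:

```python
from collections import Counter

def self_consistency(sample_answer, n_samples=10):
    """Sample several reasoning paths and return the most frequent final answer.

    sample_answer: a zero-argument callable that runs one CoT generation and
    returns only the final answer string (hypothetical interface).
    """
    answers = [sample_answer() for _ in range(n_samples)]
    return Counter(answers).most_common(1)[0][0]
```

The key design choice is voting only on the final answer, not the reasoning text, so different chains that reach the same conclusion reinforce each other.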
LLM
Tabular Task
2022
NeurIPS
LIFT: Language-Interfaced Fine-Tuning for Non-Language Machine Learning Tasks
Tuan Dinh
University of Wisconsin-Madison, USA
⭐️⭐️⭐️⭐️⭐️
LLM
Prompt Engineering
2022
NeurIPS
Chain-of-Thought Prompting Elicits Reasoning in Large Language Models
A prompting method that demonstrates how to solve the problem: giving few-shot exemplars with worked reasoning steps improves performance, and the effect is especially strong for large LLMs.
Jason Wei
Google Research
⭐️⭐️⭐️⭐️⭐️
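A minimal sketch of the prompting format: worked exemplars (the one below paraphrases the ball-counting style popularized by the paper) are prepended so the model imitates step-by-step reasoning before answering:

```python
# Hypothetical exemplar text for illustration.
COT_EXEMPLAR = (
    "Q: Roger has 5 balls. He buys 2 cans of 3 balls each. How many balls now?\n"
    "A: Roger started with 5 balls. 2 cans of 3 balls is 6 balls. 5 + 6 = 11. "
    "The answer is 11.\n"
)

def build_cot_prompt(question, exemplars=(COT_EXEMPLAR,)):
    """Prepend worked exemplars, then leave 'A:' open for the model to continue."""
    return "".join(exemplars) + f"Q: {question}\nA:"
```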
LLM
RLHF
2022
NeurIPS
Training language models to follow instructions with human feedback
A method for fine-tuning GPT on human preference data.
⭐️⭐️⭐️⭐️⭐️
LLM
PEFT
Prompt Tuning
2021
IJCNLP
Prefix-Tuning: Optimizing Continuous Prompts for Generation
A PEFT method that prepends a continuous vector (prefix) to x, fine-tunes it on the downstream task, and then always attaches the learned prefix to x at inference.
Xiang Lisa Li
Stanford University
⭐️⭐️⭐️⭐️⭐️
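An illustrative input-level sketch of the idea (dimensions are made up, and the paper actually prepends trained key/value activations at every layer, not just input embeddings):

```python
# Hypothetical dimensions for illustration.
PREFIX_LEN, D_MODEL = 2, 4

# The only trainable parameters: PREFIX_LEN learned vectors of size D_MODEL.
prefix = [[0.1] * D_MODEL for _ in range(PREFIX_LEN)]

def with_prefix(token_embeddings):
    """Return [prefix; x]: the frozen LM attends over prefix + input tokens,
    while gradients flow only into the prefix vectors."""
    return prefix + token_embeddings
```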
LLM
PEFT
2021
ICLR
LORA: LOW-RANK ADAPTATION OF LARGE LANGUAGE MODELS
Attaching a very small number of weights in parallel to the self-attention weight matrices and fine-tuning only those matches the performance of full fine-tuning.
Edward Hu
Microsoft
⭐️⭐️⭐️⭐️⭐️
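The parallel low-rank update can be sketched as y = Wx + α·B(Ax), where W stays frozen and only the small matrices A and B train (pure-Python matrices for illustration):

```python
def lora_forward(W, A, B, x, alpha=1.0):
    """LoRA-style forward pass: y = W x + alpha * B (A x).

    W (d_out x d_in) is the frozen pretrained weight; A (r x d_in) and
    B (d_out x r) are the trainable low-rank factors, with r << d_in.
    """
    def matvec(M, v):
        return [sum(m * vi for m, vi in zip(row, v)) for row in M]

    base = matvec(W, x)                 # frozen path
    low_rank = matvec(B, matvec(A, x))  # trainable low-rank path
    return [b + alpha * l for b, l in zip(base, low_rank)]
```

B is initialized to zero in the paper, so training starts exactly at the pretrained model's behavior.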
LLM
RAG
2020
NeurIPS
Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks
The origin of RAG: couples the retrieval model DPR with the generator BART and trains them end-to-end on (x, y) pairs.
Patrick Lewis
FAIR
⭐️⭐️⭐️⭐️⭐️
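A toy sketch of the retrieval half: DPR-style ranking by inner product between query and document embeddings (the vectors here are made up, and the generator step is omitted):

```python
def retrieve_top_k(query_vec, doc_vecs, k=2):
    """Rank documents by inner product with the query embedding and
    return the indices of the top-k documents."""
    scores = [(sum(q * d for q, d in zip(query_vec, vec)), i)
              for i, vec in enumerate(doc_vecs)]
    return [i for _, i in sorted(scores, reverse=True)[:k]]
```

In the full model, the generator conditions on each retrieved passage and the retriever receives gradients through the generation likelihood.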
LLM
2020
NeurIPS
Language Models are Few-Shot Learners
Used well, an LLM's few-shot learning can handle almost any task.
OpenAI
⭐️⭐️⭐️⭐️
LLM
2019
NAACL
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
A Transformer trained bidirectionally.
⭐️⭐️⭐️⭐️
LLM
2018
Improving Language Understanding by Generative Pre-Training
Emphasizes the importance of pretraining for LLMs.
OpenAI
⭐️⭐️⭐️⭐️
LLM
2017
NeurIPS
Attention Is All You Need
The model that changed the language-modeling landscape; proved the power of self-attention.
Ashish Vaswani
Google Brain
⭐️⭐️⭐️⭐️⭐️
Anomaly Detection
2022
CVPR
Self-Supervised Predictive Convolutional Attentive Block for Anomaly Detection
Anomaly detection that learns reconstruction with masked convolutions.
Nicolae-Cătălin Ristea
University Politehnica of Bucharest, Romania
Anomaly Detection
2022
WACV
CFLOW-AD: Real-Time Unsupervised Anomaly Detection with Localization via Conditional Normalizing Flows
Anomaly detection that adds positional encoding to a normalizing flow.
Denis Gudovskiy
Panasonic AI Lab, USA
Anomaly Detection
2022
WACV
Fully Convolutional Cross-Scale-Flows for Image-based Defect Detection
Anomaly detection based on multi-scale normalizing flows.
Marco Rudolph
Leibniz University Hannover, Germany
Anomaly Detection
2022
CVPR
Anomaly Detection via Reverse Distillation from One-Class Embedding
Anomaly detection via reverse distillation.
H Deng
Department of Electrical and Computer Engineering, University of Alberta
Anomaly Detection
2022
CVPR
Towards Total Recall in Industrial Anomaly Detection
Anomaly detection that applies kNN to pretrained patch features.
Karsten Roth
Amazon AWS
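The scoring rule can be sketched as a kNN distance from a test patch feature to a memory bank of normal patch features (pure Python with toy vectors):

```python
def knn_anomaly_score(patch_feat, memory_bank, k=1):
    """Score a test patch by its mean Euclidean distance to the k nearest
    patch features collected from normal training images."""
    dists = sorted(
        sum((p - m) ** 2 for p, m in zip(patch_feat, mem)) ** 0.5
        for mem in memory_bank
    )
    return sum(dists[:k]) / k
```

The image-level score is then the maximum patch score; the paper additionally coreset-subsamples the memory bank to keep the search fast.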
Anomaly Detection
2021
ICLR
SSD: A UNIFIED FRAMEWORK FOR SELF-SUPERVISED OUTLIER DETECTION
Anomaly detection that applies clustering and the Mahalanobis distance to features learned with contrastive learning.
Vikash Sehwag
Princeton University
Anomaly Detection
2021
ICLR
EXPLAINABLE DEEP ONE-CLASS CLASSIFICATION
Anomaly detection that emphasizes explainability using invertible convolutions.
Philipp Liznerski
ML group, Technical University of Kaiserslautern, Germany
Anomaly Detection
2021
Arxiv
FastFlow: Unsupervised Anomaly Detection and Localization via 2D Normalizing Flows
Anomaly detection that replaces the scale and bias (s, b) parts of the normalizing flow with 2D convolutions.
Jiawei Yu
SenseTime Research
Anomaly Detection
2021
WACV
Same Same But DifferNet: Semi-Supervised Defect Detection with Normalizing Flows
The first normalizing-flow-based anomaly detection.
Marco Rudolph
Leibniz University Hanover
Anomaly Detection
2021
ICCV
Divide-and-Assemble: Learning Block-wise Memory for Unsupervised Anomaly Detection
Splits the image into small pieces, performs anomaly detection on each, then reassembles them.
Jinlei Hou
Hikvision Research Institute
Anomaly Detection
2021
ICCV
A Hierarchical Transformation-Discriminating Generative Model for Few Shot Anomaly Detection
Self-supervised anomaly detection using transformed patch features.
Shelly Sheynin
Facebook AI Research
Anomaly Detection
2021
ICLR
LEARNING AND EVALUATING REPRESENTATIONS FOR DEEP ONE-CLASS CLASSIFICATION
Anomaly detection combining self-supervised learning with one-class classification.
Kihyuk Sohn
Google Cloud AI
Anomaly Detection
2021
ICCV
Learning Unsupervised Metaformer for Anomaly detection
Anomaly detection that leverages a model trained for general image reconstruction.
Jhih-Ciang Wu
Institute of Information Science, Academia Sinica, Taiwan
Anomaly Detection
2021
Arxiv
Student-Teacher Feature Pyramid Matching for Unsupervised Anomaly Detection
Anomaly detection via pyramid knowledge distillation.
Guodong Wang
State Key Laboratory of Software Development Environment, Beihang University, Beijing, China
Anomaly Detection
2021
CVPR
Multiresolution Knowledge Distillation for Anomaly Detection
Knowledge-distillation anomaly detection that uses multi-resolution features.
Mohammadreza Salehi
Department of Computer Engineering, Sharif University of Technology
Anomaly Detection
2021
ICCV
DRÆM – A discriminatively trained reconstruction embedding for surface anomaly detection
Anomaly detection that trains a generator and discriminator with synthesized anomalies.
Vitjan Zavrtanik
University of Ljubljana, Faculty of Computer and Information Science
Anomaly Detection
2021
CVPR
CutPaste: Self-Supervised Learning for Anomaly Detection and Localization
Anomaly detection that trains the model to distinguish synthesized anomalies.
Chun-Liang Li
Google Cloud AI Research
Anomaly Detection
2021
ICPR
PaDiM: a Patch Distribution Modeling Framework for Anomaly Detection and Localization
Anomaly detection that extends Mahalanobis-distance AD to the patch level.
Thomas Defard
Université Paris-Saclay, CEA, List, F-91120, Palaiseau, France
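The per-position score is the Mahalanobis distance of a patch feature to a Gaussian fitted on normal features at that position; a minimal sketch with a precomputed inverse covariance:

```python
def mahalanobis_sq(x, mean, cov_inv):
    """Squared Mahalanobis distance (x - mu)^T Sigma^{-1} (x - mu) of a patch
    feature x to the Gaussian (mu, Sigma) fitted on normal patch features."""
    diff = [a - b for a, b in zip(x, mean)]
    tmp = [sum(c * d for c, d in zip(row, diff)) for row in cov_inv]
    return sum(d * t for d, t in zip(diff, tmp))
```

Fitting one (mean, covariance) pair per spatial position is what distinguishes the patch-level extension from whole-image Mahalanobis scoring.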
Anomaly Detection
2020
ICLR
DEEP SEMI-SUPERVISED ANOMALY DETECTION
Semi-supervised anomaly detection.
Lukas Ruff
Technical University of Berlin, Germany
Anomaly Detection
2020
ECCV
Attention Guided Anomaly Localization in Images
Anomaly detection with attention applied.
Shashanka Venkataramanan
Center for Research in Computer Vision, University of Central Florida, Orlando, FL
Anomaly Detection
2020
ICLR
CLASSIFICATION-BASED ANOMALY DETECTION FOR GENERAL DATA
Anomaly detection using self-supervised learning over diverse transformations.
Liron Bergman
School of Computer Science and Engineering The Hebrew University of Jerusalem, Israel
Anomaly Detection
2020
NeurIPS
Why Normalizing Flows Fail to Detect Out-of-Distribution Data
Investigates why normalizing flows fail at OOD detection and how to use them for it.
Polina Kirichenko
New York University
Anomaly Detection
2020
CVPR
Uninformed Students: Student–Teacher Anomaly Detection with Discriminative Latent Embeddings
Anomaly detection that fine-tunes pretrained features via knowledge distillation.
Paul Bergmann
MVTec Software GmbH
Anomaly Detection
2020
NeurIPS
CSI: Novelty Detection via Contrastive Learning on Distributionally Shifted Instances
Anomaly detection that adapts SimCLR, controlling which features to push apart and which to pull together for the task.
Jihoon Tack
KAIST
Anomaly Detection
2020
ICPR
Modeling the Distribution of Normal Data in Pre-Trained Deep Features for Anomaly Detection
Anomaly detection using the Mahalanobis distance on pretrained features.
Oliver Rippel
Institute of Imaging & Computer Vision, RWTH Aachen University, Aachen, Germany
Anomaly Detection
2020
Arxiv
Sub-Image Anomaly Detection with Deep Pyramid Correspondences
Anomaly detection using kNN on pretrained features.
Niv Cohen and Yedid Hoshen
School of Computer Science and Engineering The Hebrew University of Jerusalem, Israel.
Anomaly Detection
2020
ACCV
Patch SVDD: Patch-level SVDD for Anomaly Detection and Segmentation
Anomaly detection that extends Deep SVDD to the patch level.
Jihun Yi
SNU
Anomaly Detection
2019
NeurIPS
Using Self-Supervised Learning Can Improve Model Robustness and Uncertainty
Anomaly detection that applies a softmax score to features learned with self-supervised learning.
Dan Hendrycks
UC Berkeley
Anomaly Detection
2018
NeurIPS
Deep Anomaly Detection Using Geometric Transformations
Anomaly detection using self-supervised learning that predicts rotation angles.
Izhak Golan
Department of Computer Science, Technion – Israel Institute of Technology, Haifa, Israel
Anomaly Detection
2018
ICML
Deep One-Class Classification
Anomaly detection that trains normal image features to collapse to a single point.
Lukas Ruff
Hasso Plattner Institute, Potsdam, Germany
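The training objective can be sketched as the mean squared distance of embeddings to a fixed center c (toy feature vectors here; the real method computes the embeddings with a bias-free network to avoid trivial collapse):

```python
def svdd_loss(features, center):
    """Deep SVDD objective sketch: mean squared Euclidean distance of feature
    embeddings to a fixed center c. Normal samples are pulled toward c during
    training; at test time, distance to c serves as the anomaly score."""
    return sum(
        sum((f - c) ** 2 for f, c in zip(feat, center))
        for feat in features
    ) / len(features)
```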
Anomaly Detection
2018
ACCV
GANomaly: Semi-Supervised Anomaly Detection via Adversarial Training
One-stage GAN-based anomaly detection.
Samet Akcay
Durham University, UK
Anomaly Detection
2017
IPMI
Unsupervised Anomaly Detection with Generative Adversarial Networks to Guide Marker Discovery
The first two-stage anomaly detection to apply GANs.
Thomas Schlegl
Computational Imaging Research Lab, Department of Biomedical Imaging and Image-guided Therapy, Medical University Vienna, Austria
Object Detection
2020
Arxiv
YOLOv4: Optimal Speed and Accuracy of Object Detection
An improved version of the YOLO model.
Alexey Bochkovskiy
Institute of Information Science Academia Sinica, Taiwan
Object Detection
2018
Arxiv
YOLOv3: An Incremental Improvement
An improved version of the YOLO model.
Joseph Redmon
University of Washington
Object Detection
2017
ICCV
Focal Loss for Dense Object Detection
Mitigates the class-imbalance problem in object detection by reweighting the loss according to example difficulty.
Tsung-Yi Lin
Facebook AI Research (FAIR)
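The loss itself is compact enough to sketch directly; the (1 - p_t)^γ factor down-weights easy, well-classified examples, and with γ = 0 and α = 1 it reduces to ordinary cross-entropy:

```python
import math

def focal_loss(p, y, gamma=2.0, alpha=0.25):
    """Binary focal loss for one example.

    p: predicted probability of the positive class, y: label in {0, 1}.
    (1 - p_t)^gamma shrinks the loss of easy examples so training
    concentrates on hard, misclassified ones.
    """
    p_t = p if y == 1 else 1.0 - p
    alpha_t = alpha if y == 1 else 1.0 - alpha
    return -alpha_t * (1.0 - p_t) ** gamma * math.log(p_t)
```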
Object Detection
2017
CVPR
YOLO9000: Better, Faster, Stronger
Object detection over 9000 classes using hierarchical classification.
Joseph Redmon
University of Washington
Object Detection
2016
CVPR
You Only Look Once: Unified, Real-Time Object Detection
One-stage object detection that predicts class and box information in a single pass from the extracted feature map.
Joseph Redmon
University of Washington
Diffusion Model
Fine Tuning
2023
ICCV
Adding Conditional Control to Text-to-Image Diffusion Models
Fine-tunes a diffusion model with parallel trainable parameters, like an adapter.
Lvmin Zhang
Stanford University
⭐️⭐️⭐️⭐️⭐️