- Paper reading list
- Attention Is All You Need
- BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
- ALBERT: A Lite BERT for Self-supervised Learning of Language Representations
- Improving Language Understanding by Generative Pre-Training
- Language Models are Unsupervised Multitask Learners (https://d4mucfpksywv.cloudfront.net/better-language-models/language_models_are_unsupervised_multitask_learners.pdf)
- Language Models are Few-Shot Learners
- Exploiting Cloze Questions for Few Shot Text Classification and Natural Language Inference
- It's Not Just Size That Matters: Small Language Models Are Also Few-Shot Learners
- GPT Understands, Too