NLP BERT - Pre-training of Deep Bidirectional Transformers for Language Understanding (Review) Original paper: Devlin, Jacob, et al. "BERT: Pre-training of deep bidirectional transformers for language understanding." arXiv preprint arXiv:1810.04805 (2018).
NLP Attention Is All You Need (Review) Original paper: Vaswani, Ashish, et al. "Attention is all you need." Advances in Neural Information Processing Systems. 2017.