⭐ Goal

⭐ Ground Rule

🔥 Weekly Plan

<aside>

🔥 Check the papers linked in the Paper relation, and write up the paper assigned to you in the ReadingLog relation. (You may use assorted review materials to aid your understanding, but make sure you read the original paper in the end.)

</aside>

<aside>

⚠️ Within 1–2 weeks after each study session, be sure to write your personal opinions, takeaways, summary, and follow-up study plans in Comments. ⚠️ Also list the review materials, lectures, tutorials, and code resources you used on the linked Paper·Book DB pages!

</aside>

📄 Paper List (the number in parentheses after each title is its citation count)

[Week 1] Word Embedding # a toy Skip-gram sketch follows this list

- [2013][NIPS][Skip-gram] Distributed Representations of Words and Phrases and their Compositionality (47948)
- [2013][ICLR][word2vec] Efficient Estimation of Word Representations in Vector Space (48040)
- [2014][ICML][doc2vec] Distributed Representations of Sentences and Documents (13788)
- [2014][EMNLP][GloVe] Global Vectors for Word Representation (46391)
- [2017][TACL][fastText] Enriching Word Vectors with Subword Information (14044)
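A minimal Skip-gram sketch to try alongside the Week 1 papers, assuming gensim 4.x is installed; the toy corpus and every hyperparameter below are illustrative choices, not values taken from the papers.

```python
# Toy Skip-gram training with gensim (assumption: gensim 4.x installed).
from gensim.models import Word2Vec

# Illustrative corpus: each sentence is a list of tokens.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["a", "word", "is", "known", "by", "the", "company", "it", "keeps"],
]

# sg=1 selects the Skip-gram objective (Mikolov et al., 2013); sg=0 would use CBOW.
# vector_size/window/epochs are toy values chosen so this runs instantly.
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1, epochs=100)

print(model.wv["king"].shape)         # (50,) dense embedding vector
print(model.wv.most_similar("king"))  # nearest neighbors by cosine similarity
```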

[Week 2] Deep Learning for Classification and Generation # CNN, RNN; a toy TextCNN sketch follows this list

- [2014][EMNLP][-] Convolutional Neural Networks for Sentence Classification (21096) # classification (sentence), CNN
- [2014][ACL][DCNN] A Convolutional Neural Network for Modelling Sentences (5191) # classification (sentence), CNN
- [2014][NIPS][Seq2Seq] Sequence to Sequence Learning with Neural Networks (29351) # generation (sequence-to-sequence)
- [2014][SSST][GRU] On the Properties of Neural Machine Translation: Encoder–Decoder Approaches (10187) # generation (sequence-to-sequence)
- [2014][NIPS][GRU] Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling (19788) # generation (language modeling)
- [2015][arXiv][LSTM-CRF] Bidirectional LSTM-CRF Models for Sequence Tagging (5910) # classification (word), RNN
- [2016][IJCAI][-] Recurrent Neural Network for Text Classification with Multi-Task Learning (2120) # classification (sentence), RNN
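A minimal sketch of the sentence-classification CNN idea from Kim (2014), written in PyTorch (assumed installed). The vocabulary size, filter counts, and batch are toy values; this illustrates the architecture (convolutions of several widths plus max-over-time pooling), not the paper's exact training setup.

```python
# Toy TextCNN in the spirit of Kim (2014); all sizes are illustrative.
import torch
import torch.nn as nn

class TextCNN(nn.Module):
    def __init__(self, vocab_size=1000, embed_dim=128, num_classes=2,
                 kernel_sizes=(3, 4, 5), num_filters=100):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # One 1-D convolution per filter width, as in the paper.
        self.convs = nn.ModuleList(
            nn.Conv1d(embed_dim, num_filters, k) for k in kernel_sizes
        )
        self.fc = nn.Linear(num_filters * len(kernel_sizes), num_classes)

    def forward(self, token_ids):                      # (batch, seq_len)
        x = self.embedding(token_ids).transpose(1, 2)  # (batch, embed_dim, seq_len)
        # Max-over-time pooling per filter width, then concatenate.
        pooled = [conv(x).relu().max(dim=2).values for conv in self.convs]
        return self.fc(torch.cat(pooled, dim=1))       # (batch, num_classes)

logits = TextCNN()(torch.randint(0, 1000, (4, 20)))   # 4 sentences, 20 tokens each
print(logits.shape)                                    # torch.Size([4, 2])
```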

[Week 3] Attention and Transformer # a toy attention sketch follows this list

- [2014][ICLR][-] Neural Machine Translation by Jointly Learning to Align and Translate (38547) # attention
- [2015][EMNLP][-] Effective Approaches to Attention-based Neural Machine Translation (11317) # attention
- [2016][NAACL][HAN] Hierarchical Attention Networks for Document Classification (6428) # attention, sentence embedding
- [2017][ICLR][-] A Structured Self-attentive Sentence Embedding (2965) # self-attention, sentence embedding
- [2017][NIPS][Transformer] Attention is All You Need (174046) # self-attention, translation
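A minimal sketch of the scaled dot-product attention at the core of the Transformer, in plain PyTorch: a single head with no masking and no learned Q/K/V projections, and with illustrative tensor shapes.

```python
# Scaled dot-product attention, softmax(QK^T / sqrt(d_k)) V, single head, no mask.
import math
import torch

def scaled_dot_product_attention(q, k, v):
    # q, k, v: (batch, seq_len, d_k); scores: (batch, seq_len, seq_len)
    scores = q @ k.transpose(-2, -1) / math.sqrt(q.size(-1))
    weights = scores.softmax(dim=-1)  # attention distribution over key positions
    return weights @ v, weights

q = k = v = torch.randn(2, 5, 64)     # self-attention: queries = keys = values
out, attn = scaled_dot_product_attention(q, k, v)
print(out.shape, attn.shape)          # torch.Size([2, 5, 64]) torch.Size([2, 5, 5])
```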