<aside>
🔥 Check the papers linked through the Paper relation, and organize the papers assigned to each person in the ReadingLog relation. (You may use various review materials to build your understanding, but be sure to read the original paper in the end.)
</aside>
<aside>
⚠️ Within 1–2 weeks after each study session, be sure to write your personal opinions / implications / summary / follow-up study plans in Comments.
⚠️ List the review materials, lectures, tutorials, and code resources you used for your write-up on the Paper·Book DB pages linked through the relation!!!
</aside>
[Week 1] Introduction - Overview and Survey Papers

Graph Embedding
[2017][DEB][-] Representation Learning on Graphs - Methods and Applications (2738)
[2020][AI Open][-] Graph neural networks: A review of methods and applications (7750) # GNN designs
[2021][TNNLS][-] A Comprehensive Survey on Graph Neural Networks (12466)
[2023][IJCAI][-] Graph Pooling for Graph Neural Networks - Progress, Challenges, and Opportunities (124)
[2024][NN][-] A Comprehensive Survey on Deep Graph Representation Learning (0)

Knowledge Graph Embedding
[2017][TKDE][-] Knowledge Graph Embedding - A Survey of Approaches and Applications (3079)
[2021][information][-] On Training Knowledge Graph Embedding Models (7)

Natural Language Processing
[2021][FTML][-] Graph Neural Networks for Natural Language Processing - A Survey (412)

Recommender Systems
[2022][CS][-] Graph Neural Networks in Recommender Systems - A Survey (1647)
[2022][TKDE][-] A Survey on Knowledge Graph-Based Recommender Systems (1161)
[2023][TRS][-] A Survey of Graph Neural Networks for Recommender Systems - Challenges, Methods, and Directions (550)
<aside>
[Preliminary 1] Attention and Transformer
[2014][ICLR][-] Neural Machine Translation by Jointly Learning to Align and Translate (38547) # attention
[2015][EMNLP][-] Effective Approaches to Attention-based Neural Machine Translation (11317) # attention
[2017][NIPS][Transformer] Attention is All You Need (174046) # self-attention, translation
</aside>
<aside>
[Preliminary 2] Self-supervised Pretraining
[2018][OpenAI][GPT-1] Improving Language Understanding by Generative Pre-Training (11454) # decoder
[2019][NAACL][BERT] Pre-training of Deep Bidirectional Transformers for Language Understanding (117277) # encoder, masked language modeling
[2019][OpenAI][GPT-2] Language Models are Unsupervised Multitask Learners (14082) # decoder, autoregressive modeling
[2020][JMLR][T5] Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer (18923) # encoder-decoder
[2020][NIPS][GPT-3] Language Models are Few-Shot Learners (34810) # decoder
[2023][OpenAI][GPT-4] Technical Report (5168) # decoder
</aside>
[Week 2] Simple Graph Embedding with word2vec *(see the sketch after this list)*
[2014][KDD][DeepWalk] Online Learning of Social Representations (11801) # homogeneous(node embedding)
[[2016][KDD][node2vec]](<https://bkoh509.notion.site/2016-KDD-node2vec-18f7a793340b4157b878b224b2da37a7>) Scalable Feature Learning for Networks (12948)
[[2017][KDD][metapath2vec]](<https://bkoh509.notion.site/2017-KDD-metapath2vec-1cab3de2a77b80c5af08dc51e1760898>) Scalable Representation Learning for Heterogeneous Networks (2563) *# heterogeneous(node embedding, relation-specific projection)*
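The Week 2 papers share one recipe: sample random walks over the graph and feed them to a word2vec-style skip-gram model, so nodes that co-occur on walks end up with similar embeddings. Below is a minimal sketch of that idea; the toy graph, walk lengths, and the use of gensim's Word2Vec are illustrative assumptions rather than the papers' exact setups (node2vec additionally biases the walk with return/in-out parameters p and q, and metapath2vec constrains walks to follow metapaths over node types).

```python
import random
from gensim.models import Word2Vec  # any skip-gram implementation would do

# Toy undirected graph as an adjacency list (illustrative only).
graph = {
    "a": ["b", "c"], "b": ["a", "c"], "c": ["a", "b", "d"],
    "d": ["c", "e"], "e": ["d"],
}

def random_walk(start, length):
    """DeepWalk-style uniform random walk; node2vec would bias this step."""
    walk = [start]
    for _ in range(length - 1):
        walk.append(random.choice(graph[walk[-1]]))
    return walk

# 10 walks of length 8 per node, used as the "sentences" for skip-gram.
walks = [random_walk(node, 8) for node in graph for _ in range(10)]

# sg=1 selects skip-gram; vector_size/window/epochs are illustrative hyperparameters.
model = Word2Vec(walks, vector_size=64, window=4, sg=1, min_count=1, epochs=20)
print(model.wv["a"][:5])  # first few dimensions of node "a"'s embedding
```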
- **🤔 What are a Heterogeneous Graph, a Knowledge Graph, an Ontology, and a Triple Store?**
- **🤔 How does Knowledge Graph Embedding differ?**
- **🤔 How does Geometric Representation Learning differ?**
**[Week 3] Simple Knowledge Graph Embedding with Translation** *# h(300), r(300), t(300) are all vectors of the same dimension (see the sketch after this list)*
[[2013][NIPS][TransE]](<https://bkoh509.notion.site/2013-NIPS-TransE-f588289a1ebe4bd6a3dbedf0434131c5>) Translating Embeddings for Modeling Multi-Relational Data (9508)
[[2014][AAAI][TransH]](<https://bkoh509.notion.site/2014-AAAI-TransH-1ceb3de2a77b8018a831da7b23470d74>) Knowledge Graph Embedding by Translating on Hyperplanes (4487)
[[2015][AAAI][TransR]](<https://bkoh509.notion.site/2015-AAAI-TransR-5e6bded8c8b14bf88330c785d8e7be48>) Learning Entity and Relation Embeddings for Knowledge Graph Completion (4501)
[[2019][ICLR][RotatE]](<https://bkoh509.notion.site/2019-ICLR-RotatE-21cb3de2a77b80eba030d442aee78332>) Knowledge Graph Embedding by Relational Rotation in Complex Space (3158)
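A minimal numpy sketch of the translation-based scoring that the Week 3 comment describes, using random placeholder vectors rather than trained embeddings: TransE treats a triple (h, r, t) as plausible when h + r lands near t, and RotatE performs the analogous check with an element-wise rotation in complex space (TransH/TransR first project h and t onto a relation-specific hyperplane or into a relation-specific space).

```python
import numpy as np

dim = 300
rng = np.random.default_rng(0)
h, r, t = (rng.normal(size=dim) for _ in range(3))  # h, r, t share one dimension

def transe_score(h, r, t):
    """TransE: a triple (h, r, t) is plausible when h + r ≈ t (higher is better)."""
    return -np.linalg.norm(h + r - t, ord=1)

def rotate_score(h, r_phase, t):
    """RotatE: the relation is an element-wise rotation of h in complex space."""
    h_c = h[: dim // 2] + 1j * h[dim // 2 :]   # view h as 150 complex numbers
    t_c = t[: dim // 2] + 1j * t[dim // 2 :]
    rotation = np.exp(1j * r_phase)            # unit-modulus complex rotation
    return -np.linalg.norm(h_c * rotation - t_c, ord=1)

print(transe_score(h, r, t))
print(rotate_score(h, rng.uniform(0, 2 * np.pi, dim // 2), t))
```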
**[Week 4] Simple Knowledge Graph Embedding with Semantic Matching** *# the goal is to combine h(1x300), r(300x300), t(300x1) via matrix multiplication into a single 1-D score (see the sketch after this list)*
[[2013][NIPS][NTN]](<https://bkoh509.notion.site/2013-NIPS-NTN-21cb3de2a77b803a8735cb8371893e25>) Reasoning With Neural Tensor Networks for Knowledge Base Completion (2614)
[[2015][ICLR][DistMult]](<https://bkoh509.notion.site/2015-ICLR-DistMult-1ceb3de2a77b80cf99b7da3b197a2968>) Embedding Entities and Relations for Learning and Inference in Knowledge Bases (3931)
[[2016][ICML][ComplEx]](<https://bkoh509.notion.site/2016-ICML-ComplEx-21cb3de2a77b800d8511c15e64df01c7>) Complex Embeddings for Simple Link Prediction (3678)
[[2017][AAAI][ProjE]](<https://bkoh509.notion.site/2017-AAAI-ProjE-bd22a021d1664aab86efca7a96daf008>) Embedding Projection for Knowledge Graph Completion (326)
[[2018][AAAI][ConvE]](<https://bkoh509.notion.site/2018-AAAI-ConvE-e7c0316774a042e5817832fbea789121>) Convolutional 2D Knowledge Graph Embeddings (3021)
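A minimal numpy sketch of the semantic-matching scoring that the Week 4 comment describes, again with random placeholder embeddings: the full bilinear form multiplies a 1x300 head, a 300x300 relation matrix, and a 300x1 tail into a single scalar, DistMult restricts the relation matrix to its diagonal, and ComplEx computes the same trilinear product with complex-valued embeddings (NTN, ProjE, and ConvE add neural layers on top of this idea).

```python
import numpy as np

dim = 300
rng = np.random.default_rng(0)
h, t = rng.normal(size=dim), rng.normal(size=dim)  # 1x300 head, 300x1 tail
M_r = rng.normal(size=(dim, dim))                  # full 300x300 relation matrix
r_diag = rng.normal(size=dim)                      # diagonal relation (DistMult)

# Full bilinear score: (1x300) @ (300x300) @ (300x1) -> one scalar
bilinear = h @ M_r @ t

# DistMult keeps only the diagonal of M_r: sum_i h_i * r_i * t_i
distmult = np.sum(h * r_diag * t)

# ComplEx: the same trilinear product with complex embeddings, taking the real part
h_c, r_c, t_c = (rng.normal(size=dim) + 1j * rng.normal(size=dim) for _ in range(3))
complex_score = np.real(np.sum(h_c * r_c * np.conj(t_c)))

print(bilinear, distmult, complex_score)
```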
- **🤔 Spectral GCN vs Spatial GCN**
- **🤔 Transductive vs. Inductive**