generation
-
[Story Generation Study Supplementary Study : Story Generation Survey Paper] Automatic Story Generation: A Survey of Approaches Review AI/NLP 2022. 7. 5. 12:47
[Story Generation Study Supplementary Study : Story Generation Survey Paper] Automatic Story Generation: A Survey of Approaches Review [Table of Contents] 1. Definitions 2. Models 2-1. Structural Models 2-2. Planning-based Models 2-3. ML Models 3. KNOWLEDGE SOURCES FOR STORYTELLING 4. TOWARD INTERESTING STORIES 5. STORY EVALUATION 6. DISCUSSION (Limitations) 7. Future Works Whenever I get into a new research area, I like to read a survey paper first, so this time I read this one as well..
-
[Story Generation Study Week 02 : Story Generation & Story Completion] Event Representations for Automated Story Generation with Deep Neural Nets (AAAI, 2018) Review AI/NLP 2022. 6. 29. 16:16
[Story Generation Study Week 02 : Story Generation & Story Completion] Event Representations for Automated Story Generation with Deep Neural Nets (AAAI, 2018) Review [Story Generation Study Week 02 : Story Generation & Story Completion] [commonsense] A Corpus and Cloze Evaluation for Deeper Understanding of Commonsense Stories (NAACL, 2016) [event] Event Representations for Automated Story Gener..
-
[Story Generation Study Week 03 : Story Generation & Story Completion] Story Realization: Expanding Plot Events into Sentences (AAAI, 2020) Review AI/NLP 2022. 6. 29. 16:15
[Story Generation Study Week 03 : Story Generation & Story Completion] Story Realization: Expanding Plot Events into Sentences (AAAI, 2020) Review Summary [Week 2 Paper Reminder] 0. Event Representation: each sentence expressed in the form of an event 1. event2event: generates several subsequent event representations 2. event2sentence: translates an abstract event into a natural, human-readable sentence. Original post: https://asidefine.tistory.com/195 Event Representation is St..
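As a quick illustration of the pipeline summarized in this excerpt, here is a minimal Python sketch. It assumes the 4-tuple event format ⟨subject, verb, object, modifier⟩ from the Week 2 paper (Martin et al., 2018); the names event2event and event2sentence mirror the papers' modules, but the bodies are placeholder stubs, not the authors' neural models.

```python
# Minimal sketch of the event-based story generation pipeline, assuming the
# 4-tuple event format <subject, verb, object, modifier> from Martin et al. (2018).
# event2event / event2sentence mirror the papers' module names; the bodies below
# are illustrative stubs, not the actual seq2seq models.
from typing import List, NamedTuple

class Event(NamedTuple):
    subject: str   # generalized subject, e.g. a WordNet synset
    verb: str      # generalized verb, e.g. a VerbNet class
    obj: str       # generalized direct object
    modifier: str  # extra argument or a wildcard/EMPTY slot

def event2event(current: Event) -> List[Event]:
    """Propose subsequent event representation(s) given the current event.
    In the papers this is a seq2seq model; here it is a hard-coded stub."""
    return [Event("detective", "question", "suspect", "EMPTY")]

def event2sentence(event: Event) -> str:
    """Translate an abstract event into a human-readable sentence.
    Story Realization (AAAI 2020) uses an ensemble of models for this step;
    a simple template stands in here."""
    return f"The {event.subject} {event.verb}s the {event.obj}."

# Usage: take the latest event from the story so far, generate the next
# event(s), then realize each one as natural-language text.
current = Event("detective", "find", "clue", "EMPTY")
for nxt in event2event(current):
    print(event2sentence(nxt))   # -> "The detective questions the suspect."
```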
-
[Story Generation Study Week 01 : Fundamental of Text Generation] GPT-1 / GPT-2 Review & Study Notes AI/NLP 2022. 6. 29. 13:11
[Story Generation Study Week 01 : Fundamental of Text Generation] GPT-1 / GPT-2 Review & Study Notes [Story Generation Study Week 01 : Fundamental of Text Generation] GPT-1: Improving Language Understanding by Generative Pre-Training (2018) GPT-2: Language Models are Unsupervised Multitask Learners (2019) GPT-3: Language Models are Few-Shot Learners (2020) GPT-1 first obtains general word embeddings using unlabeled data..
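The excerpt cuts off, but the two-stage recipe it starts to describe (unsupervised pre-training on unlabeled text, then supervised fine-tuning) can be sketched roughly as below. This is only an illustrative approximation: the tiny embedding "body" stands in for GPT-1's Transformer decoder, and all shapes, token IDs, and the auxiliary-loss weight are made up.

```python
# Rough sketch of GPT-1's two-stage training recipe (assumed from the paper):
# (1) generative pre-training with a next-token language-modeling objective on
# unlabeled text, (2) supervised fine-tuning with a task head, keeping the LM
# loss as an auxiliary term. The Embedding layer below is a stand-in for the
# Transformer decoder; it is NOT the real architecture.
import torch
import torch.nn as nn
import torch.nn.functional as F

vocab_size, d_model, num_classes = 100, 32, 2
body = nn.Embedding(vocab_size, d_model)       # stand-in for the Transformer decoder
lm_head = nn.Linear(d_model, vocab_size)       # next-token prediction head
task_head = nn.Linear(d_model, num_classes)    # classification head added for fine-tuning

tokens = torch.randint(0, vocab_size, (1, 16)) # one toy "unlabeled" sequence

# --- Stage 1: unsupervised pre-training, maximize log p(x_t | x_<t) ---
hidden = body(tokens[:, :-1])                  # states for positions 0..T-2
lm_logits = lm_head(hidden)                    # predict the next token at each position
lm_loss = F.cross_entropy(lm_logits.reshape(-1, vocab_size),
                          tokens[:, 1:].reshape(-1))

# --- Stage 2: supervised fine-tuning on a labeled task ---
label = torch.tensor([1])                      # toy task label
pooled = body(tokens).mean(dim=1)              # GPT-1 uses the last token's state; mean-pooling is a simplification
task_loss = F.cross_entropy(task_head(pooled), label)

# GPT-1 keeps the LM objective as an auxiliary loss during fine-tuning.
total_loss = task_loss + 0.5 * lm_loss
print(lm_loss.item(), task_loss.item(), total_loss.item())
```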
-
[Story Generation Study Week 01 : Fundamental of Text Generation] GPT-3 : Language Models are Few-Shot Learners (2020) Review AI/NLP 2022. 6. 28. 12:41
[Story Generation Study Week 01 : Fundamental of Text Generation] GPT-3 : Language Models are Few-Shot Learners (2020) Review [Story Generation Study Week 01 : Fundamental of Text Generation] GPT-1: Improving Language Understanding by Generative Pre-Training (2018) GPT-2: Language Models are Unsupervised Multitask Learners (2019) GPT-3: Language Models are Few-Shot Learners (2020) During the vacation, I really, really, really..