Story Ending Generation with Incremental Encoding
and Commonsense Knowledge
Jian Guan*, Yansen Wang*, Minlie Huang
Dept. of Computer Science & Technology, Tsinghua University, Beijing 100084, China
Institute for Artificial Intelligence, Tsinghua University (THUAI), China
Beijing National Research Center for Information Science and Technology, China
Introduction
[Figure 1: an example story ("Today is Halloween. Jack is so excited to go trick or treating tonight. He is going to dress up like a monster. The costume is real scary.") and its ending ("He hopes to get a lot of candy."), with context-clue phrases (Halloween; trick or treat; dress up, monster; costume is scary; get candy) linked to neighboring ConceptNet concepts (candy, scare, hallow, day, beast, dress) through relations such as MannerOf and RelatedTo.]
Figure 1: The Story Ending Generation task: given a story context consisting of a sentence sequence, generate a one-sentence ending that concludes the story and completes the plot.
Generating a good ending requires:
• Representing the context clues, which contain the key information for planning a reasonable ending
• Using implicit knowledge (e.g., commonsense knowledge) to facilitate understanding of the story and better predict what will happen next
Method
[Figure 2: the model encodes the story sentences ("Today is Halloween. Jack is so excited to go trick or treating tonight. He is going to dress up like a monster. The costume is real scary.") incrementally; at each step a multi-source attention (MSA) module attends over the preceding sentence's states and over knowledge graph representations (query word, neighboring entities, entities utilized in the ending) to generate the ending "He hopes to get a lot of candy."]
Figure 2: Model overview.
Task Overview:
Story Context × Commonsense Knowledge → Story Ending
Model: Sequence-to-Sequence (seq2seq) Framework
• Encoder-Decoder with Attention: the common framework for modeling the mapping from the context to the ending.
• Incremental Encoding: effectively represents the context clues, which may capture the key logical information.
• Multi-Source Attention: captures the relationship between words (or states) in the current sentence and those in the preceding sentence, and incorporates implicit knowledge that is beyond the text.
• Knowledge Graph Representation: extends (encodes) the meaning of a word by representing the knowledge graph of its neighboring concepts and relations.
• Loss Function: imposes supervision on both the encoding network and the decoding network.
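The incremental encoding scheme above can be sketched as follows. This is a minimal NumPy toy, not the paper's exact parameterization: a plain tanh recurrence stands in for the LSTM, and only the state-context half of multi-source attention is shown (the knowledge-context half would attend over graph vectors instead of hidden states); all dimensions and the `attend` helper are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # toy hidden size

def attend(query, keys):
    """Dot-product attention: softmax-weighted sum of `keys`, i.e. a context vector."""
    scores = keys @ query
    w = np.exp(scores - scores.max())
    w /= w.sum()
    return w @ keys

def incremental_encode(sentences):
    """Encode sentences one at a time. At each word, the recurrent input is
    augmented with a context vector attended over the *previous* sentence's
    hidden states -- the state-context part of multi-source attention."""
    prev_states = None
    for sent in sentences:
        states = []
        h = np.zeros(d)
        for x in sent:
            c = attend(h, prev_states) if prev_states is not None else np.zeros(d)
            h = np.tanh(x + h + c)  # toy cell standing in for an LSTM step
            states.append(h)
        prev_states = np.stack(states)
    return prev_states  # states of the last context sentence

sents = [rng.normal(size=(5, d)) for _ in range(4)]  # 4 context sentences, 5 words each
out = incremental_encode(sents)
print(out.shape)  # (5, 8)
```

A decoder would then attend over `out` (and over the knowledge-graph vectors) to generate the ending word by word.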
Experiments
Dataset
ROCStories corpus:
• Each story consists of five sentences; our task is to generate the ending given the first four sentences.
• 90,000 stories for training and 8,162 for evaluation.
• Average length of X1/X2/X3/X4/Y is 8.9/9.9/10.1/10.0/10.5.
ConceptNet:
• Only retrieve relations whose head entity and tail entity are both nouns or verbs and both occur in the SCT.
• Retain at most 10 triples when a word has too many.
• The average number of relations per query word is 3.4.
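The retrieval rules above amount to a simple filter over candidate triples. A minimal sketch, assuming triples arrive as `(head, relation, tail, head_pos, tail_pos)` tuples with coarse POS tags and that `vocab` holds the corpus vocabulary (the tuple layout and tag names are hypothetical, not ConceptNet's actual dump format):

```python
def filter_triples(triples, vocab, max_per_word=10):
    """Keep triples whose head and tail are nouns/verbs that both occur in the
    corpus vocabulary; retain at most `max_per_word` triples per query word."""
    kept = {}
    for head, rel, tail, head_pos, tail_pos in triples:
        if head_pos not in ("NOUN", "VERB") or tail_pos not in ("NOUN", "VERB"):
            continue  # rule 1: both entities must be nouns or verbs
        if head not in vocab or tail not in vocab:
            continue  # rule 1 (cont.): both must occur in the corpus
        bucket = kept.setdefault(head, [])
        if len(bucket) < max_per_word:  # rule 2: cap triples per query word
            bucket.append((head, rel, tail))
    return kept

triples = [
    ("oven", "AtLocation", "stove", "NOUN", "NOUN"),
    ("oven", "UsedFor", "burn", "NOUN", "VERB"),
    ("quickly", "RelatedTo", "fast", "ADV", "ADJ"),  # dropped: not noun/verb
]
vocab = {"oven", "stove", "burn"}
print(filter_triples(triples, vocab))
```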
Evaluation
Automatic Metrics: Perplexity (PPL), BLEU-1/BLEU-2
Manual Metrics:
• Grammar (Gram.): whether an ending is natural and fluent. Score 2 for endings without any grammar errors, 1 for endings with a few errors but still understandable, and 0 for endings with severe errors that are incomprehensible.
• Logicality (Logic.): whether an ending is reasonable and logically coherent with the story context. Score 2 for reasonable endings that are coherent in logic, 1 for relevant endings with some discrepancy between the ending and the given context, and 0 for totally incompatible endings.
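For concreteness, BLEU-1/BLEU-2 are built on clipped n-gram precision of a candidate ending against the reference. The sketch below computes just that precision component (full BLEU additionally applies a brevity penalty and geometric averaging); the example sentences are illustrative, not from the evaluation set.

```python
from collections import Counter

def ngram_precision(cand, ref, n):
    """Clipped n-gram precision of a candidate against a single reference:
    matched n-grams (each clipped by its reference count) over candidate n-grams."""
    c = Counter(tuple(cand[i:i + n]) for i in range(len(cand) - n + 1))
    r = Counter(tuple(ref[i:i + n]) for i in range(len(ref) - n + 1))
    overlap = sum(min(cnt, r[g]) for g, cnt in c.items())
    return overlap / max(1, sum(c.values()))

cand = "she was happy with her dinner".split()
ref = "she was happy with her burnt dinner".split()
print(ngram_precision(cand, ref, 1))  # -> 1.0 (all 6 unigrams appear in the reference)
print(ngram_precision(cand, ref, 2))  # -> 0.8 (4 of 5 bigrams match)
```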
Model PPL BLEU-1 BLEU-2 Gram. Logic.
Seq2Seq 18.97 0.1864 0.0410 1.74 0.70
HLSTM 17.26 0.2459 0.0771 1.57 0.84
HLSTM+Copy 19.93 0.2469 0.0783 1.66 0.90
HLSTM+MSA(GA) 15.75 0.2588 0.0809 1.70 1.06
HLSTM+MSA(CA) 12.53 0.2514 0.0825 1.72 1.02
IE (ours) 11.04 0.2514 0.0813 1.84 1.10
IE+MSA(GA) (ours) 9.72 0.2566 0.0854 1.68 1.26
IE+MSA(CA) (ours) 8.79 0.2682 0.0936 1.66 1.24
Table 1: Automatic and manual evaluation results.
Case Study
Context: Martha is cooking a special meal for her family.
She wants everything to be just right for when they eat.
Martha perfects everything and puts her dinner into the oven.
Martha goes to lay down for a quick nap.
Golden Ending: She oversleeps and runs into the kitchen to take out her burnt dinner.
Seq2Seq: She was so happy to have a new cake.
HLSTM: Her family and her family are very happy with her food.
HLSTM+Copy: Martha is happy to be able to eat her family.
HLSTM+GA: She is happy to be able to cook her dinner.
HLSTM+CA: She is very happy that she has made a new cook.
IE: She is very happy with her family.
IE+GA: When she gets back to the kitchen, she sees a burning light on the stove.
IE+CA: She realizes the food and is happy she was ready to cook.
Table 2: Generated endings from different models. Bold words denote the key entities and events in the story. Improper words in an ending are in italics and proper words are underlined.
Attention Visualization
Martha is cooking a special meal for her family .
She wants everything to be just right for when they eat .
Martha perfects everything and puts her dinner into the oven .
Martha goes to lay down for a quick nap .
When she gets back to the kitchen , she sees a burning light on the stove .
Entity | Commonsense knowledge
cook | (cook, AtLocation, kitchen), (cook, HasLastSubevent, eat)
meal | (meal, AtLocation, dinner), (meal, RelatedTo, eat)
eat | (eat, AtLocation, dinner)
oven | (oven, AtLocation, stove), (oven, RelatedTo, kitchen), (oven, UsedFor, burn)
Figure 3: An example illustrating how incremental encoding builds connections between
context clues.
AAAI, January 27-February 1, 2019, Honolulu, Hawaii, USA, Contact: guanj15@mails.tsinghua.edu.cn