Entropy 2013, 15, 4909-4931; doi:10.3390/e15114909
entropy
ISSN 1099-4300
www.mdpi.com/journal/entropy
Review
Applications of Entropy in Finance: A Review
Rongxi Zhou, Ru Cai and Guanqun Tong *
School of Economics and Management, Beijing University of Chemical Technology, Beijing 100029,
China; E-Mails: zrx103@126.com (R.Z.); cairu0404@sina.com (R.C.)
* Author to whom correspondence should be addressed; E-Mail: tonggq@buct.edu.cn;
Tel.: +86-10-64454290. Fax: +86-10-64438793.
Received: 27 September 2013; in revised form: 20 October 2013 / Accepted: 30 October 2013 /
Published: 11 November 2013
Abstract: Although entropy originated in thermodynamics, its concepts and relevant principles, especially the principles of maximum entropy and minimum cross-entropy, have been extensively applied in finance. In this paper, we review the concepts and principles of entropy, as well as their applications in the field of finance, especially in portfolio selection and asset pricing. Furthermore, we review the effects of these applications and compare them with those of other traditional and new methods.
Keywords: entropy; finance; the principle of maximum entropy; applications; portfolio
selection; asset pricing
PACS Codes: 89.65.-s Social and economic systems
1. Introduction
The history of the word "entropy" can be traced back to 1865, when the German physicist Rudolf Clausius tried to give a new name to irreversible heat loss, which he had previously called "equivalent-value". The word "entropy" was chosen because in Greek "en + tropein" means "content transformative" or "transformation content" [1]. Since then, entropy has played an important role in thermodynamics. Defined as the sum of "heat supplied" divided by "temperature" [2], it is central to the Second Law of Thermodynamics. It also helps measure the amount of order, disorder and/or chaos. Entropy can be defined and measured in many fields other than thermodynamics. For instance, in classical physics, entropy is defined as the quantity of energy incapable of producing physical movement. Von Neumann used the
density matrix to extend the notion of entropy to quantum mechanics. The entropy of a random variable
measures uncertainty in probability theory. Entropy quantifies the exponential complexity of a
dynamical system, that is, the average flow of information per unit of time in the theory of dynamical
systems. In sociology, entropy is the natural decay of structures [3].
Brissaud suggested that entropy could be understood in three aspects [4]: Firstly, in the field of
information, entropy represents the loss of information of a physical system observed by an outsider,
but within the system, entropy represents countable information. Secondly, entropy measures the
degree of freedom. A typical example is gas expansion: the degree of freedom of the position of gas
molecules increases with time. Finally, Brissaud believed that entropy is assimilated to disorder. However, this conception seems inappropriate to us, since temperature is a better measure of disorder.
The application of entropy in finance can be regarded as an extension of information entropy and probability entropy. It can be an important tool in portfolio selection and asset pricing.
Philippatos and Wilson were the first two researchers who applied the concept of entropy to
portfolio selection [5]. In their work, a mean-entropy approach was proposed and compared to traditional
methods by constructing all possible efficient portfolios from a randomly selected sample of monthly
closing prices on 50 securities over a period of 14 years. They found that the mean-entropy portfolios
were consistent with the Markowitz full-covariance and the Sharpe single-index models. Though their
research had several drawbacks, they made great contributions to the field of portfolio selection.
Since then many other scholars have enriched the portfolio selection theory with entropy concepts.
Some of them have proposed different forms of entropy. More generalized forms of entropy such as the
incremental entropy were created. Compared to the traditional portfolio selection theory, the theory based
on the incremental entropy emphasized that there was an optimal portfolio for a given probability of
return [6]. Some kinds of hybrid entropy were also used in portfolio selection. Because hybrid entropy can measure the risk of securities, some scholars applied it to the original portfolio selection models. For instance, Xu et al. [7] investigated portfolio selection problems by utilizing the
hybrid entropy to estimate the asset risk caused by both randomness and fuzziness. Usta and Kantar [8]
tested a mean-variance-skewness-entropy model, which performed better than traditional portfolio selection models in out-of-sample tests. After proposing a mean-variance-skewness
model for portfolio selection, Jana et al. [9] added an entropy objective function to generate a well-diversified asset portfolio within an optimal asset allocation. Zhang, Liu and Xu developed a possibilistic
mean-semivariance-entropy model for multi-period portfolio selection with transaction costs [10]. Zhou et
al. formulated a portfolio selection model based on information entropy, incremental entropy and skewness, in which the risk of the portfolio was measured by information entropy [11]. Smimou, Bector and Jacoby considered the derivation of portfolio modeling under a fuzzy situation [12]. Huang
proposed a simple method to identify the mean-entropic frontier and developed fuzzy mean-entropy
models [13]. Rödder et al. [14] presented a new theory to determine the portfolio weights by a rule-based
inference mechanism under both maximum entropy and minimum relative entropy.
Similarly, entropy has been applied in option pricing. A typical example is the Entropy Pricing Theory (EPT) introduced by Gulko [15], whose research indicated that the EPT can yield valuation results similar to those of the Sharpe-Lintner capital asset pricing model and the Black-Scholes formula. He also
applied the EPT to stock option pricing [16] and bond option pricing [17]. The EPT model was simple and
user-friendly, and its formalism made the Efficient Market Hypothesis operational.
The Principle of Maximum Entropy (MEP) plays an important role in option pricing as well. Back in
1996, Buchen and Kelly [18] used the MEP to estimate the distribution of an asset from a set of option
prices. Their research showed that the maximum entropy distribution was able to fit a known probability density function accurately and to reproduce option prices at different strike prices.
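The flavor of their approach can be conveyed by a toy problem (a hedged sketch, not Buchen and Kelly's actual procedure or data): find the maximum-entropy distribution over a finite set of terminal asset prices subject to a single mean constraint. The optimum has the exponential (Gibbs) form p_i ∝ exp(λx_i), and the multiplier λ can be located by bisection, since the constrained mean is monotone in λ:

```python
import math

def max_entropy_given_mean(xs, target_mean, lo=-1.0, hi=1.0, tol=1e-12):
    """Maximum-entropy distribution on points xs subject to a mean constraint.
    The optimum has the Gibbs form p_i = exp(lam * x_i) / Z; solve for the
    multiplier lam by bisection (the constrained mean is increasing in lam)."""
    x0 = sum(xs) / len(xs)  # center the points so exp() stays well-scaled

    def dist(lam):
        ws = [math.exp(lam * (x - x0)) for x in xs]
        z = sum(ws)
        return [w / z for w in ws]

    def mean_for(lam):
        return sum(pi * x for pi, x in zip(dist(lam), xs))

    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    return dist((lo + hi) / 2.0)

# Hypothetical terminal asset prices; pin the distribution's mean at 102
prices = [90.0, 95.0, 100.0, 105.0, 110.0]
p = max_entropy_given_mean(prices, 102.0)
print(sum(p))                                   # ≈ 1.0 (a valid distribution)
print(sum(pi * x for pi, x in zip(p, prices)))  # ≈ 102.0 (constraint met)
```

Buchen and Kelly's actual application imposes one constraint per option price (one per strike) rather than a single mean, which leads to a piecewise-exponential density, but the Lagrangian mechanics are the same.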
Buchen and Kelly’s method had a significant impact. It attracted many others to extend their research
and compare all kinds of methods. For example, Neri and Schneider [19] developed a simple robust test
for the maximum entropy distribution and tested several samples. They also compared their results to
Buchen and Kelly’s. Their method performed well in both of their examples from the Chicago Board Options Exchange, and they drew the same conclusions as Buchen and Kelly.
Besides the works mentioned above, the maximum entropy method could be used to estimate the
implied correlations between different currency pairs [20], to recover the risk-neutral density of future stock prices or other asset prices [21], and to infer the implied probability density and distribution from option
prices [22,23]. Stutzer and Hawkins [24,25] even used the MEP to price derivative securities such as
futures and swaps.
Another useful relevant principle is the Minimum Cross-Entropy Principle (MCEP). In 1951, this
principle was developed by Kullback and Leibler [26], and it has been one of the most important entropy
optimization principles. In 1996, Buchen and Kelly extended their own research from the MEP to the
MCEP [18]. Their results showed that the MCEP has the same effect as the MEP. Four years after
Buchen and Kelly’s research, Frittelli discovered sufficient conditions for a unique equivalent martingale measure that minimizes relative entropy [27]. He also provided a financial interpretation of the
minimal entropy martingale measure. The minimal entropy martingale measure could be used in option
pricing, which was proved by Benth and Groth [28]. Hunt and Devolder found an explicit characterization
of the minimal entropy martingale measure to deal with market incompleteness [29]. Their model again proved very useful in empirical implementations. Grandits minimized the Tsallis cross-entropy and showed its connection to the minimal entropy martingale measure [30]. In 2004, Branger used the
minimum cross-entropy measure to choose a stochastic discount factor (SDF) given a benchmark SDF
and to determine the Arrow-Debreu (AD) prices given some sets of benchmark AD prices [31].
The rest of this paper is arranged as follows: some of the major concepts of entropy used in finance
are presented in the next section. In Section 3 we review the principles of entropy useful in finance.
Section 4 introduces the applications of entropy in portfolio selection. Section 5 is devoted to the
applications of entropy in asset pricing, especially in option pricing. Section 6 briefly shows other
applications of entropy in finance and the last section concludes.
2. Concepts of Entropy Used in Finance
2.1. The Shannon Entropy
The Shannon entropy [32] of a probability measure p = (p_1, p_2, ..., p_n) on a finite set X is given by:

H(p) = -\sum_{i=1}^{n} p_i \ln p_i    (1)

where p_i \geq 0, \sum_{i=1}^{n} p_i = 1, and 0 ln 0 = 0.
When dealing with continuous probability distributions, a density function is evaluated at all values of the argument. Given a continuous probability distribution with a density function f(x), we can define its entropy as:

H(f) = -\int_{-\infty}^{+\infty} f(x) \ln f(x) \, dx    (2)

where \int_{-\infty}^{+\infty} f(x) \, dx = 1 and f(x) \geq 0.
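As a concrete illustration of the discrete definition, the Shannon entropy can be computed in a few lines (a minimal Python sketch; the example distributions are hypothetical):

```python
import math

def shannon_entropy(p):
    """H(p) = -sum p_i ln p_i, with the convention 0 ln 0 = 0."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

uniform = [0.25] * 4
print(shannon_entropy(uniform))          # ln 4 ≈ 1.3863, maximal for n = 4
print(shannon_entropy([1.0, 0.0, 0.0]))  # 0.0: a certain outcome carries no uncertainty
```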
2.2. The Tsallis Entropy
For any positive real number α (α ≠ 1), the Tsallis entropy of order α of a probability measure p on a finite set X is defined as [33]:

H_\alpha(p) = \frac{1}{\alpha - 1}\left(1 - \sum_{i=1}^{n} p_i^{\alpha}\right)    (3)

In the limit α → 1, it recovers the Shannon entropy. Although these entropies are most often named after Tsallis due to his work in the area [33], they had been studied by others long before him. For example, Havrda and Charvát [34] introduced a similar formula in information theory in 1967, and in 1982, Patil and Taillie used H_α as a measure of biological diversity [35]. The characterization of the Tsallis entropy is the same as that of the Shannon entropy, except that for the Tsallis entropy the degree of homogeneity under the convex linearity condition is α instead of 1.
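A short numerical sketch (Python, with a hypothetical distribution) illustrates both the definition and the fact that the order-α Tsallis entropy approaches the Shannon entropy as α → 1:

```python
import math

def tsallis_entropy(p, alpha):
    """Tsallis entropy of order alpha: (1 - sum p_i^alpha) / (alpha - 1);
    the limit alpha -> 1 is the Shannon entropy."""
    if alpha == 1:
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return (1.0 - sum(pi ** alpha for pi in p)) / (alpha - 1.0)

p = [0.5, 0.3, 0.2]
print(tsallis_entropy(p, 2.0))     # order-2 Tsallis entropy: 1 - sum p_i^2
print(tsallis_entropy(p, 1.0001))  # close to the Shannon entropy of p
print(tsallis_entropy(p, 1))       # Shannon entropy of p
```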
2.3. The Kullback Cross-entropy
If we have no information other than that each p_i \geq 0 and that the probabilities sum to unity, we have to assume the uniform distribution, by Laplace's principle of insufficient reason. This is a special case of the principle of maximum uncertainty, according to which the most uncertain distribution is the uniform distribution. In other words, being most uncertain means being closest to the uniform distribution. We therefore need a measure of the "distance" between two probability distributions p = (p_1, p_2, ..., p_n) and q = (q_1, q_2, ..., q_n). Kullback and Leibler proposed the Kullback cross-entropy, one of the simplest measures satisfying all of our requirements for such a distance [26]:

D(p \| q) = \sum_{i=1}^{n} p_i \ln \frac{p_i}{q_i}    (4)
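The Kullback cross-entropy is easy to compute and exhibits the two properties that make it usable as a "distance": it is nonnegative, and it vanishes exactly when the two distributions coincide. A minimal Python sketch (the distributions are hypothetical):

```python
import math

def kullback_cross_entropy(p, q):
    """D(p||q) = sum p_i ln(p_i / q_i); terms with p_i = 0 contribute 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.3, 0.2]
uniform = [1.0 / 3.0] * 3
print(kullback_cross_entropy(p, uniform))  # > 0: p differs from uniform
print(kullback_cross_entropy(p, p))        # 0.0: zero distance to itself
```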
2.4. The Tsallis Relative Entropy
In 1998, Tsallis [36] introduced a generalization of the Kullback cross-entropy called the Tsallis relative entropy or q-relative entropy. It is given as:

D_q(p \| r) = \frac{1}{q - 1}\left(\sum_{i=1}^{n} p_i^{q} r_i^{1-q} - 1\right)    (5)

where p = (p_1, p_2, ..., p_n) is a probability distribution and r = (r_1, r_2, ..., r_n) is a reference distribution. For uniform r, the Tsallis relative entropy reduces, up to constants depending only on n and q, to the negative Tsallis entropy -H_q(p) (with α = q) described in Subsection 2.2 and formula (3).
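A small numerical check (Python; the distributions are hypothetical) confirms two properties of the q-relative entropy (sum p_i^q r_i^(1-q) - 1)/(q - 1): it recovers the Kullback cross-entropy as q → 1, and it vanishes when the two distributions coincide:

```python
import math

def tsallis_relative_entropy(p, r, q):
    """q-relative entropy D_q(p||r) = (sum p_i^q r_i^(1-q) - 1) / (q - 1)."""
    return (sum(pi ** q * ri ** (1.0 - q) for pi, ri in zip(p, r)) - 1.0) / (q - 1.0)

p = [0.5, 0.3, 0.2]
r = [0.4, 0.4, 0.2]
kl = sum(pi * math.log(pi / ri) for pi, ri in zip(p, r))  # Kullback limit
print(tsallis_relative_entropy(p, r, 1.0001))  # close to kl
print(kl)
```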
2.5. The Fuzzy Entropy
Fuzzy entropy is an important research topic in fuzzy set theory. De Luca and Termini [37] were the first to define a non-probabilistic entropy with the use of fuzzy theory. Other scholars, such as Bhandari and Pal [38], Kosko [39], Pal and Bezdek [40], and Yager [41], have also given their own definitions. These
entropy definitions are characterized by the uncertainty resulting from linguistic vagueness instead of
information deficiency.
Based on credibility, Li and Liu [42,43] proposed a new definition of fuzzy entropy characterized by the uncertainty resulting from information deficiency, i.e., from the inability to predict the specified value accurately.
A general definition of the expected value of a fuzzy variable ξ with membership function μ is given as:

E[\xi] = \int_0^{+\infty} \mathrm{Cr}\{\xi \geq r\} \, dr - \int_{-\infty}^{0} \mathrm{Cr}\{\xi \leq r\} \, dr    (6)

where \mathrm{Cr}\{\xi \in A\} = \frac{1}{2}\left(\sup_{x \in A} \mu(x) + 1 - \sup_{x \notin A} \mu(x)\right) is the credibility of the event \{\xi \in A\}, and A is any subset of the real numbers R. The function μ, with μ(x) = Pos{ξ = x}, is also referred to as the possibility distribution of ξ.
Provided that at least one of the two integrals is finite, Equation (6) is a type of Choquet integral. The Choquet integral is usually regarded as the generalization of the mathematical expectation to non-additive measures.
Then, when ξ is a discrete fuzzy variable taking values x_1, x_2, ..., its entropy is defined as:

H[\xi] = \sum_{i} S(\mathrm{Cr}\{\xi = x_i\})    (7)

where S(t) = -t \ln t - (1 - t) \ln(1 - t), with the convention that 0 \ln 0 = 0, and:

H[\xi] = \int_{-\infty}^{+\infty} S(\mathrm{Cr}\{\xi = x\}) \, dx    (8)

when ξ is a continuous fuzzy variable.
If fuzzy variables ξ and η are continuous, the cross-entropy of ξ from η is defined as:

D[\xi; \eta] = \int_{-\infty}^{+\infty} T(\mathrm{Cr}\{\xi = x\}, \mathrm{Cr}\{\eta = x\}) \, dx    (9)

where T(s, t) = s \ln \frac{s}{t} + (1 - s) \ln \frac{1 - s}{1 - t}.
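As a numerical sanity check of the credibilistic entropy (a Python sketch; the triangular fuzzy variable and its parameters are chosen purely for illustration), integrating S(Cr{ξ = x}) = S(μ(x)/2) over the support of a triangular fuzzy variable (a, b, c) yields (c − a)/2, a known closed form in credibility theory:

```python
import math

def S(t):
    """S(t) = -t ln t - (1 - t) ln(1 - t), with S(0) = S(1) = 0."""
    if t <= 0.0 or t >= 1.0:
        return 0.0
    return -t * math.log(t) - (1 - t) * math.log(1 - t)

def triangular_mu(x, a, b, c):
    """Membership function of a triangular fuzzy variable (a, b, c)."""
    if a <= x <= b:
        return (x - a) / (b - a)
    if b < x <= c:
        return (c - x) / (c - b)
    return 0.0

# Midpoint-rule integration of S(Cr{xi = x}) = S(mu(x)/2) over the support
a, b, c = 0.0, 1.0, 3.0
n = 100_000
h = (c - a) / n
H = sum(S(triangular_mu(a + (i + 0.5) * h, a, b, c) / 2.0) for i in range(n)) * h
print(H)  # close to (c - a) / 2 = 1.5
```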