GB/T 5271.16-1986 Data processing vocabulary Part 16 Information theory

Basic Information

Standard ID: GB/T 5271.16-1986

Standard Name: Data processing vocabulary Part 16 Information theory

Chinese Name: 数据处理词汇 16部分 信息论

Standard category: National Standard (GB)

Status: Abolished

Date of release: 1986-07-31

Date of implementation: 1987-05-01

Date of expiration: 2008-12-01

Standard classification number

Standard ICS number: Information technology, office machinery and equipment >> 35.020 Information technology (IT) general

Standard Classification Number: Electronic Components and Information Technology >> Information Processing Technology >> L70 Comprehensive Information Processing Technology

Associated standards

Alternative situation: Replaced by GB/T 5271.16-2008

Adoption status: = ISO 2382/16-78

Publication information

Publishing house: China Standards Press

Publication date: 1987-05-01

Other information

Release date: 1986-07-31

Review date: 2004-10-14

Drafting unit: Chengdu Institute of Telecommunication Engineering

Focal point unit: National Information Technology Standardization Technical Committee

Publishing department: National Standardization Administration

Competent authority: National Standardization Administration

Introduction to standards:

This standard applies to the design, production, use, maintenance, management, scientific research, teaching and publishing in all fields related to electronic computers and information processing.

Some standard content:

1 Overview
1.1 Introduction
National Standard of the People's Republic of China
Data processing -- Vocabulary
Part 16: Information theory
Data processing -- Vocabulary -- Section 16: Information theory
UDC 681.3:001.4
GB 5271.16-86
This vocabulary consists of about twenty parts. This part deals mainly with Shannon's information theory. It includes the basic terms used in a general introduction to information theory, as well as derived quantitative terms that are very useful in practical applications. A particular system of mathematical notation is used throughout this part, but the purpose is not to standardize that notation, nor to set a precedent for other publications.
This part of the vocabulary is an equivalent adoption of the international standard ISO 2382/16-1978 "Data processing -- Vocabulary -- Section 16: Information theory".
1.2 Scope
This vocabulary presents terms and concise definitions for some concepts in the field of data processing, and clarifies the relationships between the concepts, in order to facilitate domestic and international communication.
The vocabulary covers the main aspects of data processing, including the principal processing processes and the types of equipment used, data representation, data organization, data description, computer programming and operation, external devices, data communication, and other special applications.
1.3 Scope of application
This standard applies to the design, production, use, maintenance, management, scientific research, teaching and publishing in all fields related to electronic computers and information processing.
2 Principles and rules to be followed
The following rules are described in detail in GB 5271.1-85 "Data Processing Vocabulary Part 01 Basic Terms". They also apply to this part and are not repeated here; only the heading of each item is listed:
2.1 Definition of entries;
2.2 Composition of entries;
2.3 Classification of entries;
2.4 Choice of terms and definitions;
2.5 Polysemous terms;
2.6 Abbreviations;
2.7 Use of parentheses;
2.8 Use of square brackets;
2.9 Use of boldface terms and asterisks in definitions;
2.10 Spelling;
2.11 Compilation of index tables.
Promulgated by the National Bureau of Standards on July 31, 1986
Implemented on May 1, 1987
3 Terms and Definitions
16 Information theory
16.01 General terms
16.01.01 information theory
The branch of learning that studies measures of information and their properties.
16.01.02 communication theory
The branch of mathematics that studies the probabilistic characteristics of the transmission of messages in the presence of noise and other disturbances.
16.01.03 measure of information
A suitable function of the probability of occurrence of an event, or of a sequence of events, in a set of random events.
Note: In information theory, the term "event" has the same meaning as in probability theory. For example, an event may be the occurrence of a given element of a set, or the occurrence of a particular character or word at a given position in a message.
16.02 Messages and communication
16.02.01 message (in information theory and communication theory)
An ordered sequence of characters used to convey information.
16.02.02 message source; information source
The part of a communication system from which messages are sent.
16.02.03 message sink
The part of a communication system that receives messages.
16.02.04 channel (in communication theory)
The part of a communication system that connects a message source to a message sink.
Notes:
1 An encoder may be inserted between the message source and the channel input, and a decoder between the channel output and the message sink. In general, neither is part of the channel; in some cases they may be regarded as parts of the source and the sink, respectively.
2 In Shannon's information theory, a channel can be characterized by a set of conditional probabilities: the conditional probabilities of all possible messages being received at the sink when the source sends any given message.
16.02.05 symmetric binary channel
A channel used to transmit messages consisting of binary characters, having the property that the conditional probabilities of either character being changed into the other are equal.
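To make note 2 of 16.02.04 concrete, here is a minimal Python sketch (not part of the standard; names are illustrative) that represents a channel by its conditional probabilities p(received | sent) and checks the defining symmetry of a symmetric binary channel:

```python
# Sketch only: a channel as a row-stochastic matrix of conditional
# probabilities p(received | sent), per note 2 of 16.02.04.

def symmetric_binary_channel(crossover: float) -> list[list[float]]:
    """Return p(received | sent) for a symmetric binary channel."""
    p = crossover
    return [[1 - p, p],   # sent 0: received 0 with prob 1-p, received 1 with p
            [p, 1 - p]]   # sent 1: received 0 with prob p,   received 1 with 1-p

channel = symmetric_binary_channel(0.1)
# The defining property of 16.02.05: both crossover probabilities are equal.
assert channel[0][1] == channel[1][0]
```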
16.02.06 stationary message source; stationary information source
A message source for which the probability of occurrence of each message is independent of the time of its occurrence.
16.03 Basic quantitative terms
16.03.01 decision content
A logarithmic measure of the number of decisions needed to select a given event from a finite number of mutually exclusive events. Its mathematical expression is:

H_0 = \log_a n

where n is the number of events.
Example: see the example at the end of 16.03.
Notes:
1 The note to 16.01.03 also applies to this definition.
2 The base a of the logarithm determines the unit used.
3 The decision content is independent of the probabilities of occurrence of the events; in some applications, however, these probabilities may also be taken to be equal.
16.03.02 information content
The measure of information given by the occurrence of an event of definite probability. Its mathematical expression is: this measure I(x) of the event x is the logarithm of the reciprocal of the probability p(x) of the occurrence of the event, that is,

I(x) = \log_a \frac{1}{p(x)} = -\log_a p(x)

Example: see the example at the end of 16.03.
16.03.03 entropy; mean information content; average information content
The mean of the information content given by the occurrence of any event in a complete set of mutually exclusive events of definite probabilities. Its mathematical expression is: for a set of events x_1, ..., x_n with probabilities p(x_1), ..., p(x_n), this mean H(x) equals the mathematical expectation of the information content I(x_i) of the events:

H(x) = \sum_{i=1}^{n} p(x_i) I(x_i) = -\sum_{i=1}^{n} p(x_i) \log_a p(x_i)

Example: see the example at the end of 16.03.
16.03.04 relative entropy
The ratio of the entropy H to the decision content H_0. Its mathematical expression is:

H_r = \frac{H}{H_0}

16.03.05 redundancy (in information theory)
The difference between the decision content H_0 and the entropy H. Its mathematical expression is:

R = H_0 - H

Note: Generally speaking, a suitable code can reduce the number of characters needed to represent a message. The redundancy measures by how much such coding can shorten the average length of the messages.
16.03.06 shannon; binary unit of information content
A unit of measure of information, equal to the decision content of a set of two mutually exclusive events when expressed by a logarithm to base 2.
Example: the decision content of a character set of 8 characters equals 3 shannons (\log_2 8 = 3).
Note: The use of "bit" as a name for the binary unit of information content must be strictly avoided.
16.03.07 hartley; decimal unit of information content
A unit of measure of information, equal to the decision content of a set of ten mutually exclusive events when expressed by a logarithm to base 10.
Example: the decision content of a character set of 8 characters equals 0.903 hartley (\log_{10} 8 = 0.903).
16.03.08 natural unit of information content; nat (abbreviation)
A unit of measure of information expressed by a natural logarithm.
Example: the decision content of a character set of 8 characters equals 2.079 natural units (\log_e 8 = 3 \log_e 2 = 2.079).

Example (for 16.03): suppose a character set consists of the three characters a, b and c, and that for a given message source their probabilities of occurrence are p(a) = 1/2, p(b) = p(c) = 1/4.
(1) The decision content of this character set is:
H_0 = \log_2 3 = 1.585 shannons = \log_{10} 3 = 0.477 hartley = \log_e 3 = 1.099 natural units
(2) The information contents of the characters are:
I(a) = \log_2 2 = 1 shannon = \log_{10} 2 = 0.301 hartley = \log_e 2 = 0.693 natural units
I(b) = I(c) = \log_2 4 = 2 shannons = \log_{10} 4 = 0.602 hartley = \log_e 4 = 1.386 natural units
(3) If the occurrences of these three characters in any message are mutually independent, the entropy of this message source is:
H = 1.5 shannons = 1.5 × 0.301 = 0.451 hartley = 1.5 × 0.693 = 1.040 natural units
(4) The redundancy of this message source is:
R = H_0 - H = 0.085 shannon = 0.026 hartley = 0.059 natural units
16.04 Derived quantitative terms
16.04.01 relative redundancy
The ratio of the redundancy R to the decision content H_0. Its mathematical expression is:

r = \frac{R}{H_0}
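The worked example above is easy to verify mechanically. The following Python sketch (illustrative, not part of the standard) recomputes the decision content, information content, entropy, redundancy, relative entropy and relative redundancy of the character set {a, b, c} in all three units:

```python
import math

# Character set of the example at the end of 16.03.
probs = {"a": 0.5, "b": 0.25, "c": 0.25}

for base, unit in ((2, "shannon"), (10, "hartley"), (math.e, "natural unit")):
    log = lambda v: math.log(v, base)
    H0 = log(len(probs))                          # decision content (16.03.01)
    I = {c: -log(p) for c, p in probs.items()}    # information content (16.03.02)
    H = sum(p * I[c] for c, p in probs.items())   # entropy (16.03.03)
    R = H0 - H                                    # redundancy (16.03.05)
    print(f"{unit}: H0={H0:.3f} H={H:.3f} R={R:.3f} "
          f"Hr={H / H0:.3f} r={R / H0:.3f}")
# Base 2 prints H0=1.585, H=1.500, R=0.085, matching the example.
```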
16.04.02 conditional information content
The information content given by the occurrence of an event of definite conditional probability, when another event is known to have occurred. Its mathematical expression is: if the occurrence of an event in the set x_1, ..., x_n depends on the occurrence of an event y_k in another set y_1, ..., y_m, this conditional information content I(x_j | y_k) is the logarithm of the reciprocal of the conditional probability p(x_j | y_k) of the occurrence of x_j given that y_k has occurred:

I(x_j \mid y_k) = \log_a \frac{1}{p(x_j \mid y_k)}

16.04.03 joint information content
The information content given by the occurrence of two events of definite joint probability. Its mathematical expression is: if x_j and y_k are events in the sets x_1, ..., x_n and y_1, ..., y_m respectively, this measure I(x_j, y_k) is the logarithm of the reciprocal of the joint probability p(x_j, y_k) of the two events:

I(x_j, y_k) = \log_a \frac{1}{p(x_j, y_k)}
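A small Python sketch (with an assumed joint probability table; the numbers are illustrative only) showing how 16.04.02 and 16.04.03 are computed for one pair of events:

```python
import math

# Assumed joint distribution p(x, y) over two events each; values illustrative.
p_joint = {("x1", "y1"): 0.4, ("x1", "y2"): 0.1,
           ("x2", "y1"): 0.2, ("x2", "y2"): 0.3}

def p_y(y):
    """Marginal probability of y, summed over x."""
    return sum(p for (_, yk), p in p_joint.items() if yk == y)

x, y = "x1", "y1"
I_joint = -math.log2(p_joint[(x, y)])          # I(x, y) = log 1/p(x, y)
I_cond = -math.log2(p_joint[(x, y)] / p_y(y))  # I(x | y) = log 1/p(x | y)
print(f"I({x},{y}) = {I_joint:.3f} Sh, I({x}|{y}) = {I_cond:.3f} Sh")
```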
16.04.04 conditional entropy; mean conditional information content; average conditional information content
The mean of the information content given by the occurrence of any event in a complete set of mutually exclusive events of definite conditional probabilities, when the occurrence of an event in another set of mutually exclusive events is known. Its mathematical expression is: if the occurrence of the events in the set x_1, ..., x_n depends on the occurrence of the events in another set y_1, ..., y_m, and the joint probability that x_j and y_k occur together is p(x_j, y_k), then this mean H(x|y) equals the mathematical expectation of the conditional information content I(x_j | y_k) over all pairs of events:

H(x \mid y) = \sum_j \sum_k p(x_j, y_k) I(x_j \mid y_k) = \sum_j \sum_k p(x_j, y_k) \log_a \frac{1}{p(x_j \mid y_k)}

16.04.05 equivocation
The conditional entropy of the messages sent by a message source, given the messages received at the message sink.
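Continuing the joint-table sketch above (same assumed, illustrative table), conditional entropy is the expectation of the conditional information content over the joint distribution; when x is the transmitted message and y the received one, H(x|y) is the equivocation:

```python
import math

# Same assumed joint distribution as in the previous sketch.
p_joint = {("x1", "y1"): 0.4, ("x1", "y2"): 0.1,
           ("x2", "y1"): 0.2, ("x2", "y2"): 0.3}

# Marginal p(y), summed over x.
p_y = {}
for (_, yk), p in p_joint.items():
    p_y[yk] = p_y.get(yk, 0.0) + p

# H(x|y) = sum over (j,k) of p(x_j, y_k) * log 1/p(x_j | y_k), in shannons.
# Note p(x_j | y_k) = p(x_j, y_k) / p(y_k).
H_cond = sum(p * math.log2(p_y[yk] / p) for (xj, yk), p in p_joint.items())
print(f"H(x|y) = {H_cond:.3f} shannons")
```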
mean transinformation (content)
The mathematical expectation of the transinformation content over all pairs of events (x_j, y_k):

T = \sum_j \sum_k p(x_j, y_k) T(x_j, y_k)

Note: The mean transinformation content is also equal to the sum of ...
mean entropy (per character)
The character average of the mean information content of all possible messages of a stationary message source: for messages of m characters with mean information content H_m, the mean entropy per character is

H' = \lim_{m \to \infty} \frac{H_m}{m}

Note: The mean entropy per character can be expressed in units such as shannons per character. If the message source is not stationary, the limit of H_m/m may not exist.
16.04.10 (average) information rate
The mean entropy per character per unit time. Its mathematical expression is: the information rate H* equals the mean entropy per character H' divided by the mathematical expectation \bar{t} of the duration of a character in the character set:

H^* = \frac{H'}{\bar{t}}

where \bar{t} = \sum_i t_i p(x_i).
Note: The (average) information rate can be expressed in units such as shannons per second.
mean transinformation content (per character)
The character average of the mean transinformation content of all possible messages of a stationary message source. Its mathematical expression is: for a sequence pair consisting of an input sequence of m characters and the corresponding output sequence of m characters, let T_m denote the mean transinformation content over all such sequence pairs; then the mean transinformation content per character T' is

T' = \lim_{m \to \infty} \frac{T_m}{m}

Note: The mean transinformation content per character can be expressed in units such as shannons per character.
(average) transinformation rate
The mean transinformation content per character per unit time. Its mathematical expression is: the transinformation rate T* equals the mean transinformation content per character T' divided by the mathematical expectation \bar{t} of the duration of a composite event (x_j, y_k):

T^* = \frac{T'}{\bar{t}}

where \bar{t} = \sum_j \sum_k t_{jk} p(x_j, y_k).
Note: The transinformation rate can be expressed in units such as shannons per second.
channel capacity
A measure of the ability of a given channel, subject to certain constraints, to transmit messages from a specified message source. It can be expressed as the maximum possible value of the mean transinformation content per character, or the maximum possible value of the average transinformation rate, that can be achieved with an arbitrarily small probability of error by using an appropriate code.
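As an illustration of channel capacity for the symmetric binary channel of 16.02.05: the closed form C = 1 - H2(p) shannons per character, with H2 the binary entropy function and p the crossover probability, is standard information theory rather than text from this standard. A minimal Python sketch:

```python
import math

def binary_entropy(p: float) -> float:
    """H2(p) in shannons; 0 at the endpoints by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(crossover: float) -> float:
    """Capacity of a symmetric binary channel, shannons per character."""
    return 1.0 - binary_entropy(crossover)

print(f"C(0.1) = {bsc_capacity(0.1):.3f} shannons per character")  # ~0.531
```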
Appendix A
Chinese index
(reference)

equivocation
symmetric binary channel
hartley
mutual information
decision content
joint information content
mean information content (per character)
mean transinformation content (per character)
information rate (per character)
spread
irrelevance
mean conditional information content
mean information content
(average) information rate
mean transinformation (content)
(average) transinformation rate
stationary information source
stationary message source
redundancy
conditional entropy
conditional information content
communication theory
relative redundancy
relative entropy
message sink
message source
channel capacity
measure of information
information content
binary unit of information content
decimal unit of information content
natural unit of information content
information theory
information source
transinformation (content)
Appendix B
English index
(reference)

average conditional information content
average information content
average information content (per character)
(average) information rate
average transinformation (content)
(average) transinformation rate
binary unit of information content
channel
channel capacity
communication theory
conditional entropy
conditional information content
decimal unit of information content
decision content
entropy
equivocation
hartley
information content
information rate (per character)
information source
information theory
irrelevance
joint information content
mean conditional information content
mean entropy (per character)
mean information content
mean information content (per character)
mean transinformation (content)
mean transinformation content (per character)
measure of information
message
message sink
message source
mutual information
natural unit of information content
prevarication
redundancy
relative entropy
relative redundancy
shannon
spread
stationary information source
stationary message source
symmetric binary channel
transferred information
transinformation (content)
transmitted information
Additional notes:
This standard was proposed by the Ministry of Electronics Industry of the People's Republic of China.
This standard was drafted by the Chengdu Institute of Telecommunication Engineering and the Hangzhou Institute of Electronic Industry.
The main drafters of this standard are Zhang Hongji, Chong Kuang, Zhang Zhihao, Yang Xuming, Wu Zhongxian, Xu Zixing, Zhang Yiting, Chen Pei, Xie Zhiliang, Xiang Weiliang, and Lin Ning.