Book Introduction

Information Theory and Coding (English Edition) [2025 | PDF | EPUB | MOBI | Kindle e-book edition, Baidu Cloud download]

Information Theory and Coding (English Edition)
  • By 梁建武, 郭迎, 罗喜英 et al. (eds.)
  • Publisher: 中国水利水电出版社 (China Water & Power Press), Beijing
  • ISBN: 9787508455693
  • Publication year: 2008
  • Listed page count: 197
  • File size: 77 MB
  • File page count: 209
  • Subject headings: Information theory - higher education - textbook - English; Source coding - coding theory - higher education - textbook - English; Channel coding - coding theory - higher education - textbook - English

PDF Download


Click here for the online PDF e-book download [recommended: cloud extraction, fast and convenient]; this downloads the book directly in PDF format and works on both mobile and PC.
Torrent download [fast over BT]. Friendly reminder: please use the BT client FDM. Other links: software download page, direct-link download [convenient but slower], [read this book online], [get the decompression code online].

Download Notes

Information Theory and Coding (English Edition), PDF e-book download

The downloaded file is a RAR archive. Use decompression software to extract it and obtain the book in PDF format.

We recommend downloading with the BT tool Free Download Manager (FDM for short), which is free, ad-free, and cross-platform. All resources on this site are packaged as BT torrents, so a dedicated BT client such as BitComet, qBittorrent, or uTorrent is required. Thunder (迅雷) is not recommended for now, since this title is not yet a popular resource on its network; once it becomes popular, Thunder will work as well.

(The file page count should be greater than the listed page count, except for multi-volume e-books.)

Note: every archive on this site has a decompression password. Click here to download the archive extraction tool.
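For readers who prefer to script the extraction step described above, a minimal Python sketch follows. It assumes the archive has already been downloaded with a BT client, that the third-party rarfile package and an unrar backend are installed, and it uses a placeholder file name and password (the real decompression code comes from the site's link above).

```python
# Minimal sketch: extract the password-protected RAR to obtain the PDF.
# Assumes the third-party "rarfile" package plus an unrar backend are installed.
import rarfile

ARCHIVE = "information_theory_and_coding.rar"   # hypothetical file name
PASSWORD = "decompression-code-from-the-site"    # placeholder for the real code

with rarfile.RarFile(ARCHIVE) as rf:
    rf.extractall(path="extracted", pwd=PASSWORD)  # writes the PDF into ./extracted
    print(rf.namelist())                           # list the archive contents
```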

Table of Contents

Chapter 1 Introduction    1
Contents    1
Before it starts, there is something must be known    1
1.1 What is Information    2
1.2 What's Information Theory?    4
1.2.1 Origin and Development of Information Theory    4
1.2.2 The application and achievement of Information Theory methods    6
1.3 Formation and Development of Information Theory    7
Questions and Exercises    8
Biography of Claude Elwood Shannon    8

Chapter 2 Basic Concepts of Information Theory    11
Contents    11
Preparation knowledge    11
2.1 Self-information and conditional self-information    12
2.1.1 Self-Information    12
2.1.2 Conditional Self-Information    14
2.2 Mutual information and conditional mutual information    14
2.3 Source entropy    16
2.3.1 Introduction of entropy    16
2.3.2 Mathematics description of source entropy    17
2.3.3 Conditional entropy    20
2.3.4 Union entropy (Communal entropy)    20
2.3.5 Basic nature and theorem of source entropy    21
2.4 Average mutual information    26
2.4.1 Definition    26
2.4.2 Physics significance of average mutual information    27
2.4.3 Properties of average mutual information    28
2.5 Continuous source    38
2.5.1 Entropy of the continuous source (also called differential entropy)    39
2.5.2 Mutual information of the continuous random variable    44
Questions and Exercises    44
Additional reading materials    46

Chapter 3 Discrete Source Information    51
Contents    51
3.1 Mathematical model and classification of the source    51
3.2 The discrete source without memory    54
3.3 Multi-marks discrete steady source    60
3.4 Source entropy of discrete steady source and limit entropy    67
3.5 The source redundancy and the information difference    71
3.6 Markov information source    71
Exercise    77

Chapter 4 Channel and Channel Capacity    79
Contents    79
4.1 The model and classification of the channel    79
4.1.1 Channel Models    79
4.1.2 Channel classifications    80
4.2 Channel doubt degree and average mutual information    82
4.2.1 Channel doubt degree    82
4.2.2 Average mutual information    82
4.2.3 Properties of mutual information function    83
4.2.4 Relationship between entropy, channel doubt degree and mutual information    86
4.3 The discrete channel without memory and its channel capacity    88
4.4 Channel capacity    89
4.4.1 Concept of channel capacity    89
4.4.2 Discrete channel without memory and its channel capacity    91
4.4.3 Continuous channel and its channel capacity    99

Chapter 5 Lossless source coding    106
Contents    106
5.1 Lossless coder    106
5.2 Lossless source coding    110
5.2.1 Fixed length coding theorem    110
5.2.2 Unfixed length source coding    113
5.3 Lossless source coding theorems    115
5.3.1 Classification of code and main coding method    115
5.3.2 Kraft theorem    116
5.3.3 Lossless unfixed source coding theorem (Shannon First theorem)    116
5.4 Pragmatic examples of lossless source coding    120
5.4.1 Huffman coding    120
5.4.2 Shannon coding and Fano coding    128
5.5 The Lempel-Ziv algorithm    130
5.6 Run-Length Encoding and the PCX format    132
Questions and Exercises    134

Chapter 6 Limited distortion source coding    137
Contents    137
6.1 The start point of limit distortion theory    138
6.2 Distortion measurement    140
6.2.1 Distortion function    140
6.2.2 Average distortion    142
6.3 Information rate distortion function    143
6.4 Property of R(D)    145
6.4.1 Minimum of D and R(D)    145
6.4.2 Dmax and R(Dmax)    151
6.4.3 The under convex function of R(D)    154
6.4.4 The decreasing function of R(D)    154
6.4.5 R(D) is a continuous function of D    155
6.5 Calculation of R(D)    156
6.5.1 Calculation of R(D) of binary symmetric source    156
6.5.2 Calculation of R(D) of Gauss source    158
6.6 Limited distortion source encoding theorem    159
Additional material for this chapter    161
Questions and exercises    168

Chapter 7 Channel Coding Theory    170
Contents    170
7.1 Channel coding theorem for noisy channel    170
7.2 Introduction: the generator and parity-check matrices    174
7.3 Syndrome decoding on q-ary symmetric channels    177
7.4 Hamming geometry and code performance    179
7.5 Hamming codes    180
7.6 Cyclic code    181
7.7 Syndrome decoding on general q-ary channels    191
Questions and exercises    194

Bibliography    197
