Search - entropy theory

Search list

[Program doc] H[1].26L中熵编码原理分析及改进

Description: Analysis and improvement of the entropy coding in H.26L (PDF).
Platform: | Size: 174080 | Author: 罗生 | Hits:

[matlab] xxl

Description: MATLAB implementations of the various entropies of information theory, including the computation of self-information, mutual information, conditional entropy, joint entropy, redundancy, and related quantities.
Platform: | Size: 1024 | Author: 孟旭 | Hits:
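
The quantities this package implements all follow from a joint probability table. As a rough illustration (the package itself is MATLAB; this Python sketch is not the uploaded code), they can be computed as:

```python
import numpy as np

def entropies(p_xy):
    """Basic information-theoretic quantities from a joint pmf p(x, y)."""
    p_xy = np.asarray(p_xy, dtype=float)
    p_xy = p_xy / p_xy.sum()                   # normalise to a valid distribution
    p_x, p_y = p_xy.sum(axis=1), p_xy.sum(axis=0)

    def H(p):                                  # entropy in bits, ignoring zero cells
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    H_x, H_y, H_xy = H(p_x), H(p_y), H(p_xy.ravel())
    return {
        "self-information of x_i": -np.log2(p_x),
        "H(X)": H_x,
        "H(X,Y)": H_xy,                            # joint entropy
        "H(Y|X)": H_xy - H_x,                      # conditional entropy
        "I(X;Y)": H_x + H_y - H_xy,                # mutual information
        "redundancy": 1 - H_x / np.log2(len(p_x)), # relative to a uniform source
    }

# Example: a 2x2 joint distribution with correlated X and Y
print(entropies([[0.4, 0.1], [0.1, 0.4]]))
```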

[Other] xxlybmll

Description: Information theory and coding theory: the source entropy of single-symbol discrete sources. Section 1: the mathematical model of a source; Section 2: the self-information of source symbols; Section 3: the information entropy of a source; Section 4: algebraic properties of the information entropy ... the channel capacity of two-input, single-output channels; Section 2: the discrete two-user multiple-access channel.
Platform: | Size: 138240 | Author: doris | Hits:
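
For reference, the two definitions this material is built on are the self-information of a symbol and the source (information) entropy; in standard form (not quoted from the courseware):

```latex
I(x_i) = -\log_2 p(x_i), \qquad
H(X) = \sum_i p(x_i)\, I(x_i) = -\sum_i p(x_i) \log_2 p(x_i) \ \text{bits/symbol}.
```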

[Other] modernspectrumestimation

Description: He Zhenya's Modern Spectral Estimation. The book gives a comprehensive, systematic treatment of the theories and methods of modern spectral estimation in nine chapters: AR and ARMA parametric models for purely continuous spectra; sinusoid-combination and damped complex-exponential parametric models for purely discrete and mixed spectra; nonparametric estimation by the minimum-variance method and by singular-value/eigenvalue decomposition; entropy-based spectral estimation from information theory; polyspectral (higher-order spectral) estimation; and two-dimensional and multidimensional array spectral estimation. The material is broad, current, systematic, and ties theory to practice.
Platform: | Size: 6134784 | Author: songzy41 | Hits:
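
As a pointer to the parametric methods the book covers, the sketch below estimates an AR(p) spectrum from the Yule-Walker equations; for Gaussian data this coincides with the maximum-entropy spectral estimate. It is an illustrative Python reimplementation under those assumptions, not code from the book or this package.

```python
import numpy as np

def ar_psd_yule_walker(x, order, nfft=512):
    """AR(p) power spectral density estimated via the Yule-Walker equations."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    # Biased autocorrelation estimates r[0..order]
    r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(order + 1)])
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    a = np.linalg.solve(R, r[1:order + 1])        # AR coefficients a_1..a_p
    sigma2 = r[0] - np.dot(a, r[1:order + 1])     # innovation variance
    freqs = np.linspace(0.0, 0.5, nfft)           # cycles per sample
    k = np.arange(1, order + 1)
    denom = np.abs(1.0 - np.exp(-2j * np.pi * np.outer(freqs, k)) @ a) ** 2
    return freqs, sigma2 / denom

# Example: a noisy second-order autoregressive signal with a resonance near 0.10 cycles/sample
rng = np.random.default_rng(0)
x = np.zeros(1024)
for t in range(2, 1024):
    x[t] = 1.5 * x[t - 1] - 0.9 * x[t - 2] + rng.standard_normal()
f, psd = ar_psd_yule_walker(x, order=2)
print(f[np.argmax(psd)])
```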

[Documents] kolmogoroventropy

Description: A study of the Kolmogorov entropy parameter from chaos theory, applied to entropy analysis of communication traffic.
Platform: | Size: 33792 | Author: 张延中 | Hits:

[matlab] 2

Description: Information theory and coding: a MATLAB simulation program for an iterative information-entropy algorithm.
Platform: | Size: 2048 | Author: 陈子飞 | Hits:

[P2P] ANewTrustModelBasedonReputationandRiskEvaluation.r

Description: To address the inability of current large-scale P2P systems to handle attacks by malicious nodes, this paper proposes a new trust model for P2P systems based on reputation and risk evaluation. Because the dynamic behavior of nodes makes the trust computation uncertain, the model introduces a risk factor and uses information-entropy theory to quantify that risk, unifying the degree of trust between entities with the uncertainty of that trust. Simulation experiments and analysis show that the model can effectively identify malicious nodes, raises the rate of successful transactions considerably compared with existing trust models, and lets nodes establish trust relationships more effectively.
Platform: | Size: 97280 | Author: 刘峰 | Hits:
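
The abstract does not give the exact formula, but the central idea, quantifying risk as the information entropy of a node's uncertain behavior, can be illustrated with a deliberately simplified sketch. The outcome encoding and normalisation below are assumptions for illustration, not the paper's model.

```python
import numpy as np
from collections import Counter

def interaction_risk(outcomes):
    """Normalised entropy of a node's recent interaction outcomes as a risk score in [0, 1].

    `outcomes` is a hypothetical list of labels such as "success"/"failure"; a node whose
    behavior is unpredictable (outcomes near 50/50) scores close to 1.
    """
    counts = np.array(list(Counter(outcomes).values()), dtype=float)
    p = counts / counts.sum()
    h = -np.sum(p * np.log2(p))
    h_max = np.log2(len(counts)) if len(counts) > 1 else 1.0
    return h / h_max

print(interaction_risk(["success"] * 9 + ["failure"]))   # low risk: mostly consistent behavior
print(interaction_risk(["success", "failure"] * 5))      # high risk: maximally uncertain behavior
```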

[Program doc] shannon-report

Description: An English-language paper titled Claude Shannon and "A Mathematical Theory of Communication": a reading report on Shannon's paper A Mathematical Theory of Communication. It covers Shannon's life and background, messages, information, the entropy function, source coding, and channel coding.
Platform: | Size: 259072 | Author: yuan | Hits:

[Windows Develop] Probability_and_Information_An_Integrated_Approac

Description: This is an updated new edition of the popular elementary introduction to probability theory and information theory, now containing additional material on Markov chains and their entropy. Suitable as a textbook for beginning students in mathematics, statistics, computer science or economics, the only prerequisite is some knowledge of basic calculus.
Platform: | Size: 1320960 | Author: moatasem momtaz | Hits:
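
Since the material singled out in this edition is the entropy of Markov chains, here is a compact sketch of the entropy rate of an ergodic chain (an illustrative Python example, not taken from the book):

```python
import numpy as np

def entropy_rate(P):
    """Entropy rate H = -sum_i pi_i sum_j P_ij log2 P_ij of an ergodic Markov chain (bits/step)."""
    P = np.asarray(P, dtype=float)
    eigvals, eigvecs = np.linalg.eig(P.T)
    pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
    pi = pi / pi.sum()                                        # stationary distribution
    logP = np.where(P > 0, np.log2(np.where(P > 0, P, 1.0)), 0.0)
    return float(-np.sum(pi[:, None] * P * logP))

# Two-state chain that stays put with probability 0.9: rate = H_b(0.1) ~ 0.47 bits/step
print(entropy_rate([[0.9, 0.1], [0.1, 0.9]]))
```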

[Program doc] InformationTheoryAndCodingTechniques

Description: Information theory is a young science that grew out of the combination of communication technology, probability theory, stochastic processes, and mathematical statistics. Its founder is the American mathematician Claude Shannon, whose A Mathematical Theory of Communication laid its foundations. These are teaching slides for information theory and coding techniques; the archive contains six PPT files, one per chapter: 1 Introduction; 2 Sources and their entropy; 3 Channels and their capacity; 4 The information rate-distortion function; 5 Source coding; 6 Channel coding. Excellent material for beginners in information theory.
Platform: | Size: 3452928 | Author: yuan | Hits:

[Special Effects] MaximumShannoninformationentropysegement

Description: Image segmentation by maximum Shannon information entropy, a segmentation method based on probability and statistics that can, in principle, recover more detail in the segmentation. Ready to use once opened.
Platform: | Size: 31744 | Author: 南风吹 | Hits:
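
The package is MATLAB and its exact formulation is not reproduced here, but maximum-entropy (Kapur-style) thresholding can be sketched in a few lines of Python: choose the threshold that maximises the sum of the entropies of the background and foreground gray-level distributions.

```python
import numpy as np

def max_entropy_threshold(image):
    """Kapur-style maximum-entropy threshold for an 8-bit grayscale image."""
    hist = np.bincount(np.asarray(image, dtype=np.uint8).ravel(), minlength=256)
    p = hist / hist.sum()
    best_t, best_h = 0, -np.inf
    for t in range(1, 256):
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        p0, p1 = p[:t] / w0, p[t:] / w1            # class-conditional distributions
        h0 = -np.sum(p0[p0 > 0] * np.log(p0[p0 > 0]))
        h1 = -np.sum(p1[p1 > 0] * np.log(p1[p1 > 0]))
        if h0 + h1 > best_h:
            best_h, best_t = h0 + h1, t
    return best_t

# Usage: binary_mask = image >= max_entropy_threshold(image)
```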

[MacOS develop] Toolboxv1.0

Description: Joaquin Goñi <jgoni@unav.es> & Iñigo Martincorena <imartincore@alumni.unav.es>, University of Navarra - Dpt. of Physics and Applied Mathematics & Centre for Applied Medical Research, Pamplona (Spain), December 13th, 2007. Information Theory Toolbox v1.0. The Toolbox includes: entropy, conditional entropy, mutual information, redundancy, symmetric uncertainty, Kullback-Leibler divergence, and Jensen-Shannon divergence. Type help for each command to get a detailed description. Citation: if you use them for your academic research work, please kindly cite this toolbox as: Joaquin Goñi, Iñigo Martincorena. Information Theory Toolbox v1.0. University of Navarra - Dpt. of Physics and Applied Mathematics & Centre for Applied Medical Research. Pamplona (Spain).
Platform: | Size: 16384 | Author: le thanh tan | Hits:
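
Two of the toolbox's quantities, the Kullback-Leibler and Jensen-Shannon divergences, reduce to a few lines; the Python below is an illustrative reimplementation, not the toolbox's MATLAB code:

```python
import numpy as np

def kl_divergence(p, q):
    """D_KL(p || q) in bits; assumes q > 0 wherever p > 0."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

def js_divergence(p, q):
    """Jensen-Shannon divergence: symmetric and bounded above by 1 bit."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

print(js_divergence([0.5, 0.5], [0.9, 0.1]))   # ~0.15 bits
```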

[Algorithm] Entropyweight

Description: Uses the information-entropy principle to determine the weight coefficient of each criterion from the degree of disorder in the data.
Platform: | Size: 1024 | Author: yxc | Hits:
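
The entropy weight method behind this package is standard: criteria whose values are spread unevenly across alternatives carry more information and therefore get larger weights. A minimal Python sketch (assuming a non-negative decision matrix already normalised to comparable scales), not the uploaded code:

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight method for an (alternatives x criteria) data matrix."""
    X = np.asarray(X, dtype=float)
    P = X / X.sum(axis=0, keepdims=True)              # column-wise proportions
    n = X.shape[0]
    logP = np.where(P > 0, np.log(np.where(P > 0, P, 1.0)), 0.0)
    E = -np.sum(P * logP, axis=0) / np.log(n)         # normalised entropy per criterion
    d = 1.0 - E                                       # degree of divergence ("disorder")
    return d / d.sum()                                # weights summing to 1

# A criterion with identical scores gets zero weight; the uneven one gets all of it
print(entropy_weights([[10, 1], [10, 5], [10, 9]]))   # -> [0. 1.]
```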

[Other] Entropy-basedAlgorithmforDiscretizationofContinuou

Description: This article describes entropy-based discretization algorithms for continuous attributes in data mining.
Platform: | Size: 180224 | Author: gfshycm | Hits:
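
The article's exact algorithm is not reproduced here, but entropy-based discretization generally chooses cut points that minimise the weighted class entropy of the resulting intervals. A single-split Python sketch of that idea:

```python
import numpy as np

def class_entropy(labels):
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def best_cut_point(values, labels):
    """Cut point on a continuous attribute that minimises the weighted class entropy."""
    values, labels = np.asarray(values, dtype=float), np.asarray(labels)
    order = np.argsort(values)
    values, labels = values[order], labels[order]
    best_cut, best_e = None, np.inf
    for i in range(1, len(values)):
        if values[i] == values[i - 1]:
            continue
        cut = 0.5 * (values[i] + values[i - 1])       # midpoint between adjacent values
        left, right = labels[:i], labels[i:]
        e = (len(left) * class_entropy(left) + len(right) * class_entropy(right)) / len(labels)
        if e < best_e:
            best_cut, best_e = cut, e
    return best_cut

print(best_cut_point([1.0, 2.0, 3.0, 8.0, 9.0], ["a", "a", "a", "b", "b"]))   # -> 5.5
```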

[Special Effects] entropy

Description: The Shannon entropy is a measure of the average information content one is missing when one does not know the value of the random variable. The concept was introduced by Claude E. Shannon in his 1948 paper "A Mathematical Theory of Communication". Shannon's entropy represents an absolute limit on the best possible lossless compression of any communication, under certain constraints: treating messages to be encoded as a sequence of independent and identically distributed random variables, Shannon's source coding theorem shows that, in the limit, the average length of the shortest possible representation to encode the messages in a given alphabet is their entropy divided by the logarithm of the number of symbols in the target alphabet.
Platform: | Size: 2048 | Author: jeremy | Hits:
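
Stated as a formula matching the description above: for independent, identically distributed symbols, the expected codeword length L of any uniquely decodable code over a target alphabet of D symbols satisfies

```latex
L \;\ge\; \frac{H(X)}{\log_2 D}, \qquad H(X) = -\sum_i p(x_i)\log_2 p(x_i),
```

and this bound is approachable in the limit of long source blocks.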

[matlab] filter1

Description: High-pass, band-pass, and digital FIR filters.
Platform: | Size: 1024 | Author: jeremy | Hits:

[Other] shiyanbaogao

Description: MATLAB implementations of information-theory experiments: an iterative algorithm for the capacity of a general channel, plotting the binary entropy function curve, and related exercises.
Platform: | Size: 65536 | Author: zzh | Hits:
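
The standard iterative algorithm for the capacity of a general discrete memoryless channel is the Blahut-Arimoto iteration. The package itself is MATLAB; this Python sketch is an illustrative reimplementation:

```python
import numpy as np

def mutual_information(p, W):
    """I(X;Y) in bits for input pmf p and channel matrix W[x, y] = P(y|x)."""
    py = p @ W                                           # output distribution
    ratio = np.where((W > 0) & (py > 0), W / py, 1.0)
    return float(np.sum(p[:, None] * W * np.log2(ratio)))

def channel_capacity(W, iters=500):
    """Blahut-Arimoto iteration: returns (capacity in bits, capacity-achieving input pmf)."""
    W = np.asarray(W, dtype=float)
    p = np.full(W.shape[0], 1.0 / W.shape[0])            # start from the uniform input pmf
    for _ in range(iters):
        py = p @ W
        ratio = np.where((W > 0) & (py > 0), W / py, 1.0)
        d = np.sum(W * np.log(ratio), axis=1)            # D(W(.|x) || p_Y) for each input x
        p = p * np.exp(d)
        p = p / p.sum()                                  # re-normalise the input pmf
    return mutual_information(p, W), p

# Binary symmetric channel with crossover 0.1: capacity = 1 - H_b(0.1) ~ 0.531 bits
cap, p_opt = channel_capacity([[0.9, 0.1], [0.1, 0.9]])
print(round(cap, 3), p_opt)
```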

[OS program] entrop

Description: The formula for computing entropy in information theory; practical and, hopefully, helpful.
Platform: | Size: 1024 | Author: 宋志斌 | Hits:

[Special Effects] the-Renyi-entropy-theory

Description: Jianwu, Shen Xuanjing, and colleagues make full use of spatial neighborhood information in the image, introduce a mean-median-gradient co-occurrence matrix model, and, combining it with Rényi-entropy theory, propose a three-dimensional Rényi-entropy threshold segmentation algorithm that incorporates texture information, together with a fast recursive formula for the method.
Platform: | Size: 897024 | Author: 朱磊 | Hits:
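
For reference, the Rényi entropy of order α that these thresholding methods maximise is (standard definition, not quoted from the paper):

```latex
H_\alpha(X) = \frac{1}{1-\alpha}\,\log_2\!\Big(\sum_i p_i^{\alpha}\Big),
\qquad \alpha > 0,\ \alpha \neq 1,
```

which recovers the Shannon entropy in the limit α → 1.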

[Special Effects] 3D-Renyi-entropy-theory

Description: Wei Wei and colleagues propose a maximum Rényi-entropy threshold segmentation algorithm based on a three-dimensional histogram of gray level, neighborhood mean, and neighborhood weighted median. The method improves noticeably in both accuracy and noise immunity.
Platform: | Size: 653312 | Author: 朱磊 | Hits:
