
Pattern Recognition (3rd Edition, English) / Classic Original Edition Series

Editorial Recommendation

Pattern recognition is crucial in all automation, information processing, and retrieval applications. Written by two leading experts in the field, this book gives a comprehensive account of pattern recognition from an engineering perspective, covering applications from image analysis to speech recognition and communications. It includes cutting-edge material on neural networks and emphasizes the latest developments, including independent component analysis and support vector machines. A world-renowned classic refined over more than a decade, it has become the most comprehensive reference in the field and has been adopted as a textbook by many universities worldwide. Beyond classroom use, it also serves as a reference for practicing engineers.

Table of Contents

Preface

CHAPTER 1 INTRODUCTION

1.1 Is Pattern Recognition Important?

1.2 Features, Feature Vectors, and Classifiers

1.3 Supervised Versus Unsupervised Pattern Recognition

1.4 Outline of the Book

CHAPTER 2 CLASSIFIERS BASED ON BAYES DECISION THEORY

2.1 Introduction

2.2 Bayes Decision Theory

2.3 Discriminant Functions and Decision Surfaces

2.4 Bayesian Classification for Normal Distributions

2.5 Estimation of Unknown Probability Density Functions

2.5.1 Maximum Likelihood Parameter Estimation

2.5.2 Maximum a Posteriori Probability Estimation

2.5.3 Bayesian Inference

2.5.4 Maximum Entropy Estimation

2.5.5 Mixture Models

2.5.6 Nonparametric Estimation

2.6 The Nearest Neighbor Rule

CHAPTER 3 LINEAR CLASSIFIERS

3.1 Introduction

3.2 Linear Discriminant Functions and Decision Hyperplanes

3.3 The Perceptron Algorithm

3.4 Least Squares Methods

3.4.1 Mean Square Error Estimation

3.4.2 Stochastic Approximation and the LMS Algorithm

3.4.3 Sum of Error Squares Estimation

3.5 Mean Square Estimation Revisited

3.5.1 Mean Square Error Regression

3.5.2 MSE Estimates Posterior Class Probabilities

3.5.3 The Bias-Variance Dilemma

3.6 Support Vector Machines

3.6.1 Separable Classes

3.6.2 Nonseparable Classes

CHAPTER 4 NONLINEAR CLASSIFIERS

4.1 Introduction

4.2 The XOR Problem

4.3 The Two-Layer Perceptron

4.3.1 Classification Capabilities of the Two-Layer Perceptron

4.4 Three-Layer Perceptrons

4.5 Algorithms Based on Exact Classification of the Training Set

4.6 The Backpropagation Algorithm

4.7 Variations on the Backpropagation Theme

4.8 The Cost Function Choice

4.9 Choice of the Network Size

4.10 A Simulation Example

4.11 Networks With Weight Sharing

4.12 Generalized Linear Classifiers

4.13 Capacity of the l-Dimensional Space in Linear Dichotomies

4.14 Polynomial Classifiers

4.15 Radial Basis Function Networks

4.16 Universal Approximators

4.17 Support Vector Machines: The Nonlinear Case

4.18 Decision Trees

4.18.1 Set of Questions

4.18.2 Splitting Criterion

4.18.3 Stop-Splitting Rule

4.18.4 Class Assignment Rule

4.19 Discussion

CHAPTER 5 FEATURE SELECTION

5.1 Introduction

5.2 Preprocessing

5.2.1 Outlier Removal

5.2.2 Data Normalization

5.2.3 Missing Data

5.3 Feature Selection Based on Statistical Hypothesis Testing

5.3.1 Hypothesis Testing Basics

5.3.2 Application of the t-Test in Feature Selection

5.4 The Receiver Operating Characteristics (ROC) Curve

5.5 Class Separability Measures

5.5.1 Divergence

5.5.2 Chernoff Bound and Bhattacharyya Distance

5.5.3 Scatter Matrices

5.6 Feature Subset Selection

5.6.1 Scalar Feature Selection

5.6.2 Feature Vector Selection

5.7 Optimal Feature Generation

5.8 Neural Networks and Feature Generation/Selection

5.9 A Hint on the Vapnik-Chervonenkis Learning Theory

CHAPTER 6 FEATURE GENERATION I: LINEAR TRANSFORMS

6.1 Introduction

6.2 Basis Vectors and Images

6.3 The Karhunen-Loeve Transform

6.4 The Singular Value Decomposition

6.5 Independent Component Analysis

6.5.1 ICA Based on Second- and Fourth-Order Cumulants

6.5.2 ICA Based on Mutual Information

6.5.3 An ICA Simulation Example

6.6 The Discrete Fourier Transform (DFT)

6.6.1 One-Dimensional DFT

6.6.2 Two-Dimensional DFT

6.7 The Discrete Cosine and Sine Transforms

6.8 The Hadamard Transform

6.9 The Haar Transform

6.10 The Haar Expansion Revisited

6.11 Discrete Time Wavelet Transform (DTWT)

6.12 The Multiresolution Interpretation

6.13 Wavelet Packets

6.14 A Look at Two-Dimensional Generalizations

6.15 Applications

CHAPTER 7 FEATURE GENERATION II

7.1 Introduction

7.2 Regional Features

7.2.1 Features for Texture Characterization

7.2.2 Local Linear Transforms for Texture Feature Extraction

7.2.3 Moments

7.2.4 Parametric Models

7.3 Features for Shape and Size Characterization

7.3.1 Fourier Features

7.3.2 Chain Codes

7.3.3 Moment-Based Features

7.3.4 Geometric Features

7.4 A Glimpse at Fractals

7.4.1 Self-Similarity and Fractal Dimension

7.4.2 Fractional Brownian Motion

CHAPTER 8 TEMPLATE MATCHING

8.1 Introduction

8.2 Measures Based on Optimal Path Searching Techniques

8.2.1 Bellman's Optimality Principle and Dynamic Programming

8.2.2 The Edit Distance

8.2.3 Dynamic Time Warping in Speech Recognition

8.3 Measures Based on Correlations

8.4 Deformable Template Models

CHAPTER 9 CONTEXT-DEPENDENT CLASSIFICATION

9.1 Introduction

9.2 The Bayes Classifier

9.3 Markov Chain Models

9.4 The Viterbi Algorithm

9.5 Channel Equalization

9.6 Hidden Markov Models

9.7 Training Markov Models via Neural Networks

9.8 A Discussion of Markov Random Fields

CHAPTER 10 SYSTEM EVALUATION

10.1 Introduction

10.2 Error Counting Approach

10.3 Exploiting the Finite Size of the Data Set

10.4 A Case Study From Medical Imaging

CHAPTER 11 CLUSTERING: BASIC CONCEPTS

11.1 Introduction

11.1.1 Applications of Cluster Analysis

11.1.2 Types of Features

11.1.3 Definitions of Clustering

11.2 Proximity Measures

11.2.1 Definitions

11.2.2 Proximity Measures between Two Points

11.2.3 Proximity Functions between a Point and a Set

11.2.4 Proximity Functions between Two Sets

CHAPTER 12 CLUSTERING ALGORITHMS I: SEQUENTIAL ALGORITHMS

12.1 Introduction

12.1.1 Number of Possible Clusterings

12.2 Categories of Clustering Algorithms

12.3 Sequential Clustering Algorithms

12.3.1 Estimation of the Number of Clusters

12.4 A Modification of BSAS

12.5 A Two-Threshold Sequential Scheme

12.6 Refinement Stages

12.7 Neural Network Implementation

12.7.1 Description of the Architecture

12.7.2 Implementation of the BSAS Algorithm

CHAPTER 13 CLUSTERING ALGORITHMS II: HIERARCHICAL ALGORITHMS

13.1 Introduction

13.2 Agglomerative Algorithms

13.2.1 Definition of Some Useful Quantities

13.2.2 Agglomerative Algorithms Based on Matrix Theory

13.2.3 Monotonicity and Crossover

13.2.4 Implementational Issues

13.2.5 Agglomerative Algorithms Based on Graph Theory

13.2.6 Ties in the Proximity Matrix

13.3 The Cophenetic Matrix

13.4 Divisive Algorithms

13.5 Choice of the Best Number of Clusters

CHAPTER 14 CLUSTERING ALGORITHMS III: SCHEMES BASED ON FUNCTION OPTIMIZATION

14.1 Introduction

14.2 Mixture Decomposition Schemes

14.2.1 Compact and Hyperellipsoidal Clusters

14.2.2 A Geometrical Interpretation

14.3 Fuzzy Clustering Algorithms

14.3.1 Point Representatives

14.3.2 Quadric Surfaces as Representatives

14.3.3 Hyperplane Representatives

14.3.4 Combining Quadric and Hyperplane Representatives

14.3.5 A Geometrical Interpretation

14.3.6 Convergence Aspects of the Fuzzy Clustering Algorithms

14.3.7 Alternating Cluster Estimation

14.4 Possibilistic Clustering

14.4.1 The Mode-Seeking Property

14.4.2 An Alternative Possibilistic Scheme

14.5 Hard Clustering Algorithms

14.5.1 The Isodata or k-Means or c-Means Algorithm

14.6 Vector Quantization

CHAPTER 15 CLUSTERING ALGORITHMS IV

15.1 Introduction

15.2 Clustering Algorithms Based on Graph Theory

15.2.1 Minimum Spanning Tree Algorithms

15.2.2 Algorithms Based on Regions of Influence

15.2.3 Algorithms Based on Directed Trees

15.3 Competitive Learning Algorithms

15.3.1 Basic Competitive Learning Algorithm

15.3.2 Leaky Learning Algorithm

15.3.3 Conscientious Competitive Learning Algorithms

15.3.4 Competitive Learning-Like Algorithms Associated with Cost Functions

15.3.5 Self-Organizing Maps

15.3.6 Supervised Learning Vector Quantization

15.4 Branch and Bound Clustering Algorithms

15.5 Binary Morphology Clustering Algorithms (BMCAs)

15.5.1 Discretization

15.5.2 Morphological Operations

15.5.3 Determination of the Clusters in a Discrete Binary Set

15.5.4 Assignment of Feature Vectors to Clusters

15.5.5 The Algorithmic Scheme

15.6 Boundary Detection Algorithms

15.7 Valley-Seeking Clustering Algorithms

15.8 Clustering Via Cost Optimization (Revisited)

15.8.1 Simulated Annealing

15.8.2 Deterministic Annealing

15.9 Clustering Using Genetic Algorithms

15.10 Other Clustering Algorithms

CHAPTER 16 CLUSTER VALIDITY

16.1 Introduction

16.2 Hypothesis Testing Revisited

16.3 Hypothesis Testing in Cluster Validity

16.3.1 External Criteria

16.3.2 Internal Criteria

16.4 Relative Criteria

16.4.1 Hard Clustering

16.4.2 Fuzzy Clustering

16.5 Validity of Individual Clusters

16.5.1 External Criteria

16.5.2 Internal Criteria

16.6 Clustering Tendency

16.6.1 Tests for Spatial Randomness

Appendix A

Hints from Probability and Statistics

Appendix B

Linear Algebra Basics

Appendix C

Cost Function Optimization

Appendix D

Basic Definitions from Linear Systems Theory

Index

Title: Pattern Recognition (3rd Edition, English) / Classic Original Edition Series
Author: Theodoridis (Greece)
Publisher: China Machine Press
ISBN: 9787111197676
Trim Size: 16开 (16mo)
Pages: 837
Edition: 1
Binding: Paperback
Publication Date: 2006-09-01
First Published: 2006-09-01
Printing Date: 2006-09-01
Intended Readers: Young adults (14-20), researchers, general adult readers
Distribution: Public release
Distribution Format: Print book
Category: Computers - Operating Systems
Weight: 1.052 kg
CLC Number: TP391.4
Printed Sheets: 53.5
Impression: 1
Place of Publication: Beijing
Dimensions: 241 × 169 × 30 (mm)
Medium: Print book
Paper: Ordinary paper
Reprint: Photoreprint of the original edition
Publisher Country: CN
Set: Single volume
Copyright Contract Registration No.: 图字 01-2006-3123
Rights Provider: Elsevier (Singapore)