Latent Semantic Mapping : Principles and Applications
Record Type:
Electronic resources : Monograph/item
Title/Author:
Latent Semantic Mapping / by Jerome R. Bellegarda.
Remainder of title:
Principles and Applications /
Author:
Bellegarda, Jerome R.
Description:
X, 101 p. online resource.
Contained By:
Springer Nature eBook
Subject:
Electrical engineering. - Signal processing. - Acoustical engineering.
Online resource:
Fulltext (view e-book full text)
ISBN:
9783031025563
Bellegarda, Jerome R.
Latent Semantic Mapping [electronic resource] : Principles and Applications / by Jerome R. Bellegarda. - 1st ed. 2007. - X, 101 p. online resource. - (Synthesis Lectures on Speech and Audio Processing, 1932-1678).
Contents: I. Principles -- Introduction -- Latent Semantic Mapping -- LSM Feature Space -- Computational Effort -- Probabilistic Extensions -- II. Applications -- Junk E-mail Filtering -- Semantic Classification -- Language Modeling -- Pronunciation Modeling -- Speaker Verification -- TTS Unit Selection -- III. Perspectives -- Discussion -- Conclusion -- Bibliography.
Latent semantic mapping (LSM) is a generalization of latent semantic analysis (LSA), a paradigm originally developed to capture hidden word patterns in a text document corpus. In information retrieval, LSA enables retrieval on the basis of conceptual content, instead of merely matching words between queries and documents. It operates under the assumption that there is some latent semantic structure in the data, which is partially obscured by the randomness of word choice with respect to retrieval. Algebraic and/or statistical techniques are brought to bear to estimate this structure and get rid of the obscuring "noise." This results in a parsimonious continuous parameter description of words and documents, which then replaces the original parameterization in indexing and retrieval. This approach exhibits three main characteristics: (1) discrete entities (words and documents) are mapped onto a continuous vector space; (2) this mapping is determined by global correlation patterns; and (3) dimensionality reduction is an integral part of the process. Such fairly generic properties are advantageous in a variety of different contexts, which motivates a broader interpretation of the underlying paradigm. The outcome (LSM) is a data-driven framework for modeling meaningful global relationships implicit in large volumes of (not necessarily textual) data. This monograph gives a general overview of the framework, and underscores the multifaceted benefits it can bring to a number of problems in natural language understanding and spoken language processing. It concludes with a discussion of the inherent tradeoffs associated with the approach, and some perspectives on its general applicability to data-driven information extraction. Contents: I. Principles / Introduction / Latent Semantic Mapping / LSM Feature Space / Computational Effort / Probabilistic Extensions / II. Applications / Junk E-mail Filtering / Semantic Classification / Language Modeling / Pronunciation Modeling / Speaker Verification / TTS Unit Selection / III. Perspectives / Discussion / Conclusion / Bibliography.
ISBN: 9783031025563
Standard No.: 10.1007/978-3-031-02556-3 doi
Subjects--Topical Terms:
Electrical engineering.
LC Class. No.: TK1-9971
Dewey Class. No.: 621.3
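The summary above describes LSM as mapping discrete entities (words and documents) into a continuous vector space determined by global correlation patterns, with dimensionality reduction as an integral step. The following is a minimal, illustrative Python sketch of that general idea, not the procedure from the book: it uses a toy corpus, raw co-occurrence counts rather than any particular weighting scheme, a plain truncated SVD, and two latent dimensions, all of which are assumptions chosen only for illustration (numpy is assumed available).

import numpy as np

# Toy corpus: an illustrative assumption, not data from the monograph.
docs = [
    "signal processing for speech",
    "speech and audio processing",
    "junk email filtering",
    "email spam classification",
]

# Word-document co-occurrence matrix W: rows index words, columns index documents.
vocab = sorted({w for d in docs for w in d.split()})
W = np.zeros((len(vocab), len(docs)))
for j, d in enumerate(docs):
    for w in d.split():
        W[vocab.index(w), j] += 1.0

# The singular value decomposition exposes the global correlation structure.
U, s, Vt = np.linalg.svd(W, full_matrices=False)

# Keep only the top-k singular directions: the dimensionality reduction that
# discards the "noise" of individual word choice.
k = 2
word_vecs = U[:, :k] * s[:k]    # each word becomes a point in the reduced space
doc_vecs = Vt[:k, :].T * s[:k]  # each document is mapped into the same space

# Conceptual similarity is measured by closeness in the reduced space,
# rather than by literal word overlap.
def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine(doc_vecs[2], doc_vecs[3]))  # the two email documents: high similarity
print(cosine(doc_vecs[0], doc_vecs[2]))  # unrelated topics: near zero

In the same spirit, replacing the word-document matrix with counts of other discrete units against other groupings yields the broader, not-necessarily-textual applications listed in the contents note.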
LDR
03778nmm a22003735i 4500
001
349685
003
DE-He213
005
20220601151017.0
007
cr nn 008mamaa
008
230512s2007 sz | s |||| 0|eng d
020
$a
9783031025563
$9
978-3-031-02556-3
024
7
$a
10.1007/978-3-031-02556-3
$2
doi
035
$a
978-3-031-02556-3
050
4
$a
TK1-9971
072
7
$a
THR
$2
bicssc
072
7
$a
TEC007000
$2
bisacsh
072
7
$a
THR
$2
thema
082
0 4
$a
621.3
$2
23
100
1
$a
Bellegarda, Jerome R.
$e
author.
$4
aut
$4
http://id.loc.gov/vocabulary/relators/aut
$3
424104
245
1 0
$a
Latent Semantic Mapping
$h
[electronic resource] :
$b
Principles and Applications /
$c
by Jerome R. Bellegarda.
250
$a
1st ed. 2007.
264
1
$a
Cham :
$b
Springer International Publishing :
$b
Imprint: Springer,
$c
2007.
300
$a
X, 101 p.
$b
online resource.
336
$a
text
$b
txt
$2
rdacontent
337
$a
computer
$b
c
$2
rdamedia
338
$a
online resource
$b
cr
$2
rdacarrier
347
$a
text file
$b
PDF
$2
rda
490
1
$a
Synthesis Lectures on Speech and Audio Processing,
$x
1932-1678
505
0
$a
Contents: I. Principles -- Introduction -- Latent Semantic Mapping -- LSM Feature Space -- Computational Effort -- Probabilistic Extensions -- II. Applications -- Junk E-mail Filtering -- Semantic Classification -- Language Modeling -- Pronunciation Modeling -- Speaker Verification -- TTS Unit Selection -- III. Perspectives -- Discussion -- Conclusion -- Bibliography.
520
$a
Latent semantic mapping (LSM) is a generalization of latent semantic analysis (LSA), a paradigm originally developed to capture hidden word patterns in a text document corpus. In information retrieval, LSA enables retrieval on the basis of conceptual content, instead of merely matching words between queries and documents. It operates under the assumption that there is some latent semantic structure in the data, which is partially obscured by the randomness of word choice with respect to retrieval. Algebraic and/or statistical techniques are brought to bear to estimate this structure and get rid of the obscuring "noise." This results in a parsimonious continuous parameter description of words and documents, which then replaces the original parameterization in indexing and retrieval. This approach exhibits three main characteristics: (1) discrete entities (words and documents) are mapped onto a continuous vector space; (2) this mapping is determined by global correlation patterns; and (3) dimensionality reduction is an integral part of the process. Such fairly generic properties are advantageous in a variety of different contexts, which motivates a broader interpretation of the underlying paradigm. The outcome (LSM) is a data-driven framework for modeling meaningful global relationships implicit in large volumes of (not necessarily textual) data. This monograph gives a general overview of the framework, and underscores the multifaceted benefits it can bring to a number of problems in natural language understanding and spoken language processing. It concludes with a discussion of the inherent tradeoffs associated with the approach, and some perspectives on its general applicability to data-driven information extraction. Contents: I. Principles / Introduction / Latent Semantic Mapping / LSM Feature Space / Computational Effort / Probabilistic Extensions / II. Applications / Junk E-mail Filtering / Semantic Classification / Language Modeling / Pronunciation Modeling / Speaker Verification / TTS Unit Selection / III. Perspectives / Discussion / Conclusion / Bibliography.
650
0
$a
Electrical engineering.
$3
423914
650
0
$a
Signal processing.
$3
423975
650
0
$a
Acoustical engineering.
$3
424101
650
1 4
$a
Electrical and Electronic Engineering.
$3
423916
650
2 4
$a
Signal, Speech and Image Processing.
$3
423976
650
2 4
$a
Engineering Acoustics.
$3
424102
710
2
$a
SpringerLink (Online service)
$3
423502
773
0
$t
Springer Nature eBook
776
0 8
$i
Printed edition:
$z
9783031014284
776
0 8
$i
Printed edition:
$z
9783031036842
830
0
$a
Synthesis Lectures on Speech and Audio Processing,
$x
1932-1678
$3
424100
856
4 0
$u
https://doi.org/10.1007/978-3-031-02556-3
$z
Fulltext (view e-book full text)
912
$a
ZDB-2-SXSC
950
$a
Synthesis Collection of Technology (R0) (SpringerNature-85007)