讀書網-DuShu.com

Machine Translation Based on Artificial Neural Networks (基于人工神經網絡的機器翻譯)

Price: ¥25.00

Author: 許羅邁
Publisher: Science Press (科學出版社)
Series: (none listed)
Subject tag: Fundamental theory of automation

ISBN: 9787030189813  Publication date: 2007-06-01  Binding: paperback
Format: 0  Pages: 216  Word count:

內(nèi)容簡介

  Machine Translation Based on Artificial Neural Networks starts from the corpus-based statistical approach to machine translation, which divides the task into two processes: a translation model and a language model. The author attempts to apply artificial neural network techniques to both models, so that they cover the full machine translation pipeline — a creative piece of work. Using a neuron self-learning method, the system starts from a small number of examples and, through self-learning, builds up a machine lexicon and the corresponding translations. The experiments reported in the book show that, for a well-defined domain, the system can produce reasonably fluent target-language output. Training the translation model with a distributed system of neural networks largely overcomes the limited learning capacity of a single network, and opens a new line of thinking for neural-network language processing, which is of considerable significance. For the language model, the author also proposes a new solution: departing from the common practice of training neural networks on complex syntactic and semantic features, the method takes part-of-speech tags as the training input and a self-devised set of word-movement symbols as the training target — a distinctive approach. Although the author notes that this method did not achieve the expected results, if the distributed neural network architecture were also applied to training the language model, as the author suggests, whether this distinctive approach can succeed remains an open question.
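The statistical decomposition mentioned above can be sketched in a few lines. This is a toy illustration of the general noisy-channel scheme (not the book's neural system): a candidate translation e of a source sentence f is scored as log P(f|e) + log P(e), where the translation model measures adequacy and the language model measures fluency. All probabilities below are made-up numbers for demonstration only.

```python
import math

# Translation model P(f | e): how well each candidate preserves the source.
# (Hypothetical values; in a real system these come from aligned corpora.)
translation_model = {
    "the cat sat": 0.30,
    "cat the sat": 0.30,   # equally "adequate", but not fluent English
    "the cat sits": 0.20,
}

# Language model P(e): how fluent each candidate is as target-language text.
language_model = {
    "the cat sat": 0.40,
    "cat the sat": 0.01,
    "the cat sits": 0.35,
}

def score(e: str) -> float:
    """Combined log-probability of candidate translation e."""
    return math.log(translation_model[e]) + math.log(language_model[e])

# Decoding = picking the candidate that maximizes the combined score.
best = max(translation_model, key=score)
print(best)  # -> the cat sat
```

The language model is what rules out "cat the sat" despite its high translation-model score, which is exactly why the book treats the two models as separate processing stages.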

About the Author

  No author biography is available for Machine Translation Based on Artificial Neural Networks.

Table of Contents

Preface
Acknowledgements
Chapter One Prologue
Chapter Two MT state of the art
 2.1 MT as symbolic systems
 2.2 Practical MT
 2.3 Alternative technique of MT
  2.3.1 Theoretical foundation
  2.3.2 Translation model
  2.3.3 Language model
 2.4 Discussion
Chapter Three Connectionist solutions
 3.1 NLP models
 3.2 Representation
 3.3 Phonological processing
 3.4 Learning verb past tense
 3.5 Part of speech tagging
 3.6 Chinese collocation learning
 3.7 Syntactic parsing
  3.7.1 Learning active/passive transformation
  3.7.2 Confluent preorder parsing
  3.7.3 Parsing with flat structures
  3.7.4 Parsing embedded clauses
  3.7.5 Parsing with deeper structures
 3.8 Discourse analysis
  3.8.1 Story gestalt and text understanding
  3.8.2 Processing stories with scriptural knowledge
 3.9 Machine translation
 3.10 Conclusion
Chapter Four NeuroTrans design considerations
 4.1 Scalability and extensibility
 4.2 Transfer or interlingual
 4.3 Hybrid or fully connectionist
 4.4 The use of linguistic knowledge
 4.5 Translation as a two-stage process
 4.6 Selection of network models
 4.7 Connectionist implementation
 4.8 Connectionist representation issues
 4.9 Conclusion
Chapter Five A neural lexicon model
 5.1 Language data
 5.2 Knowledge representation
  5.2.1 Symbolic approach
  5.2.2 The statistical approach
  5.2.3 Connectionist approach
  5.2.4 NeuroTrans' input/output representation
  5.2.5 NeuroTrans' lexicon representation
 5.3 Implementing the neural lexicon
  5.3.1 Words in context
  5.3.2 Context with weights
  5.3.3 Details of algorithm
  5.3.4 The Neural Lexicon Builder
 5.4 Training
  5.4.1 Sample preparation
  5.4.2 Training results
  5.4.3 Generalization test
 5.5 Discussion
  5.5.1 Adequacy
  5.5.2 Scalability and Extensibility
  5.5.3 Efficiency
  5.5.4 Weaknesses
Chapter Six Implementing the language model
 6.1 Overview
 6.2 Design
  6.2.1 Redefining the generation problem
  6.2.2 Defining jumble activity
  6.2.3 Language model structure
 6.3 Implementation
  6.3.1 Network structure, sampling, training and results
  6.3.2 Generalization test
 6.4 Discussion
  6.4.1 Insufficient data
  6.4.2 Information richness
  6.4.3 Insufficient contextual information
  6.4.4 Distributed language model
Chapter Seven Conclusion
Chapter Eight References
Index
