
 A Talking Robot and Its Real-time Interactive Modification for Speech Clarification
Authors: Thanh VO NHU and Hideyuki SAWADA
Published in: SICE Journal of Control, Measurement, and System Integration (IET Inspec indexed); Vol. 10; Pages: 251-256; Year: 2016
Field: Science and technology; Type: Scientific article; Category: International
ABSTRACT
The authors are developing a talking robot based on a physical model of the human vocal organs in order to reproduce human speech mechanically. This study focuses on developing a real-time interface to control and visualize the talking behavior. The talking robot has been trained using a self-organizing map (SOM) to reproduce human sounds; however, due to the nonlinear characteristics of sound dynamics, automatic generation of human-like expressive speech is difficult. It is important to visualize the robot's performance and manually adjust the motions of the artificial vocal system to obtain a better result, especially when it learns to vocalize a new language. Therefore, a real-time interactive control for the talking robot is designed and developed to fulfill this task. A novel formula describing the formant frequency change caused by vocal tract motor movements is derived from acoustic resonance theory. The first part of the paper briefly describes the construction of the talking robot, followed by the real-time interaction system using a MATLAB Graphical User Interface (GUI), together with the strategy for interactively modifying the speech articulation based on formant frequency comparison.
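The SOM-based sound learning mentioned in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the grid size, learning schedule, and toy formant vectors (F1, F2, F3) below are all illustrative assumptions. It shows only the general idea of a self-organizing map pulling its nodes toward formant-frequency feature vectors:

```python
import numpy as np

# Minimal self-organizing map (SOM) sketch: maps 3-D formant-frequency
# feature vectors (F1, F2, F3 in Hz) onto a small 2-D grid of nodes.
# All dimensions and parameters here are illustrative, not from the paper.
rng = np.random.default_rng(0)
GRID_H, GRID_W, DIM = 5, 5, 3

def bmu(weights, x):
    """Best-matching unit: grid index of the node closest to x."""
    d = np.linalg.norm(weights - x, axis=2)
    return np.unravel_index(np.argmin(d), d.shape)

def train(weights, samples, epochs=30, lr0=0.5, radius0=2.0):
    """Classic SOM update: pull the BMU and its Gaussian neighborhood
    toward each input sample while learning rate and radius shrink."""
    ii, jj = np.indices(weights.shape[:2])
    for epoch in range(epochs):
        frac = 1.0 - epoch / epochs
        lr, radius = lr0 * frac, max(radius0 * frac, 0.5)
        for x in samples:
            bi, bj = bmu(weights, x)
            dist2 = (ii - bi) ** 2 + (jj - bj) ** 2
            h = np.exp(-dist2 / (2.0 * radius ** 2))[:, :, None]
            weights += lr * h * (x - weights)

def quantization_error(weights, samples):
    """Mean distance from each sample to its BMU weight vector."""
    return float(np.mean([np.linalg.norm(weights[bmu(weights, x)] - x)
                          for x in samples]))

# Toy formant vectors loosely resembling the vowels /a/, /i/, /u/
samples = np.array([
    [730.0, 1090.0, 2440.0],   # /a/
    [270.0, 2290.0, 3010.0],   # /i/
    [300.0,  870.0, 2240.0],   # /u/
])

weights = rng.uniform(200.0, 3000.0, size=(GRID_H, GRID_W, DIM))
err_before = quantization_error(weights, samples)
train(weights, samples)
err_after = quantization_error(weights, samples)
print(f"quantization error: {err_before:.1f} -> {err_after:.1f}")
```

In the paper's setting, such a map would associate acoustic features with motor commands for the artificial vocal tract; the interactive GUI described above is what lets an operator correct the cases the learned mapping gets wrong.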