

3314 Signals and Systems

  • Course Time: Mon 09:10-12:00

  • Classroom: AT338

  • Course Outlines: This course covers fundamental concepts in signals and systems, which are broadly used for modeling, analyzing, and designing physical processes. Both continuous-time and discrete-time aspects are considered. The course can be viewed as a prerequisite for advanced courses such as linear systems, communication systems, and digital signal processing. The course is outlined as follows:

    • 1. Introduction to signals and systems;

    • 2. Linear time-invariant systems;

    • 3. Fourier series representation of periodic signals;

    • 4. Fourier analysis for continuous-time signals and systems;

    • 5. Fourier analysis for discrete-time signals and systems;

    • 6. Sampling;

    • 7. Filtering;

    • 8. Laplace transform and Z-transform. 

  • Textbook: "Signals and Systems", by A. Oppenheim, A. Willsky, and H. Nawab, 2nd Edition, Prentice Hall, 1997.

  • Lecture Notes: Chapter 0

  • Grade:  

  • News:

    • HW1: Programming Assignment: Implement Convolution. Write a program to implement convolution from scratch without using built-in convolution functions. Use Examples 2.3 and 2.4 to test and verify your implementation. Your program should include kernel flipping, sliding, element-wise multiplication, and summation, and clearly present the final results. A minimal sketch is given after the announcements below.

    • 4/20, 09:30-11:30: Midterm exam.

    • 5/11: Quiz covering Chapter 3.

    • 6/8, 09:30-11:30: Final exam.
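
A minimal sketch of the HW1 convolution, in Python with NumPy (the course does not prescribe a language, so this is only one possible choice); the test signals below are placeholders, not the actual sequences of Examples 2.3 and 2.4:

```python
import numpy as np

def convolve(x, h):
    """y[n] = sum_k x[k] * h[n-k]: flip the kernel, slide it, multiply element-wise, sum."""
    N, M = len(x), len(h)
    y = np.zeros(N + M - 1)
    h_flipped = h[::-1]                      # kernel flipping
    for n in range(N + M - 1):               # slide the flipped kernel across x
        for j in range(M):                   # element-wise multiplication and summation
            k = n - (M - 1) + j              # x index aligned with h_flipped[j]
            if 0 <= k < N:
                y[n] += x[k] * h_flipped[j]
    return y

# Placeholder signals; substitute the sequences of Examples 2.3 and 2.4 for the report.
x = np.array([1.0, 2.0, 3.0])
h = np.array([0.0, 1.0, 0.5])
print(convolve(x, h))                        # should match np.convolve(x, h)
```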

6654 Pattern Recognition

  • Course Time: Tue 13:10-16:00

  • Classroom: AT338

  • Course Outlines: 

    • 1. Introduction;

    • 2. Bayes Decision Theory;

    • 3. Maximum-Likelihood and Bayesian Parameter Estimation;

    • 4. Nonparametric Techniques;

    • 5. Multilayer Neural Networks;

    • 6. Deep Learning - Convolutional Neural Networks;

    • 7. Unsupervised Learning and Clustering;

    • 8. Feature Extraction - Linear Discriminant Analysis and Principal Component Analysis;

    • 9. Deep Learning - Autoencoder.

  • Textbook: "Pattern Classification", by Richard O. Duda, Peter E. Hart and David G. Stork, John Wiley & Sons, 2nd edition, 2001.

  • Reference Books:  

    • "Introduction to Statistical Pattern Recognition", by Keinosuke Fukunaga, 2nd Edition, Academic Press, 1990.

    • "Neural Networks and Learning Machines", by Simon O. Haykin, 3rd Edition, Pearson, 2009.

    • "Artificial Intelligence", by Leonardo Araujo dos Santos, 2018.

    • "Deep Learning", by Ian Goodfellow, Yoshua Bengio, and Aaron Courville,  MIT Press, 2016.

  • Lecture Notes:  PR00

  • Grade:   

  • News: 

    • HW1: Generate a high-dimensional Gaussian dataset based on a stationary random process. The deadline is 12th May. A minimal generation sketch is given below.
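
A minimal sketch of one way to generate such a dataset, in Python with NumPy; the dimensionality, class means, and the AR(1)-style stationary covariance are illustrative assumptions, since the assignment does not fix them:

```python
import numpy as np

def make_gaussian_dataset(n_per_class=200, dim=20, rho=0.9, seed=0):
    """Two Gaussian classes whose common covariance is the Toeplitz matrix of a
    stationary AR(1)-like process: Cov[i, j] = rho ** |i - j| (assumed model)."""
    rng = np.random.default_rng(seed)
    idx = np.arange(dim)
    cov = rho ** np.abs(idx[:, None] - idx[None, :])   # stationary covariance
    mean0 = np.zeros(dim)                              # assumed class means
    mean1 = np.ones(dim)
    X0 = rng.multivariate_normal(mean0, cov, n_per_class)
    X1 = rng.multivariate_normal(mean1, cov, n_per_class)
    X = np.vstack([X0, X1])
    y = np.concatenate([np.zeros(n_per_class, int), np.ones(n_per_class, int)])
    return X, y

X, y = make_gaussian_dataset()
print(X.shape, np.bincount(y))   # (400, 20) [200 200]
```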


  • HW2: Design a Maximum A Posteriori (MAP) classifier for both the dataset in HW1 and the Wine dataset from the UCI Machine Learning Repository. Use half of the data for training and the remaining half for evaluation. The deadline is 12th May. A minimal sketch is given below.
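
A minimal sketch of a MAP classifier with Gaussian class-conditional densities, in Python with NumPy; loading the UCI Wine data and splitting it in half are omitted, and the class and function names here are only illustrative:

```python
import numpy as np

class GaussianMAPClassifier:
    """MAP rule: predict argmax_c [ log p(x | c) + log P(c) ] with Gaussian p(x | c)."""
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.params_ = []
        for c in self.classes_:
            Xc = X[y == c]
            mean = Xc.mean(axis=0)
            cov = np.cov(Xc, rowvar=False) + 1e-6 * np.eye(X.shape[1])  # regularized
            self.params_.append((mean, np.linalg.inv(cov),
                                 np.linalg.slogdet(cov)[1], len(Xc) / len(X)))
        return self

    def predict(self, X):
        scores = []
        for mean, inv_cov, logdet, prior in self.params_:
            diff = X - mean
            # Constant terms common to all classes are dropped; they cancel in the argmax.
            log_lik = -0.5 * (np.sum(diff @ inv_cov * diff, axis=1) + logdet)
            scores.append(log_lik + np.log(prior))
        return self.classes_[np.argmax(np.stack(scores, axis=1), axis=1)]

# Usage: fit on the training half, then report np.mean(clf.predict(X_test) == y_test).
```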

  • HW3: Compute the upper bound of the Bayes error (the Bhattacharyya bound) for the HW1 dataset and the UCI Wine dataset. The deadline is 12th May. You should include the following (a computation sketch is given after this list):

    • A brief explanation of the Bayes error and the Bhattacharyya Bound.

    • The mathematical formulation used in the computation.

    • The calculation procedure for both datasets.

    • The estimated upper bound of the Bayes error.

    • A short discussion comparing the results with those obtained in HW2.
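
A computation sketch for the Gaussian case, in Python with NumPy, using the two-class bound P(error) <= sqrt(P1 P2) exp(-k), where k is the Bhattacharyya distance; for the three-class Wine data, applying the bound pairwise is one reasonable (assumed) approach. Means, covariances, and priors are estimated from the training half:

```python
import numpy as np

def bhattacharyya_bound(mu1, cov1, mu2, cov2, p1=0.5, p2=0.5):
    """Upper bound on the Bayes error for two Gaussian classes:
    k = 1/8 (mu2-mu1)^T [(cov1+cov2)/2]^{-1} (mu2-mu1)
        + 1/2 ln( det((cov1+cov2)/2) / sqrt(det(cov1) det(cov2)) )
    bound = sqrt(p1 * p2) * exp(-k)."""
    diff = mu2 - mu1
    cov_avg = 0.5 * (cov1 + cov2)
    term1 = 0.125 * diff @ np.linalg.inv(cov_avg) @ diff
    logdet_avg = np.linalg.slogdet(cov_avg)[1]
    logdet1 = np.linalg.slogdet(cov1)[1]
    logdet2 = np.linalg.slogdet(cov2)[1]
    k = term1 + 0.5 * (logdet_avg - 0.5 * (logdet1 + logdet2))
    return np.sqrt(p1 * p2) * np.exp(-k)

# Usage: estimate (mu, cov, prior) per class from the training half, then
# print(bhattacharyya_bound(mu1, S1, mu2, S2, p1, p2)).
```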

  • Homework submission: the report should present the main algorithms and code fragments, including the test data and results, together with a short discussion or conclusion. Compress the electronic report, the programs, and the related test data with WinZip or WinRAR, name the archive with your student ID, and upload it to the designated FTP site; the address will be announced later.

  • HW4: Implement a multilayer perceptron (MLP) classifier with three hidden layers for the dataset in HW1 and the UCI Wine dataset. Evaluate the classification performance and compare the results with those obtained in HW2. You should include the following (a minimal sketch is given after this list):

    • The MLP network architecture, including the number of neurons in each hidden layer.

    • The training and testing procedure.

    • Performance metrics such as accuracy, precision, recall, and F1-score.

    • A comparison table between the results of HW2 and HW4.

    • A short discussion explaining whether the MLP improves the classification performance and why.
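
A minimal sketch of the three-hidden-layer MLP, assuming scikit-learn may be used; the layer sizes and other hyperparameters below are illustrative choices, not part of the assignment:

```python
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.metrics import classification_report

def run_mlp(X_train, y_train, X_test, y_test):
    """Standardize the features, then train an MLP with three hidden layers."""
    clf = make_pipeline(
        StandardScaler(),
        MLPClassifier(hidden_layer_sizes=(64, 32, 16), max_iter=2000, random_state=0),
    )
    clf.fit(X_train, y_train)
    y_pred = clf.predict(X_test)
    print(classification_report(y_test, y_pred))   # accuracy, precision, recall, F1-score
    return clf
```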

  • Final presentation: one student per group. The presented paper must have been published within the last five years, and its title and abstract are due on 5/23. The topic must use classifiers covered in this course, including deep learning.

  • 5/26: Final written exam.

  • 6/2 and 6/9: Final group presentations.

  • HW5: Implement a k-Nearest Neighbor (kNN) classifier and the K-Means clustering method for both the dataset used in HW1 and the UCI Wine dataset. Evaluate the classification performance of the kNN classifier and analyze the clustering results obtained from K-Means. In your report, compare the results with those obtained in HW2 and HW4, and provide a short discussion on the performance differences among the methods. Assignment Requirements (a minimal sketch is given after this list):

    • Implement a kNN classifier for the HW1 dataset and the UCI Wine dataset.

    • Implement the K-Means clustering algorithm for the same two datasets.

    • Evaluate the classification performance using appropriate metrics, such as accuracy, confusion matrix, precision, recall, or F1-score.

    • Analyze and discuss the clustering results of K-Means.

    • Compare your results with those obtained in HW2 and HW4.

    • Provide a short discussion explaining the differences in performance among the methods.
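
A minimal sketch of the kNN classifier and K-Means clustering, assuming scikit-learn may be used; the number of neighbors and the number of clusters are illustrative assumptions:

```python
from sklearn.neighbors import KNeighborsClassifier
from sklearn.cluster import KMeans
from sklearn.metrics import accuracy_score, confusion_matrix, adjusted_rand_score

def run_knn_and_kmeans(X_train, y_train, X_test, y_test, k=5, n_clusters=3):
    """kNN for classification; K-Means for clustering on the same features."""
    knn = KNeighborsClassifier(n_neighbors=k).fit(X_train, y_train)
    y_pred = knn.predict(X_test)
    print("kNN accuracy:", accuracy_score(y_test, y_pred))
    print(confusion_matrix(y_test, y_pred))

    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(X_train)
    # The labels are not used for fitting; they are only compared with the clusters.
    print("K-Means ARI vs. true labels:", adjusted_rand_score(y_train, km.labels_))
    return knn, km
```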


Signal and Image Processing Lab

National Chung Hsing University

Copyright © 2026 by Jiunn-Lin Wu. Created with Wix.com
