
HIS 2011


I N V I T E D   S P E A K E R S ::
Keynote Speakers || Tutorial Speakers

 

Tutorial Speakers ::
Hideyuki Takagi || Ponnuthurai Nagaratnam Suganthan || Ajith Abraham || Pieter Hartel || Mohammad Shazri Shahrir

 

Mohammad Shazri Shahrir, Extol MSC Berhad, Malaysia

Mohammad Shazri Shahrir is a research and development engineer currently leading the AI research cluster at Extol MSC Berhad, Malaysia. He has worked extensively on artificial intelligence research projects, particularly in pattern recognition. His work has included detailed analysis, design, augmentation, and prototyping of solutions in quantum cryptography, signature verification, face recognition and verification, timetabling, and related areas. Shazri's research interests span mathematical and machine learning theory and implementation, including neural networks, SVMs, ARTMAP, AdaBoost, Bayesian belief networks, principal component analysis, Fisherfaces, eigenfaces, Laplacianfaces, the Mahalanobis distance, eigenproblem solvers, quadratic programming, inversion of arbitrarily sized symmetric matrices, the primal-dual simplex method, integer programming, the timetabling problem, image manipulation, the discrete Fourier transform, and the wavelet transform.

 

Abstract

Bayesian Neural Network
Regularization imposes stability on an ill-posed problem; a problem is ill-posed if it fails to be well-posed, i.e., if a solution does not exist, is not unique, or does not depend continuously on the data. The effect of regularization on a neural network is to smooth its output: it improves generalization by balancing how well the function fits the data against how smooth the function is, with smoothness enforced through a penalty derived from a Bayesian prior. There are many approaches to regularization, such as Tikhonov regularization and iterative regularization. In this workshop, we will explore Bayesian regularization, which imposes prior distributions on the model parameters; here the model parameters are the weights of the neural network.
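As a concrete illustration (a minimal sketch, not part of the tutorial materials: the toy data, network size, and the fixed values of alpha and beta are all assumptions), the Python snippet below trains a one-hidden-layer network by gradient descent on the regularized objective F = beta*E_D + alpha*E_W, where E_D is the sum-of-squares data misfit and E_W is the sum-of-squares weight penalty arising from a zero-mean Gaussian prior on the weights.

    # Sketch of Bayesian regularization as MAP estimation: a Gaussian prior on
    # the weights becomes a sum-of-squares penalty added to the data misfit.
    # Assumptions: fixed alpha/beta (not re-estimated), toy 1-D regression data.
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy regression data: noisy sine curve.
    X = np.linspace(-3, 3, 40).reshape(-1, 1)
    y = np.sin(X) + 0.1 * rng.standard_normal(X.shape)

    # One hidden layer of tanh units.
    H = 20
    W1 = 0.5 * rng.standard_normal((1, H)); b1 = np.zeros(H)
    W2 = 0.5 * rng.standard_normal((H, 1)); b2 = np.zeros(1)

    alpha, beta = 1e-2, 1.0   # prior strength vs. data-fit precision (assumed values)
    lr = 1e-2                  # gradient-descent step size

    for step in range(5000):
        # Forward pass.
        A = np.tanh(X @ W1 + b1)      # hidden activations
        pred = A @ W2 + b2
        err = pred - y

        # Gradients of F = beta * E_D + alpha * E_W; the alpha * W terms are
        # the pull toward zero exerted by the Gaussian prior on the weights
        # (biases are conventionally left unregularized).
        g_pred = beta * err
        gW2 = A.T @ g_pred + alpha * W2
        gb2 = g_pred.sum(axis=0)
        g_A = (g_pred @ W2.T) * (1.0 - A**2)   # backprop through tanh
        gW1 = X.T @ g_A + alpha * W1
        gb1 = g_A.sum(axis=0)

        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2

    # Report the two competing terms of the objective after training.
    E_D = 0.5 * float((err**2).sum())
    E_W = 0.5 * float((W1**2).sum() + (W2**2).sum())
    print(f"data misfit E_D = {E_D:.4f}, weight penalty E_W = {E_W:.4f}")

Raising alpha relative to beta strengthens the prior and produces a smoother, more conservative fit; the full Bayesian treatment re-estimates alpha and beta from the data (for example, via MacKay's evidence framework) rather than fixing them by hand.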