Ph.D. Student: Lina Teper
Subject: Optimal Calibration of Measurement Systems and Precision Enhancement in Input Restoration Using A-priori Information
Department: Department of Quality Assurance and Reliability
Supervisors: Dr. Phineas Dickstein, Professor Ingman Dov
The volume of data accumulation and transfer has grown significantly in recent years. Many applications depend critically on the accuracy of the data, and in certain cases reliance on unrealistic inputs can have grave safety and economic consequences. Sophisticated data-processing and signal-analysis techniques have been developed for extracting the true data from experimental results contaminated by noise, or from responses distorted in the course of the measurement or transfer process. One major challenge is to devise calibration procedures that take into account the experimental noise, the statistical characteristics of the input (where available), and the transfer function of the measurement set-up.
This study deals with two main issues. The first (the "direct problem") is the choice of training inputs so as to obtain the optimal calibration curve, i.e. the distribution of input-measurement points that minimizes uncertainty. The second (the "inverse problem") is the optimal restoration of input data points, given the response of the measurement system and a-priori information about the nature of the input data. The general methodologies for the direct and inverse problems are intended to apply across the metrology field.
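The inverse problem can be given a small numerical sketch: if the measured response is the input passed through a known transfer matrix plus noise, a-priori information can be folded in through a regularization term. The transfer matrix H (a moving-average blur), the noise level, and the weight lam below are illustrative assumptions, not the measurement model of the thesis; Tikhonov (ridge) regularization stands in for the a-priori information.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
t = np.linspace(0, 1, n)

# Illustrative transfer matrix H: a simple moving-average blur of the input.
H = np.zeros((n, n))
for i in range(n):
    lo, hi = max(0, i - 2), min(n, i + 3)
    H[i, lo:hi] = 1.0 / (hi - lo)

x_true = np.sin(2 * np.pi * t)                   # the "ideal" input
y = H @ x_true + 0.01 * rng.standard_normal(n)   # distorted, noisy response

# Naive inversion of H amplifies noise; the regularized solution trades a
# small bias for a large variance reduction.
lam = 1e-2
x_hat = np.linalg.solve(H.T @ H + lam * np.eye(n), H.T @ y)
```

The restored x_hat stays close to x_true even though only the noisy, blurred y was observed; the strength of lam encodes how much trust is placed in the prior relative to the data.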
An objective of this study is the design of a polynomial regression function in a space of two or more variables. Constructing a regression function of order m in an n-space entails an optimized choice of the set of points at which probing will take place. Several general principles emerge by which the choice of the number and location of the basic points can be simplified.
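A least-squares fit of such a polynomial can be sketched as follows. The 3x3 grid of probing points and the synthetic quadratic response are illustrative choices for demonstration, not the optimized point layout derived in the thesis.

```python
import numpy as np
from itertools import combinations_with_replacement

def design_matrix(X, m):
    """Columns = all monomials of total degree <= m in the columns of X."""
    cols = [np.ones(len(X))]
    for d in range(1, m + 1):
        for idx in combinations_with_replacement(range(X.shape[1]), d):
            cols.append(np.prod(X[:, idx], axis=1))
    return np.column_stack(cols)

# Probing points: a 3x3 grid in two variables (n = 2).
g = np.linspace(-1, 1, 3)
X = np.array([(a, b) for a in g for b in g])

# Synthetic responses from a known quadratic, for demonstration only.
y = 1.0 + 2.0 * X[:, 0] - 0.5 * X[:, 1] + 0.3 * X[:, 0] * X[:, 1]

# Order m = 2 in n = 2 variables gives 6 coefficients
# for the monomials [1, x0, x1, x0^2, x0*x1, x1^2].
A = design_matrix(X, m=2)
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
```

With 9 probing points and 6 coefficients the system is overdetermined; the number and placement of points control how well-conditioned this fit is, which is exactly what an optimized design of the basic points improves.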
Determination of the transfer function of a measurement set-up is also essential for calibration purposes. Suppose a measurement system is to be widely used for the examination of a large number of specimens. Here the transfer function is of utmost importance, since the "ideal" data can then be extracted through a deconvolution procedure, which may be less complicated if realized in the frequency domain.
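Frequency-domain deconvolution can be sketched in a few lines: divide the response spectrum by the transfer-function spectrum, with a small regularization constant to avoid amplifying noise where the transfer function is nearly zero. The Gaussian kernel and the constant eps below are illustrative assumptions, not the thesis's measured transfer function.

```python
import numpy as np

n = 128
t = np.arange(n)
x_true = ((t > 40) & (t < 80)).astype(float)     # "ideal" rectangular input

# Illustrative transfer function: circular convolution with a normalized
# Gaussian kernel, centred at index 0 for the FFT convention.
k = np.exp(-0.5 * ((np.arange(n) - n // 2) / 3.0) ** 2)
k = np.roll(k / k.sum(), -n // 2)
y = np.real(np.fft.ifft(np.fft.fft(x_true) * np.fft.fft(k)))

# Deconvolution in the frequency domain; eps guards against division by
# near-zero spectral values, where noise would otherwise be amplified.
eps = 1e-6
K = np.fft.fft(k)
x_hat = np.real(np.fft.ifft(np.fft.fft(y) * np.conj(K) / (np.abs(K) ** 2 + eps)))
```

The recovered x_hat is much closer to the ideal input than the blurred response y; this is why characterizing the transfer function once pays off when the same system examines many specimens.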
Following the theoretical work, experiments were planned and conducted to characterize the transfer functions of measurement systems in the areas of ultrasonics and industrial radiography.