Collections

Paradigm Development in Physical Evidence Examination
  • Research Articles
    WANG Guiqiang
    Forensic Science and Technology. 2025, 50(3): 221-234. https://doi.org/10.16467/j.1008-3650.2024.0033

    Forensic science is undergoing a shift from the traditional paradigm to the feature-based likelihood ratio (LR) paradigm and the similarity-score LR paradigm, in an era in which the three paradigms run in parallel. Given the advantages and development opportunities brought by the Bayesian LR paradigm, this shift has become a major trend in forensic science worldwide. However, it has not yet been fully realized on a global scale, and the development of forensic paradigms remains uneven across disciplines, countries, and regions. The main obstacles to the shift include the technical limitations of the new paradigms' methods, misconceptions about the new paradigms among some personnel, insufficient paradigm-related capabilities, and issues of legal application. Except for DNA evidence, the application of the new paradigms in China lags relatively far behind. This article proposes an implementation path for the paradigm shift of forensic science in China: scholars and practitioners conducting scientific research on the new paradigms; regulatory authorities formulating paradigm-shift strategies and plans; forensic laboratories developing and validating LR methods, developing LR verbal scales, collecting data, training examiners, and taking proficiency tests; decision-makers receiving relevant education; and legislators adjusting the relevant regulations.

  • Research Articles
    WANG Guiqiang
    Forensic Science and Technology. 2025, 50(2): 111-123. https://doi.org/10.16467/j.1008-3650.2024.0032

    The likelihood ratio (LR) paradigm of facial similarity scores is the theory and method for interpreting the evidential significance of score findings from facial comparison, and the newest application of the Bayesian LR paradigm in forensic science. The facial similarity score LR is the ratio of the probabilities of the score finding, quantitatively assigned from the probability distributions of facial scores, under a pair of competing propositions that usually represent the claims of the prosecution and the defense. The propositions typically address whether a facial image of unknown identity collected at a crime scene comes from a suspect of known identity. The face score LR expresses the direction and strength of the score finding's relative support for the two propositions, providing a quantitative measure of evidential value for decision-makers determining the disputed fact of facial source. The decision-maker determines that fact beyond reasonable doubt based on the score LR opinion, or on the posterior probability derived from the LR and the prior odds through Bayes' law, combined with the other evidence. The facial similarity score LR paradigm differs completely from the familiar traditional paradigm in its scientific logic and in the formation, expression, understanding, and reasoning application of opinions; it also differs from the widely used LR paradigm for DNA feature findings, posing new requirements and challenges for forensic examiners and decision-makers.
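As a minimal sketch of how a score-based LR can be assigned (synthetic score distributions and assumed parameters, not the article's model): the score densities under the two propositions are estimated by kernel density estimation and their ratio is evaluated at the observed comparison score.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
# Synthetic stand-ins for the two score populations (assumed, for illustration):
# scores of same-source face pairs cluster high, different-source pairs low.
same_source_scores = rng.normal(0.80, 0.05, 1000)
diff_source_scores = rng.normal(0.40, 0.10, 1000)

# Estimate the score density under each proposition.
f_same = gaussian_kde(same_source_scores)   # p(score | Hp: same source)
f_diff = gaussian_kde(diff_source_scores)   # p(score | Hd: different source)

def score_lr(score):
    """LR = p(score | Hp) / p(score | Hd) for an observed comparison score."""
    return float(f_same(score)[0]) / float(f_diff(score)[0])

# A high score supports Hp (LR > 1); a low score supports Hd (LR < 1).
lr_high = score_lr(0.75)
lr_low = score_lr(0.45)
```

The LR itself carries no decision threshold; it is handed to the decision-maker, who combines it with prior odds and the other evidence.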

  • Forensic Science and Technology. https://doi.org/10.16467/j.1008-3650.2025.0016
    Online available: 2025-03-10
    Bayes' theorem is treated by much of the literature on forensic comparison methods as the most fundamental theory. This article analyzes that position in depth and examines the problems that arise when the theorem is used in forensic science. In particular, it distinguishes in detail the meanings of the theorem's terms and the difficulties of its logical reasoning, and argues that Bayes' theorem cannot provide meaningful support for drawing conclusions from feature comparison; it is merely a computational tool. If forensic comparison methods are built on Bayes' theorem, their theoretical foundation is not solid. In a period when forensic methods face transformation, it is all the more necessary to set the theory straight at its source.
  • Research Articles
    WANG Guiqiang
    Forensic Science and Technology. 2025, 50(1): 21-32. https://doi.org/10.16467/j.1008-3650.2024.0031

    The subjective likelihood ratio paradigm of pattern features is the theory and method for interpreting the evidential significance of feature findings from pattern evidence. The subjective likelihood ratio (LR) of pattern features is the ratio of the probabilities of the feature findings, assigned on the basis of expert knowledge or of expert knowledge combined with data, under two opposing propositions representing the prosecution and the defense respectively. The proposition hierarchy for evaluating this LR includes the source level and the activity level. The subjective LR expresses the direction and strength of the findings' relative support for the two propositions, providing qualitative evidential value for decision-makers determining disputed factual propositions. Decision-makers determine the propositional facts beyond reasonable doubt based on the subjective LR opinion, or on the posterior probabilities of the propositions derived from the LR through Bayes' theorem, combined with the other evidence in the case. The subjective LR paradigm of pattern features differs significantly from the familiar traditional paradigm in its scientific logic and in the formation, expression, understanding, and reasoning application of opinions, and it also differs from the objective LR paradigm of DNA features. This presents new requirements and challenges for forensic examiners and decision-makers.

  • Research Articles
    WANG Guiqiang
    Forensic Science and Technology. 2024, 49(6): 551-565. https://doi.org/10.16467/j.1008-3650.2024.0030

    The objective likelihood ratio (LR) paradigm of DNA features is the theory and method for interpreting the evidential significance of DNA analysis findings, and a typical representative of the forensic Bayesian likelihood ratio paradigm. The objective LR of DNA features is the ratio of the probabilities of the DNA findings, quantitatively assigned on the basis of models and data, under two alternative propositions typically advocated by the prosecution and the defense. The hierarchy of propositions includes the sub-sub-source, sub-source, source, and activity levels. The LR of the DNA findings expresses the direction and strength of the findings' relative support for the two propositions, providing quantitative evidential value for decision-makers determining disputed propositional facts. Decision-makers determine the propositional facts beyond reasonable doubt based on the LR opinion, or on the posterior probability of the propositions derived from the LR through Bayes' law, combined with the other evidence in the case. The objective LR paradigm of DNA features differs completely from the familiar traditional paradigm in its scientific logic and in the formation, expression, understanding, and reasoning application of opinions, posing new requirements and challenges for forensic examiners and decision-makers.
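For a concrete sense of an objective, data-based LR, the textbook single-locus case can be sketched (illustrative allele frequencies; Hardy-Weinberg equilibrium and independence between loci are assumed, and this is not the article's own computation):

```python
def heterozygote_lr(p_a, p_b):
    """Single-locus LR for a matching heterozygous genotype.
    Hp (the suspect is the source): P(findings | Hp) = 1.
    Hd (an unrelated person is the source): P(findings | Hd) = 2 * p_a * p_b,
    the genotype frequency under Hardy-Weinberg equilibrium.
    """
    return 1.0 / (2.0 * p_a * p_b)

# Illustrative allele frequencies (assumed values):
lr_locus1 = heterozygote_lr(0.10, 0.05)   # = 1 / 0.01 = 100
lr_locus2 = heterozygote_lr(0.20, 0.15)

# Under independence, LRs from separate loci multiply (the product rule).
combined_lr = lr_locus1 * lr_locus2
```

The resulting number is reported as the findings' relative support for Hp over Hd, not as a probability that the suspect is the source.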

  • Research Articles
    LI Kang, CHEN Shitao, LUO Yaping
    Forensic Science and Technology. 2024, 49(6): 566-573. https://doi.org/10.16467/j.1008-3650.2024.0003

    Establishing a scientific quantitative evaluation system for fingerprint evidence, in particular how to introduce the statistical likelihood ratio method into the digital representation of fingerprint identification, is a hot issue in current theoretical and practical forensic research. Building a scientific and effective likelihood ratio evaluation model for fingerprint evidence requires rich same-source and different-source fingerprint databases from which a likelihood function with a stable distribution can be obtained; the quality of these databases therefore directly affects the performance of the model. Using a live-scan fingerprint collector and screen-recording software, more than 1 000 distorted images were captured for each fingerprint under different distortion modes, yielding a same-source database of 200 000 prints from 200 simulated fingerprints; the different-source database consists of a ten-million-person ten-print database from policing practice. On this basis, an automatic fingerprint identification system was used for query and comparison, and the comparison score data were evaluated. The experimental results show that fingerprint data under different distortion modes differ significantly, while the degree of pressure and the impressing time have little effect on the comparison scores. Statistical analysis of the full sample and of subsamples reduced to various degrees shows that the same-source samples of each fingerprint still form a stable distribution even when reduced to as few as 155 prints. The database built here is therefore rich in same-source and different-source fingerprints, reasonable in structure, and provides the data basis for a stable distribution, supporting the subsequent establishment of a likelihood ratio evaluation model.

  • Research Articles
    WANG Guiqiang
    Forensic Science and Technology. 2024, 49(5): 441-455. https://doi.org/10.16467/j.1008-3650.2024.0029

    The interpretation paradigm of forensic findings is shifting from the traditional paradigm of categorical conclusions to the likelihood ratio paradigm. The forensic likelihood ratio paradigm, of which DNA evidence is a typical example, has improved the validity of forensic opinions and transformed the mode of reasoning and proof based on them. The paradigm rests on the Bayesian likelihood ratio framework: the likelihood ratio (LR) of the forensic findings is assigned under a pair of opposing source or activity propositions advocated by the prosecution and the defense, and the LR serves as the evidential value of the findings' relative support for the two propositions. It provides LR opinions that help decision-makers infer and determine disputed propositional facts at the source or activity level. The forensic likelihood ratio paradigm differs completely from the traditional paradigm in its scientific basis and in the formation, expression, understanding, and reasoning application of opinions, posing new requirements and substantial challenges for forensic examiners and decision-makers in criminal proceedings.

  • Research Articles
    WANG Guiqiang
    Forensic Science and Technology. 2024, 49(4): 331-339. https://doi.org/10.16467/j.1008-3650.2024.0028

    A forensic paradigm is the body of scientific theories and methods used to interpret the findings of forensic examination and to form expert opinions. A shift is under way from the traditional paradigm of categorical source conclusions to a paradigm of evaluative opinions. The traditional forensic paradigm is based on the assumption of feature uniqueness. It has a history of over 100 years of development and application, and has been applied to almost all physical evidence except DNA. After detecting and comparing trace evidence from a crime scene with a known-source sample, the examiner determines whether the features of the trace match those of the sample, and uses threshold decision-making to opine that the trace and the sample came from the same source or from different sources. In the traditional paradigm, the process by which an examiner forms a categorical source opinion from the feature results is deductive: the major premise is the assumption of the uniqueness of the trace features, the minor premise is the result of feature matching (or non-matching), and the conclusion is that the trace and the sample have the same (or different) sources. As long as both premises are true, the categorical source opinion of the traditional paradigm is correct. However, with the development and maturity of evaluative methods for forensic DNA results, some scholars have questioned the lack of empirical proof for the feature-uniqueness hypothesis, arguing that deductive reasoning without a valid major premise of trace-feature uniqueness has no validity, and that the categorical source opinion of the traditional paradigm therefore lacks a solid scientific foundation.

  • Forensic Science and Technology. https://doi.org/10.16467/j.1008-3650.2024.0030SF
    Online available: 2024-05-13
    The objective likelihood ratio paradigm of DNA features is the theory and method for interpreting the evidential significance of DNA feature findings. It is a typical representative of the Bayesian likelihood ratio paradigm in physical evidence examination. The objective likelihood ratio (LR) of DNA feature findings is the ratio of the probabilities of the findings, quantitatively assigned on the basis of models and data, under two opposing propositions that usually represent the claims of the prosecution and the defense respectively. The proposition hierarchy includes the sub-sub-source, sub-source, source, and activity levels. The LR of the DNA findings expresses the direction and strength of the findings' relative support for the two propositions, providing quantitative evidential value for decision-makers determining disputed propositional facts. Decision-makers determine the propositional facts beyond reasonable doubt based on the LR opinion, or on the posterior probability of the propositions derived from the LR through Bayes' law, combined with the other evidence in the case. The objective LR paradigm of DNA features differs completely from the familiar traditional paradigm in its scientific logic and in the formation, expression, understanding, and reasoning application of opinions, posing new requirements and challenges for examiners and decision-makers.
  • Topic: Physical and Chemical Inspection
    GUO Hongling, WANG Ping, HU Can, MEI Hongcheng, ZHENG Jili, LI Yajun, ZHU Jun, QUAN Yangke, WANG Guiqiang
    Forensic Science and Technology. 2023, 48(4): 355-363. https://doi.org/10.16467/j.1008-3650.2023.0010

    Glass is frequently encountered as trace evidence in forensic investigations. Currently, glass examination reports can only present physicochemical data or give qualitative comparison results, lacking a quantitative index for drawing conclusions. The likelihood ratio (LR) approach has been widely used around the world to measure the evidential value of physicochemical data. In China, however, no quantitative evaluation method has yet been built for glass evidence, which makes it difficult for judges and prosecutors to use glass evidence accurately at trial. To provide a quantitative interpretation and evaluation of the evidential value of glass, an LR model was built and its performance evaluated. Kernel density estimation with a Gaussian kernel was used to estimate the probability density function of 750 refractive index measurements from 150 different glass samples, and an LR model was built on that density. The model was evaluated using the histogram of log LR and the false inclusion and false exclusion rates, with LR = 1 as the threshold. The LR values of within-source pairwise comparisons ranged from 6.58 to 204 500, and those of between-source comparisons from 0 to 0.68. Evaluation revealed low rates of misleading evidence: only one within-source pair had LR < 1 (expected > 1), a false exclusion rate of 0.67%, while 173 between-source pairs had LR > 1 (expected < 1), a false inclusion rate of 1.55%. The likelihood ratio model built on Gaussian kernel density estimation thus performs satisfactorily for comparisons based on glass refractive index data, providing a practicable quantitative evaluation method for comparing glass samples.
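The construction described in the abstract can be sketched as follows. The numbers are synthetic stand-ins, not the paper's 750 measurements; only the procedure mirrors the text: kernel density estimation, an LR formed as a density ratio, and misleading-evidence rates counted at the LR = 1 threshold.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)
# Synthetic refractive-index differences (assumed scales, for illustration):
# within-source pairs differ little, between-source pairs differ much more.
within_diffs = rng.normal(0.0, 2e-5, 500)
between_diffs = rng.normal(0.0, 4e-4, 500)

f_within = gaussian_kde(within_diffs)    # p(delta_RI | same source)
f_between = gaussian_kde(between_diffs)  # p(delta_RI | different sources)

def lr(delta_ri):
    """LR for an observed refractive-index difference."""
    return float(f_within(delta_ri)[0]) / float(f_between(delta_ri)[0])

# Rates of misleading evidence at the conventional LR = 1 threshold:
false_exclusion_rate = np.mean([lr(d) < 1 for d in within_diffs])
false_inclusion_rate = np.mean([lr(d) > 1 for d in between_diffs])
```

Low rates on both counts are the kind of validation evidence the abstract reports for the real model.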

  • Topic: Fingerprint Identification
    MA Rongliang, LIU Huan, WU Chunsheng
    Forensic Science and Technology. 2023, 48(1): 1-9. https://doi.org/10.16467/j.1008-3650.2022.0049

    Fingerprints, one of the most important kinds of forensic evidence, are capable of identifying an individual, and have therefore played a crucial role in police investigation and court litigation since their admission into jurisprudence. For the verdict of fingerprint identification there are currently only three propositions in China: identification, exclusion, and inconclusive. This practice is presumably rooted in the experience and practical situation of China's crime prevention and court processing to date, yet it has caused a huge quantity of crime scene fingerprints with fewer than 8 minutiae to be abandoned or left unused because they fail the source-threshold requirements for identification. These fingerprints are nevertheless significant for police investigation and court processing. A probabilistic approach is therefore described here, with mathematical modeling that counts minutiae over a fingerprint image divided into fan-shaped sectors. Based on statistics from 15 million fingerprint images, a probability density function was fitted to the fingerprint minutiae of all fan-shaped sectors and then modified under the Bayesian Information Criterion, with noise added. A probability of identity was thereby obtained for a fingerprint under scrutiny. Trials on several examples showed that the matching probability of fingerprint pairs correlated positively with the quantity and stability of the analyzed minutiae and negatively with the incidence of minutiae occurring inside the fan-shaped sectors. This study offers a novel attempt to rediscover the evidential value of those 'useless fingerprints' that display insufficient detail for identification but are frequently found at crime scenes. Such an approach should be a crucial step in moving fingerprint identification from qualitative to quantitative analysis, with significant potential for identifying criminals in combination with the quantitative application of other forensic evidence.

  • Research Articles
    LI Zhihui, XIE Lanchi, LÜ You, WANG Guiqiang
    Forensic Science And Technology. 2022, 47(1): 24-34. https://doi.org/10.16467/j.1008-3650.2021.0116
    Objective The spatial-temporal track information is key to finding the suspected target in video investigation, yet there is no methodological basis for using such information at the stage of evidence preparation and court testimony. A probabilistic approach is therefore attempted for transforming the target's track information in video investigation into evidence. Methods Characteristic features were specified for the human body, clothing, and transportation vehicle (here, a motorcycle) passing surveillance cameras under a typical video-investigation process, and the presence probability of each feature, together with its upper limit, was estimated. These probabilities express the course of the suspect's committing the crime and escaping under multiple cameras, converting the probability-based track query from a graph-representation model into a Bayesian network whose properties can be used to estimate the likelihood ratio. Results The formula and an approximate calculation method for the likelihood ratio of the problem in the video-investigation scene are given, along with: 1) a feature-based spatial-temporal probability model and analysis, established by decomposing the proposed time segment/range and direction hypotheses; 2) a probabilistic analysis framework for the features of clothing, human body, and vehicle in video images; 3) probability-ratio (likelihood ratio) results obtained under hypothetical conditions; and 4) the influence of the hypotheses on probability calculation, together with a discussion of posterior-ratio estimation and its limiting factors in practical application. Under the condition of twelve commonly seen features (body shape; coat color, texture, and style; pants style and color; shoe style and color; hair length; motorcycle style, color, and luggage style) observed under three consecutive cameras fixed on a simple non-divergent road, the same target appearing sequentially past the three cameras with all twelve features identical yields a likelihood ratio of at least 10^8 relative to the target not being the same. Conclusions The theoretical analytic method established here provides an unprecedented quantitative reference model for the evidential applicability of video tracking. It explains why probability-based video-tracking results can be used as evidence, and is therefore helpful for improving their evidential force.
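The core mechanism, multiplying per-feature likelihood ratios under independence assumptions, can be sketched with hypothetical feature frequencies. The paper estimates upper bounds from data and reports a ratio of at least 10^8; with the assumed values below the product reaches about 10^6, so this illustrates the order-of-magnitude behavior rather than reproducing the paper's result.

```python
import math

# Hypothetical population frequencies for the twelve observed features
# (assumed values, not the paper's estimates).
feature_freqs = {
    "body shape": 0.3, "coat color": 0.2, "coat texture": 0.3,
    "coat style": 0.3, "pants style": 0.4, "pants color": 0.3,
    "shoe style": 0.4, "shoe color": 0.4, "hair length": 0.5,
    "motorcycle style": 0.2, "motorcycle color": 0.3, "luggage style": 0.2,
}

# If all twelve features match, the naive-independence LR is the product
# of the per-feature ratios 1 / f_i.
lr = math.prod(1.0 / f for f in feature_freqs.values())
order_of_magnitude = math.floor(math.log10(lr))
```

Rarer features (smaller f_i) push the product toward the 10^8 scale the paper reports; correlated features would require the Bayesian-network treatment the paper develops.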
  • Technical Notes
    ZHANG Zhijie, LÜ Dejian
    Forensic Science And Technology. 2019, 44(6): 545-547. https://doi.org/10.16467/j.1008-3650.2019.06.015
    Objective To explore the random matching probability of single parent-child duos in a simulated DNA database, in order to reduce random matches against the database. Methods Based on known allele frequencies, a database of the selected STR loci was simulated with the R packages "DNAprofiles" and "DNAtools". The random matching probability was calculated for two profiles from a (false) single parent-child pair sharing at most a one-allele difference at each locus. Results As the number of STR loci in the simulated database increased, the random matching probability of the (false) single parent-child duo decreased. Conclusion In practice, more STR loci should be tested to reduce the false inclusion of single parent-child duos; 23 or more loci are suggested for both DNA database construction and high-accuracy paternity analysis.
  • Exchangeable Experience
    TAO Guilan, DONG Dagang
    Forensic Science And Technology. 2019, 44(6): 562-564. https://doi.org/10.16467/j.1008-3650.2019.06.020
    Fingerprints found at crime scenes are often deformed by factors such as the environment, the variety of objects the suspect contacted, and the force exerted by the fingerprint's owner. The resulting displacement of feature points lowers the matching probability in an automatic fingerprint identification system, since the deformed print queries poorly against its counterpart. If a deformed fingerprint can be rectified, its matching probability can be improved. Here, a deformed fingerprint extracted from a scene was corrected with the image-processing software Photoshop CS5. When the corrected and uncorrected versions were compared through the automatic fingerprint recognition system, the sample fingerprint moved from 134th place to 1st place among the candidates. Rational correction of deformed fingerprints can thus truly improve the matching probability between fingerprints.
  • Research Articles
    LI Zhihui, XIE Lanchi, WANG Guiqiang, WANG Haiou, NIU Yong, XU Lei, YAN Yuwen, LI Zhigang, XU Xiaojing, HUANG Wei, ZHANG Ning, GUO Jingjing, HOU Xinyu
    Forensic Science And Technology. 2019, 44(1): 1-8. https://doi.org/10.16467/j.1008-3650.2019.01.001
    Feature comparison is one of the core methods of forensic evidence examination, applied in almost every professional discipline. A feature-comparison method based on a statistical framework is objective, and is thus the ongoing direction of forensic science. This paper explores facial feature comparison. Through an in-depth analysis of the characteristics of current deep-learning face features, facial comparison was carried out on relevant large-scale data to obtain the statistical distributions of deep-learning facial comparison scores. On this basis, a facial comparison approach is proposed that couples deep-learned features with a score-based likelihood ratio model under the Bayesian framework. The experimental results support the application of facial feature comparison, further enriching the statistically based methods of forensic feature comparison.
  • Research Articles
    ZHANG Cuiling, TAN Tiejun
    Forensic Science And Technology. 2018, 43(4): 265-271. https://doi.org/10.16467/j.1008-3650.2018.04.002
    Bayesian statistical inference is being developed into an international standard logical framework for the evaluation of forensic evidence. It uses prior knowledge and evidence information to evaluate, through statistical inference and probability estimation, the probability of an event occurring if a hypothesis is true. Under the Bayesian framework, the forensic expert first makes a statistical evaluation of the ratio of the probabilities of obtaining the evidence under two competing hypotheses (the prosecution and defense hypotheses): the likelihood ratio, which quantitatively demonstrates the value of the evidence. The court's arbitrators then obtain the posterior odds of the factual proposition by multiplying the likelihood ratio by the prior odds. Finally, the arbitrators update their prior beliefs using the likelihood ratio and draw their inference on the facts to be proved. This is not only an innovation in forensic science but also an advance in the admission and evaluation of forensic evidence, and is therefore of high significance for driving the scientific application of forensic evidence and judicial civilization.
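The reasoning chain described in the abstract can be written down directly (a minimal sketch; the prior odds and LR values are assumed for illustration):

```python
def posterior_odds(prior_odds, likelihood_ratio):
    """Bayes' theorem in odds form: posterior odds = LR x prior odds."""
    return likelihood_ratio * prior_odds

def odds_to_probability(odds):
    """Convert odds back to a probability: p = odds / (1 + odds)."""
    return odds / (1.0 + odds)

# Illustrative: the fact-finder's prior odds for the prosecution proposition
# are 1:1000; the forensic findings carry an LR of 10 000.
post_odds = posterior_odds(1 / 1000, 10_000)   # = 10, i.e. 10:1
post_prob = odds_to_probability(post_odds)     # = 10/11, about 0.909
```

The division of labor is visible here: the expert supplies only the LR, while the prior odds and the final inference belong to the fact-finder.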
  • Forensic Science And Technology. https://doi.org/10.16467/j.1008-3650.2000.03.005
    By applying the multiplication rule of probability theory to the occurrence probability (number of persons) of each feature, the overall specificity of an individual's writing habits is expressed numerically, thereby making the conclusion more convincing.
  • Research Articles
    ZHOU Mi, WANG Jun
    Forensic Science And Technology. 2017, 42(1): 28-31. https://doi.org/10.16467/j.1008-3650.2017.01.006
    Objective To analyze the apparent component number of mixed forensic genetic samples by stochastic simulation, exploring the inherent regularity in evaluating the component number of such samples. Methods Using stochastic simulation of the ID (Identifiler) system, one million STR genotypes of mixed samples with 2-7 components were generated and analyzed. The distribution of the cumulative probability of the apparent component number (CPA) in mixed samples of 2-7 components was calculated with self-designed software for the ID system. A probability formula was derived for a mixed sample comprising two actual components but showing one apparent component at a single locus (PA2-1), along with the cumulative probability formula of the apparent component number (CPA) for multiple loci. Both formulas were then validated empirically by correlating the calculated values with two sets of simulation experiments. Finally, the concepts of the probability of excluding mixture (PEM) and the cumulative probability of excluding mixture (CPEM) were proposed; an approximate calculation formula for CPEM was put forward and tested against the CPEM asymptotic value in the ID system. Results The CPA distribution of mixed samples with 2-7 components was calculated with the stochastic simulation approach in the ID system. Both formulas were in accordance with the simulation experiments. The approximate value of CPEM was 1-1.23298×10^-9 (0.999999998767) in the ID system. Conclusion The CPA distribution and the formulas built here are applicable to the evaluation of the component number of mixed samples. CPEM can serve as an appraisal indicator for distinguishing single-component from mixed samples, and the method for calculating CPEM can be applied in forensic genetics practice.
  • Research Articles
    WANG Zhongdi, LIU Huan, WU Hao, XUE Jing
    Forensic Science And Technology. 2016, 41(6): 437-441. https://doi.org/10.16467/j.1008-3650.2016.06.002
    Objective Based on the probabilities of five types of fingerprint patterns, nine kinds of fingerprint minutiae, and five sorts of feature combinations, to set up a calculation model that represents the conclusion of fingerprint identification by the probability that a test fingerprint and the sampled counterpart are the same. Methods Five types of fingerprint patterns were defined, along with five sorts of features into which the nine kinds of minutiae were divided according to the "connection" and "disconnection" emerging among the first, second, and third of the five feature types. Results From more than twenty thousand persons' ten-digit fingerprints collected nationwide in three separate years, stable occurrence probabilities were observed for the five pattern types and five feature sorts. Statistical analysis of the pattern probabilities of over forty-four million persons' ten-digit fingerprints from Guangdong, Jiangsu, Henan, Qinghai, and Heilongjiang provinces confirmed this stability. A calculation model was built for the probability that the five-sort features of the test fingerprint appear with their possible maximum in the sampled counterpart, thereby verifying the uniqueness of the probabilistic representation of the fingerprint identification conclusion. Conclusion With the same-source probability obtained from the established calculation model between test and sample fingerprints, the fingerprint identification conclusion can be represented by the deduced probability.
  • Technical Notes
    GAN Lin, JIAO Caiyang
    Forensic Science And Technology. 2016, 41(5): 414-416. https://doi.org/10.16467/j.1008-3650.2016.05.017
    Objective To explore the application of the likelihood ratio test to barefoot print examination through quantitative analysis of the test results. Methods 300 samples of human barefoot prints were collected and statistically recorded for seven features: toe shape, the leading and trailing edge shapes of the sole, arch type, heel shape, overlapped digits, and the relation between the central point of the third digit and the line connecting the central points of the second and fourth digits. The probability of each feature's appearance in the barefoot prints was calculated in order to obtain each feature's likelihood ratio and their product (the cumulative likelihood ratio). Results Against the six grades of likelihood ratio, the more featured types a footprint involves, the bigger the cumulative likelihood ratio and, consequently, the higher the strength of evidence. Conclusion The likelihood ratio test can be applied to the examination of barefoot prints.
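The multiplication step can be sketched as follows. The feature counts (out of a 300-print sample) and the six-grade verbal cut-offs below are assumptions for illustration, not the paper's data or scale.

```python
import math

def verbal_grade(lr):
    """Map a cumulative LR onto a six-grade verbal scale (assumed cut-offs)."""
    if lr < 1:      return "supports the different-source proposition"
    if lr < 10:     return "weak support"
    if lr < 100:    return "moderate support"
    if lr < 1000:   return "moderately strong support"
    if lr < 10000:  return "strong support"
    return "very strong support"

# Hypothetical counts of prints (out of n = 300) sharing each observed feature.
n = 300
feature_counts = {"toe shape": 45, "sole leading edge": 90, "arch type": 60}

# Per-feature LR is 1 / frequency; the cumulative LR is their product.
cumulative_lr = math.prod(n / c for c in feature_counts.values())
grade = verbal_grade(cumulative_lr)
```

Each additional rare feature multiplies the cumulative LR, which is why prints showing more featured types carry higher evidential strength on the scale.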
  • Reviews
    ZADORA Grzegorz, MARTYNA Agnieszka, MICHALSKA Aleksandra, WŁASIUK Patryk
    Forensic Science And Technology. 2016, 41(3): 209-220. https://doi.org/10.16467/j.1008-3650.2016.03.010
    The increasing complexity of new forms of crime, and the need of those who administer justice for higher standards of scientific work, require new approaches for measuring the evidential value of physicochemical data obtained by the many analytical methods applied to various kinds of trace evidence. The methods used to evaluate these data should reflect the role of forensic experts in the administration of justice: such data (evidence) should be evaluated in the context of two competing propositions, H1 and H2, formulated by the two opposing sides in the legal proceeding, i.e. prosecution and defence. Bayesian models have been proposed for the evaluation of evidence in such contexts. This paper describes the principle of the likelihood ratio (LR) approach for evaluating physicochemical data in so-called comparison and classification problems (classification, in fact, is also based on comparison). LR models allow all of the important factors to be included in a single calculation of the evidential value of physicochemical data: the similarity of the observed data in the compared samples, the rarity of the determined data in the relevant population, and the possible sources of error (within- and between-sample variability). As statistical tools, LR models can only be proposed for databases described by a few variables, whereas most physicochemical data are highly dimensional (e.g. spectra). It is therefore necessary to apply dimensionality-reduction methods such as graphical models or suitable chemometric tools, examples of which are presented in the paper. LR models should always be treated as a supportive (not decisive!) tool and their results subjected to critical analysis. In other words, statistical methods do not deliver absolute truth: levels of possible false answers are an integral part of these methods, just like the uncertainty associated with the applied analytical techniques. A validation, equivalent to the validation process for analytical methods, should therefore be conducted to determine their performance; how to validate LR models is addressed in the paper through the example of the Empirical Cross Entropy approach. Source-level evaluation of physicochemical data helps answer whether the compared samples originate from the same object. Usually, however, the fact finders (judge, prosecutor, or police) are interested in recognizing the activity that caused the transfer and persistence of the recovered microtraces (which show similarity to the control sample) from body, clothes, or shoes; this activity-level analysis is also discussed in the paper.