Friction ridge individualization is a formalized, experience-based reasoning process. It is a form of recognition that has been structured to allow for the organization and illumination of supporting proofs regarding a specific conclusion.
An inductive reasoning process is rather different from a deductive one. Essentially, deduction is the investigative process that proceeds from the general to the particular. [1] When properly applied, deduction is, in a sense, a self-supporting cascade of related facts: known facts can be used to take the deduction to the next cognitive and relevant level. Inductive reasoning is in most respects the opposite of deduction.
Inductive reasoning utilizes inference that proceeds from the particular to the general. More specifically, “Induction is the inferential processes that expand knowledge in the face of uncertainty.” [2] A general hypothesis or theory is organized from the examination of particular relevant details and how those details are correlated. [3] The supporting fact structure is simply reversed: not incorrect, just built up from particular axioms and logic.
Another difference between deductive and inductive reasoning lies in the formal proofing, or testing, of a hypothesis. We are all familiar with the summation of deductive problem solving: proofs are logical, orderly, and generally systematic. Inductive processes, on the other hand, are the inverse and have a conceptual similarity to quantum theory. The parallel is a reliance on probability theory, in that information relationships and their values cannot all be sharply defined and must, in an absolute sense, be probability modeled, whether strictly or intuitively. In the words of the physicist Richard Feynman, “nobody understands quantum theory.” [4] Interestingly, there is no established theory specifically for the inductive inference process, and nobody truly understands it either. Complexity, probability, uniqueness, stochastic variables, and the related principle of non-specificity, which highlights the uniqueness of each application of the process, outline its use. [5] Axioms must be applied.
According to theorists, a theory for the induction process would include two main goals. First is the possibility of prediction. Second, such a process would allow for direct comparison where observations can be made. Prediction and direct comparison are fundamental components of the friction ridge examination process. [6] The problem with identifying a theory, then, must lie within the application and proof of an inference-based hypothesis.
The missing link is the consideration of uncertainty. This uncertainty is the variability in the quality of information, which is a component of complexity. Understanding this complexity, together with the principle of non-specificity, measurement theory, minimum sufficiency, information theory, and the recognition that absolute proof always proves elusive, is the key to fine-tuning this science.
Regarding friction ridge examination, inference hypothesis proofs are often based on specific characteristics of nature found within interrelated concepts such as randomness and uniqueness. When the reasoning process is applied, prediction and direct comparison must be employed in order to understand the relevance of the information analyzed. In the comparative forensic sciences, the principle of individualization is the name associated with the concept, or axiom, of “practical proof.” This is, essentially, the truth of the matter. The principle of individualization holds that the difference between tangible objects provides a practical proof of the subject’s uniqueness and of the fact that it can be individualized within limitations [7] outlined by such things as measurement theory and probability, and communicated via information theory.
With that said, why would uncertainty play such an important role?
Quantum theory states that finely measuring things such as electrons and photons is not merely beyond our current instruments but impossible in principle. A threshold is crossed from classical knowledge, such as measuring the ballistic dynamics of a bullet, into a realm that requires statistical modeling to offer probabilities on specific measurements. Yet, if we think about it, all of our measurements are approximations, which means they contain some degree of error. Furthermore, we must consider that in most cases we utilize an intuitive form of probability comprehension that, regrettably, amplifies the system noise, or stochastic variables, we associate with uncertainty. In information theory, uncertainty is a measure of entropy.
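To make the entropy connection concrete, here is a minimal Python sketch; the probability values are invented for illustration. Shannon entropy assigns a number, in bits, to how uncertain a source of information is: clear, high-quality information concentrates probability on one outcome and scores low, while degraded, ambiguous information spreads probability out and scores high.

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy in bits: the average uncertainty of an information source."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A clear, high-quality observation: one outcome dominates, entropy is low.
print(shannon_entropy([0.97, 0.01, 0.01, 0.01]))  # ~0.24 bits

# A degraded, ambiguous observation: all outcomes equally likely, entropy is maximal.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits
```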
When accuracy is paramount, we push measurements to ever tighter tolerances, yet we never reach perfection. Only in pure mathematics do we find absolutes. The concept of 1/2 is absolute, but to measure exactly 1/2 of something is a different problem, as the brief simulation below suggests.
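A minimal sketch, assuming a hypothetical instrument whose error is Gaussian: repeated measurements cluster around 1/2, but no finite series of them ever pins the value down absolutely.

```python
import random

random.seed(1)
target = 0.5  # the mathematical absolute we are trying to measure

# Assumed model: each measurement carries small Gaussian instrument noise.
measurements = [target + random.gauss(0, 0.01) for _ in range(1000)]

mean = sum(measurements) / len(measurements)
spread = max(measurements) - min(measurements)
print(f"mean: {mean:.5f}, spread: {spread:.5f}")
# The mean approaches 0.5, yet no individual measurement is exactly 0.5;
# tighter tolerances only shrink the error, they never eliminate it.
```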
This variability is similar to quantum theory in that the accuracy of information used in inference cannot be precisely quantified in absolutes. This is partly due to the complexities, or stochastic variables, of the process, yet it is mainly due to the fact that there are an infinite number of ways to approach and reason through an inductive problem. Each application of these complex cognitive functions is therefore unique in itself; this is the “principle of non-specificity,” and it is measured as stochastic noise and uncertainty in the process: a measure of entropy, in other words. All information, including both the application of experience-based reasoning and the analysis of the subject
itself, falls on a variable scale of availability, discovery, quantity, and quality; in information theory, this variability is called noise. Accordingly, absolutes become sufficiencies, or “practicalities.” Practicalities are hypotheses, or particular points at issue, that fall within expected and acceptable statistical parameters of accuracy. In other words, the greater the accuracy of the information used in the process, the more valuable the resulting conclusion or hypothesis will be. Error correction within this process can be realized through high levels of quality control, including evaluation and testing. This is not a perfect solution; it is a real-world solution in light of elusive perfection. If the distortion of information can be accurately understood in a practical sense, the information has a higher degree of
value. This is not unlike most
scientific theories. They are
fine-tuned, yet known to contain some degree of error. However, their main hypotheses are not necessarily in
error. This type of logic can help
us understand how comparative forensic science works, how it is supported, and how it can be validated in the face of uncertainty. Information theory and error-correction protocols, fine-tuned for forensic comparison, help mitigate that uncertainty, in the spirit of the toy code sketched below.
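As an aside on what “error correction” means in information theory, here is a toy repetition code in Python; the noise rate and message are invented for illustration. Redundancy plus a majority vote lets a receiver recover a message despite random corruption, which is analogous to the redundancy that quality control, evaluation, and testing add to an examination methodology.

```python
import random

random.seed(7)

def send_with_noise(bit, flip_prob=0.1):
    """Simulate a noisy channel that flips a bit with probability flip_prob."""
    return bit ^ (1 if random.random() < flip_prob else 0)

def transmit(bit, repeats=3, flip_prob=0.1):
    """Repetition code: send each bit several times and take a majority vote."""
    received = [send_with_noise(bit, flip_prob) for _ in range(repeats)]
    return 1 if sum(received) > repeats // 2 else 0

message = [1, 0, 1, 1, 0, 0, 1, 0]
decoded = [transmit(b) for b in message]
print(message == decoded)  # usually True: redundancy absorbs the channel noise
```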
Confirmation of an inductive hypothesis is isotropic in that any information may prove valid and relevant. Researchers have noted that it is impossible to put prior restraints on what might turn out to be useful in solving a problem or in making a scientific discovery. [8] Again, this relates to the introduction of stochastic variables and to the principle of non-specificity, whereby each new combination of information and process will be unique. Friction ridge
examiners are cognizant of the quantity vs. quality variability of information
used in the development of a hypothesis.
The point here is that the very application of reasoning is variable in quality, as is the information itself. Recognition of uncertainty within the process itself is the missing detail needed for a proper theory of the inductive reasoning process, specifically in forensic comparison science.
How can variability in information quality be
quantified? If it cannot be
measured directly, then it must be assigned values according to statistical
models. Thus, a “statistical
understanding of information quality variability” must be folded into a theory
of induction if the theory is to have practical value. Thus far, the role of uncertainty has been relegated to a very informal, intuitive position within the forensic sciences. Accordingly, the four items, or considerations, needed for a practical induction-inference theory are as follows:
Four Considerations of an Induction-Inference Based Theory
1. Predictability.
2. Direct Comparison with Observation.
3. Stochastic Variable Mitigation / Error Correction.
4. Degree of Uncertainty.
The fourth consideration is essentially the need to understand how the variability of information quantity and quality (its degree of uncertainty, known as noise) affects induction-based hypotheses within a failure-criterion frame of reference. This includes both specific issues and the hypothesis as a whole. Thus, degree of uncertainty is a probability-based accounting for the variation of the information’s quality, quantity, and error within the process. With friction skin examination, this may consist of such familiar details as specific error rates, training, experience, and the holistic application of reasoning skills in all phases of the methodology. These points would need to be quantified in some practical manner in order for a hypothesis’s value to be fully understood. Testing the proofs of an examiner’s conclusion of individualization or exclusion would have little meaning if the degree of uncertainty were not considered and appropriately understood in context. The value of the information components utilized must be taken into account; a false analysis inevitably leads to a false conclusion. The sketch below suggests one way such a degree of uncertainty might be estimated.
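This is a purely illustrative sketch, not an established forensic protocol: each item of information is given an assumed quality score, and the resulting spread of support for a hypothesis is estimated by Monte Carlo simulation. Every number and distribution below is an assumption chosen for demonstration.

```python
import random

random.seed(42)

def degree_of_uncertainty(n_items=8, n_trials=10_000):
    """Estimate a hypothesis-support distribution by Monte Carlo simulation."""
    supports = []
    for _ in range(n_trials):
        # Assumed model: each item's quality varies around 0.8 (stochastic noise),
        # clamped to the interval [0, 1].
        qualities = [min(1.0, max(0.0, random.gauss(0.8, 0.15)))
                     for _ in range(n_items)]
        supports.append(sum(qualities) / n_items)
    supports.sort()
    k = n_trials // 40  # 2.5% tail on each side gives a 95% interval
    return supports[n_trials // 2], supports[k], supports[-k]

median, low, high = degree_of_uncertainty()
print(f"support ~ {median:.2f}, 95% interval [{low:.2f}, {high:.2f}]")
```

The point of the exercise is not the particular numbers but the form of the answer: a central estimate together with an interval, rather than an unqualified absolute.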
With friction
ridge individualization and exclusion, the introduction of low-quality information is highly likely to lead to false positives, as the hypothetical numbers sketched below suggest. This is where the concept of “trained to competency” enters the equation. Training and experience are paramount
in minimizing the inclusion of poor quality or inaccurate information into the
comparison process. They also reduce inferior reasoning.
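To make the risk concrete in Bayesian terms: when the specificity of a comparison drops because an impression is of low quality, the fraction of declared matches that are actually false can rise sharply. The sensitivity, specificity, and prevalence values below are invented purely for illustration.

```python
def false_discovery_rate(sensitivity, specificity, prevalence):
    """Fraction of declared matches that are actually non-matches (Bayes' rule)."""
    true_matches = sensitivity * prevalence
    false_matches = (1 - specificity) * (1 - prevalence)
    return false_matches / (true_matches + false_matches)

# High-quality impression: discriminating detail is clearly recorded.
print(false_discovery_rate(sensitivity=0.99, specificity=0.999, prevalence=0.01))  # ~0.09

# Low-quality impression: an assumed loss of specificity dominates the outcome.
print(false_discovery_rate(sensitivity=0.90, specificity=0.95, prevalence=0.01))   # ~0.85
```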
Friction
ridge examiners are well versed in the first two topics of such an induction
theory, yet the last two points of non-specificity and uncertainty are often
the point of contention regarding a hypothesis and proof of specific issues,
including the illustration of errors.
Whether the issue is the validity of a single characteristic or the
proof of individualization, all the information being used in the process must
be practically evaluated to ensure that it falls within expected and acceptable
parameters. Our perception
regarding factors of uncertainty should be addressed in a scientific
manner. When the goal is accuracy, accuracy of information and accuracy of reasoning are the means.
While we have outlined what a practical theory of inference must contain, we have not dealt with the question of whether such a theory is actually possible, or how it must specifically fit within information theory. It is my perspective that rational consciousness and inference are unmistakably intertwined and incomprehensibly complex in their holistic nature, yet they can readily be defined as unique, with uniqueness itself being a mark of complexity. This leads to the notion that if consciousness is non-computable, the same must be said of isotropic inference processes. The infinities of these systems and processes cannot simply be cancelled out, as they are the embodiment of the process or system itself. However, methodologies utilizing this cognitive process can still be of value for their power of discovery and information organization.
Our hypotheses utilizing the inference process can be verified through repeatability of results, rather than through the false expectation of “exact repeatability” of the cognitive process, which the non-specificity principle prohibits. The final comparisons of forensic comparative hypotheses will then allow for a practical evaluation of the role of uncertainty. We can be certain that such an axiom-based complex system cannot be perfect. However, it is formally scientific and practical, and when the methodology is properly applied, it is found to be sufficiently accurate. To toss out inductive processes as unscientific is to toss out a large amount of reality. “Sufficiency” is all we can hope for in any formal cognitive process.
Craig A. Coppock CLPE
May 11, 2005 / Updated May 14, 2017
1. O’Hara, Charles E., and Gregory L. O’Hara. 2003. Fundamentals of Criminal Investigation. Charles C. Thomas Publisher, Springfield, p. 885.
2. Kelso, J. A. Scott. 1999. Dynamic Patterns: The Self-Organization of Brain and Behavior. Bradford Books, Cambridge, p. 38.
3. O’Hara, Charles E., and Gregory L. O’Hara. 2003. Fundamentals of Criminal Investigation. Charles C. Thomas Publisher, Springfield, p. 887.
4. Gribbin, John. 1995. Schrödinger’s Kittens and the Search for Reality. Back Bay Books, New York, p. vii.
5. Coppock, Craig. 2004. “A Detailed Look at Inductive Processes in Forensic Science.” The Detail, 4-2004 and 5-2005, clpex.com; updated 8-24-2008 (Complexity of Recognition).
6. Holland, J.; Holyoak, K.; Nisbett, R.; Thagard, P. 1986. Induction: Processes of Inference, Learning, and Discovery. MIT Press, Cambridge, p. 347.
7. Coppock, Craig. 2004. “A Detailed Look at Inductive Processes in Forensic Science.” The Detail, 4-2004 and 5-2005, clpex.com; updated 8-24-2008 (Complexity of Recognition).
8. Holland, J.; Holyoak, K.; Nisbett, R.; Thagard, P. 1986. Induction: Processes of Inference, Learning, and Discovery. MIT Press, Cambridge, p. 349.