Tuesday, January 01, 2013

The Role of Uncertainty Within the Application of Inductive Processes in Forensic Science


        Friction ridge individualization is a formalized, experience-based reasoning process.  This process is a form of recognition that has been formalized to allow for the organization and illumination of supporting proofs regarding a specific conclusion.  Inductive reasoning is rather different from the deductive process.  Essentially, deduction is the investigative process that proceeds from the general to the particular. [1]  When properly applied, deduction is, in a sense, a self-supporting cascade of related facts: known facts can be used to take the deduction to the next cognitive and relevant level.  Inductive reasoning is in most respects the opposite of deduction; it utilizes inference that proceeds from the particular to the general.  More specifically, “Induction is the inferential processes that expand knowledge in the face of uncertainty.” [2]  A general hypothesis or theory is organized from the examination of particular relevant details and how those details are correlated. [3]  The supporting fact structure is in reverse; it is not incorrect, just built up from particular axioms and logic.  Another difference between deductive and inductive reasoning lies in the formal proofing or testing of a hypothesis.  We are all familiar with the summation of deductive problem solving: proofs are logical, orderly, and generally systematic.  Induction, on the other hand, is the inverse and has a conceptual similarity to quantum theory.  The parallel is a reliance on probability theory, in that information relationships and their values cannot be sharply defined and must, in an absolute sense, be probability modeled, whether strictly or intuitively.  In the words of the physicist Richard Feynman, “nobody understands quantum theory.” [4]  Interestingly, there is no established theory specifically for the inductive inference process, and nobody truly understands it either.  Complexity, probability, uniqueness, stochastic variables, and the related principle of non-specificity, which highlights the uniqueness of the process, outline its use. [5]  Axioms must be applied.
            According to theorists, a theory for the induction process would include two main goals.  The first is the possibility of prediction.  The second is that such a process would allow for direct comparison where observations can be made.  Prediction and direct comparison are fundamental components of the friction ridge examination process. [6]  The problem with identifying a theory, then, must lie within the application and proof of an inference-based hypothesis.  The missing link is the consideration of uncertainty.  This uncertainty is the variability in the quality of information, which is a component of complexity.  Understanding this complexity, the principle of non-specificity, measurement theory, minimum sufficiency, information theory, and the recognition that absolute proof will always prove elusive is the key to fine-tuning this science.
Regarding friction ridge examination, inference hypothesis proofs are often based on specific characteristics of nature found within inter-related concepts such as randomness and uniqueness.  When the reasoning process is applied, prediction and direct comparison must be employed in order to understand the relevance of the information analyzed.  In the comparative forensic sciences, the principle of individualization is the name associated with the concept or axiom of “practical proof,” which is, essentially, the truth of the matter.  The principle of individualization holds that the differences between tangible objects provide practical proof of a subject’s uniqueness and of the fact that it can be individualized within limitations [7] outlined by such things as measurement theory and probability, and communicated via information theory.  With that said, why would uncertainty play such an important role?
Quantum theory states that finely measuring things such as electrons and photons is not merely beyond our measuring abilities, but impossible in principle.  A threshold is crossed from classical knowledge, such as measuring the ballistic dynamics of a bullet, into a realm that requires statistical modeling to offer probabilities on specific measurements.  Yet, if we think about it, all of our measurements are approximations, which means they contain some degree of error.  Furthermore, we must consider that in most cases we utilize an intuitive form of probability comprehension that, regrettably, amplifies the system noise, or stochastic variables, we associate with uncertainty.  In information theory, uncertainty is a measure of entropy.
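To make that last point concrete, here is a minimal sketch (in Python, my own illustration rather than anything from the cited sources) of Shannon entropy for a discrete distribution: the flatter and noisier the spread of possibilities, the higher the entropy, and hence the greater the uncertainty.

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy in bits: the average uncertainty carried by a discrete distribution."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A sharp, nearly certain observation carries little uncertainty...
print(shannon_entropy([0.97, 0.01, 0.01, 0.01]))   # ~0.24 bits
# ...while a flat, noisy one carries much more.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0 bits
```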
When accuracy is paramount, we push to measure things to tighter and tighter tolerances, yet we still never reach perfection.  Only when we deal with pure mathematics do we see absolutes.  The concept of 1/2 is absolute, but to measure exactly 1/2 of something is a different problem.  This variability is similar to quantum theory in that the accuracy of information regarding inference cannot be precisely quantified in absolutes.  This is partly due to the complexities or stochastic variables of the process, yet it is mainly due to the fact that there can be an infinite number of ways to approach and reason a solution to inductive problems.  This makes each application of these complex cognitive functions unique in itself; this is the “principle of non-specificity,” and it is measured as an aspect of the stochastic noise and uncertainty in the process, a measure of entropy in other words.  All information, including both the application of experience-based reasoning and the analysis of the subject itself, lies on a variable informational scale of availability, discovery, quantity, and quality, which is called noise in information theory.  Accordingly, absolutes actually become sufficiencies or “practicalities.”  Practicalities are hypotheses, or particular points at issue, that fall within expected and acceptable statistical parameters of accuracy relating to the issue.  In other words, the greater the accuracy of the information used in the process, the more valuable a resulting conclusion or hypothesis will be.  Error correction within this process can be realized with high levels of quality control, including such aspects as evaluation and testing.  This is not a perfect solution; it is a real-world solution in light of elusive perfection.  If the distortion of information can be accurately understood in a practical sense, the information has a higher degree of value.  This is not unlike most scientific theories: they are fine-tuned, yet known to contain some degree of error.  However, their main hypotheses are not necessarily in error.  This type of logic can help us understand how comparative forensic science works, how it is supported, and how it can be validated in the face of uncertainty.  Information theory and error-correction protocols, fine-tuned for forensic comparison, help mitigate uncertainty.
Confirmation of an inductive-type hypothesis is isotropic in that all information may be valid and relevant.  Researchers have noted that it is impossible to put prior restraints on what might turn out to be useful in solving a problem or in making a scientific discovery. [8]  Again, this is related to the introduction of stochastic variables and the principle of non-specificity, whereby this new information and process will be unique.  Friction ridge examiners are cognizant of the quantity vs. quality variability of information used in the development of a hypothesis.  The point here is that the very application of reasoning is variable in quality, as is the information itself.  Recognition of uncertainty within the process itself is the missing detail needed for a proper theory of the inductive reasoning process, and specifically of forensic comparison science.
How can variability in information quality be quantified?  If it cannot be measured directly, then it must be assigned values according to statistical models.  Thus, a “statistical understanding of information quality variability” must be folded into a theory of induction if the theory is to have practical value.  Thus far, the role of uncertainty has been relegated to a very informal, intuitive position within the forensic sciences.  Accordingly, the four items or considerations needed for a practical induction-inference theory are as follows:

Four Considerations Of An Induction-Inference Based Theory

1.     Predictability.
2.     Direct Comparison with Observation.
3.     Stochastic Variable Mitigation / Error Correction.
4.     Degree of Uncertainty.

The fourth consideration is essentially the need to understand how the variability of information quantity and quality, its degree of uncertainty (known as noise), affects induction-based hypotheses within a failure-criterion frame of reference.  This includes both specific issues and the hypothesis as a whole.  Thus, degree of uncertainty is a probability-based accounting for the variation in the information’s quality, quantity, and error within the process.  With friction skin examination, this may consist of such familiar details as specific error rates, training, experience, and the holistic application of reasoning skills in all phases of the methodology.  These points would need to be quantified in some practical manner in order for a hypothesis’s value to be fully understood.  Testing the proofs of an examiner’s conclusion of individualization or exclusion would have little meaning if the degree of uncertainty were not considered and appropriately understood in context.  The value of the information components utilized must be taken into account.  A false analysis inevitably leads to a false conclusion.
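As a purely illustrative sketch (mine, not the author’s, with hypothetical field names and values), the four considerations can be imagined as a simple record carried alongside each hypothesis, so that the degree of uncertainty travels with the conclusion rather than remaining implicit:

```python
from dataclasses import dataclass

@dataclass
class InductiveHypothesis:
    """Illustrative record pairing a hypothesis with the four considerations."""
    prediction: str               # 1. what the hypothesis predicts
    comparison_supported: bool    # 2. whether direct comparison with observation supports it
    error_corrections: list      # 3. stochastic-variable mitigation applied (QC, testing, review)
    degree_of_uncertainty: float  # 4. residual uncertainty, e.g. 0.0 (none) to 1.0 (total)

h = InductiveHypothesis(
    prediction="ridge detail predicted at location X appears in the exemplar",
    comparison_supported=True,
    error_corrections=["verification by a second examiner", "documented methodology"],
    degree_of_uncertainty=0.1,    # hypothetical value from a quality/quantity assessment
)
print(h.degree_of_uncertainty)
```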
With friction ridge individualization and exclusion, the introduction of low-quality information has a high probability of leading to false positives.  This is where the concept of “trained to competency” enters the equation.  Training and experience are paramount in minimizing the inclusion of poor-quality or inaccurate information in the comparison process.  They also reduce inferior reasoning.
            Friction ridge examiners are well versed in the first two topics of such an induction theory, yet the last two points, non-specificity and uncertainty, are often the point of contention regarding a hypothesis and the proof of specific issues, including the illustration of errors.  Whether the issue is the validity of a single characteristic or the proof of individualization, all the information being used in the process must be practically evaluated to ensure that it falls within expected and acceptable parameters.  Our perception regarding factors of uncertainty should be addressed in a scientific manner.  When the goal is accuracy, accuracy of information and accuracy of reasoning are the means.
            While we have outlined what a practical theory of inference must contain, we have not dealt with the question of whether such a theory is actually possible or how it must specifically fit within Information Theory.  It is my perspective that the parallels of rational consciousness and inference are unmistakably intertwined and incomprehensibly complex in their holistic nature, yet can easily be defined as unique, with unique being a definition of complexity.  This leads us to the notion that if consciousness is non-computable, the same must be said of the isotropic inference processes.  The infinities of the systems and processes cannot simply be cancelled out, as they are the embodiment of the process or system.  However, methodologies utilizing this cognitive process can be of value with their power of discovery and information organization.
            Our hypotheses utilizing the inference process can be verified through repeatability of the results rather than a false expectation of “exact repeatability” of the cognitive process, which is prohibited by the non-specificity principle.  The final comparisons of forensic comparative hypotheses will then allow for a practical evaluation regarding the role of uncertainty.  We can be certain that such an axiom-based complex system cannot be perfect.  However, it is formally scientific and practical, and when the methodology is properly applied, it is found to be sufficiently accurate.  To toss out inductive processes as unscientific is to toss out a large amount of reality.  “Sufficiency” is all we can hope for in any formal cognitive process.

Craig A. Coppock  CLPE
May 11, 2005 / Updated May 14, 2017

  1. O’Hara, Charles E. and Gregory L. 2003: Fundamentals of Criminal Investigation, Charles C. Thomas Publisher, Springfield: p. 885
  2. Kelso, J. A. Scott 1999: Dynamic Patterns: The Self-Organization of Brain and Behavior, Bradford Books, Cambridge: p. 38
  3. O’Hara, Charles E. and Gregory L. 2003: Fundamentals of Criminal Investigation, Charles C. Thomas Publisher, Springfield: p. 887
  4. Gribbin, John 1995: Schrödinger’s Kittens and the Search for Reality, Back Bay, New York: p. vii
  5. Coppock, Craig 2004: A Detailed Look at Inductive Processes in Forensic Science, The Detail 4-2004 and 5-2005, clpex.com, updated 8-24-2008 (Complexity of Recognition)
  6. Holland, J.; Holyoak, K.; Nisbett, R.; Thagard, P. 1986: Induction: Processes of Inference, Learning, and Discovery, MIT Press, Cambridge: p. 347
  7. Coppock, Craig 2004: A Detailed Look at Inductive Processes in Forensic Science, The Detail 4-2004 and 5-2005, clpex.com, updated 8-24-2008 (Complexity of Recognition)
  8. Holland, J.; Holyoak, K.; Nisbett, R.; Thagard, P. 1986: Induction: Processes of Inference, Learning, and Discovery, MIT Press, Cambridge: p. 349

Tuesday, June 19, 2012

Universal Definition of ACE-V


A Universal Definition of ACE-V used in formal comparative methodology

          ACE-V is an acronym for the established formal methodological comparison process that is analogous to the scientific method.  The purpose of the ACE-V comparison methodology is to individualize or exclude impressions or objects as having originated from an identical source, that is, as being one and the same.  The letters “ACE” stand for Analysis, Comparison, and Evaluation.  “V” is the hypothesis-testing or verification phase of the process, in which another qualified examiner reinitiates the ACE process in order to see if the original hypothesis of individualization or exclusion is valid from their expert perspective, thus supporting or refuting the conclusions of the original examiner.

ACE-V’s Premises and Processes

The premise and process of the ACE-V methodology rely on the discovery of relevant and unique information that can be used in the comparison and prediction aspects regarding particular information sets.  The process as a whole involves the sequential accumulation and correlation of relevant information that provides for a hypothesis of Individualization, Exclusion, or Inconclusive results.  An Inconclusive result is one in which there is insufficient information to individualize or exclude the item, impression, or mark in question.

The analysis stage of the process is a fundamental inventory of available informational components to be used in a comparison.  This information is analyzed for its quantitative and qualitative aspects, as well as its specificity and relevance.

The comparison stage involves the prediction of specific information: details that have been identified and spatially located within one impression or object are predicted to exist within the comparison exemplar.  The comparison itself reveals whether this prediction is valid, invalid, or inconclusive, including the possibility that the particular information cannot be compared at all.  A lack of available comparison area may be due to excessive distortion of information or to the absence of the corresponding area in the second information set.

The evaluation stage of the ACE-V methodology is, in part, a combined assessment of the information generated from the first two stages of the ACE-V process.  This assessment relies on accurate training, experience, and comprehension of the principles of individualization, of comparison bias, and of the ACE-V methodology itself.  The information studied is questioned to determine whether it is sufficient for comparison, whether it is sufficient in its predictive comparison aspects when considering relevant qualitative and quantitative values, and whether it can be properly understood within the context of established science.  At the completion of the evaluation stage, a formal hypothesis is offered.

The final stage in the ACE-V methodology is verification.  Verification is a two-part process of peer-reviewed hypothesis and failure-criterion testing, in which a second qualified examiner examines the established documentation and/or hypothesis and the methodology utilized before reinitiating the ACE process to test the original hypothesis.  Verification is the culmination (phase transition) of the two examiners’ formal hypotheses into a single verified hypothesis by way of concurrence.  The original hypothesis is then stated to be validated by concurring expert opinion.  Individualization (identification) is the product of the entire ACE-V process properly applied, in that the two formal hypotheses are in agreement.

            Non-concurring results produced by the ACE-V process can be due to clerical errors, improperly applied ACE-V methodology, variances in expertise, or the discovery of hypothesis error.  When insufficient information is present or discovered, it may not be possible to formally individualize or exclude particular information sets with the information utilized.  The proper result is then a conclusion of Inconclusive.  This may include the possibility that potentially matching information sets have not yet been discovered by the investigative search process prior to ACE-V, or that there is insufficient information present to formally and reliably exclude the practical possibility of a match.  In such a case, sequential expert analysis is recommended.
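Purely as an illustrative sketch (my own simplification, not a standard or part of the methodology itself), the flow described above can be pictured in code, with verification modeled as a second examiner independently repeating ACE and concurrence yielding the verified conclusion; all names here are assumptions for illustration:

```python
from enum import Enum
from typing import Callable

class Conclusion(Enum):
    INDIVIDUALIZATION = "individualization"
    EXCLUSION = "exclusion"
    INCONCLUSIVE = "inconclusive"

# An "examiner" is modeled here, very loosely, as a function that carries out the full
# Analysis-Comparison-Evaluation sequence on a latent/exemplar pair and returns a hypothesis.
Examiner = Callable[[str, str], Conclusion]

def ace_v(latent: str, exemplar: str, examiner_1: Examiner, examiner_2: Examiner) -> Conclusion:
    """ACE by the first examiner, then Verification: a second examiner reinitiates ACE."""
    original = examiner_1(latent, exemplar)       # Analysis, Comparison, Evaluation -> formal hypothesis
    verification = examiner_2(latent, exemplar)   # Verification: independent re-examination
    if original == verification:
        return original                           # concurrence -> verified hypothesis
    return Conclusion.INCONCLUSIVE                # non-concurrence -> resolved per laboratory protocol
```

In practice, of course, non-concurrence is resolved through laboratory conflict-resolution procedures rather than an automatic inconclusive result; the sketch only shows the structural role of verification.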

Craig A. Coppock   12-11-2006
                                            Updated 12-12-2017
--------------------------------------------------------------------------------------

Current Note:  This “Universal” version of the well-established ACE-V methodology is an attempt to create a single comprehensive reference that can be used for all the forensic comparison sciences.  This universal version was built with input and feedback from many examiners from the USA and Canada over several years.  Such a standard reference will also allow focused development of relevant and useful guidelines and Information Theory research that will further the forensic comparison sciences.

Also see “Science of Elimination Utilizing ACE-V” in this blog, dated April 2006.

     Information Theory, in which ACE-V is being researched as a sub-component, allows for a relevant level of error correction in the analytical process.  With cognitive functions, the various system phases within Information Theory can be formatted to promote error minimization and correction.  This can take multiple forms such as research, protocol enhancement, specific and general analysis, testing, experience, training, and other quality control measures.  Periodic review of the process performance can reinforce effective error mitigation.  

     Also see the related post “Scientific Method; Information Theory’s Foundation to the Scientific Method” for comprehensive insights on the introduction of error in this process.  To properly understand ACE-V, we must also understand the Scientific Method and, ultimately, its foundation, Information Theory.

This article is posted to the blog "Fingerprint Individualization | ACE-V | Scientific Method" at http://fingerprintindividualization.blogspot.com.  Related information is also posted to ResearchGate.com and Academia.edu.

Monday, December 12, 2011

Complexity of Recognition

Complexity of Recognition:
Inductive and Inferential Processes in Forensic Science
 
                          
 
            Recognition is a fundamental inductive cognitive process that we use constantly to understand the world around us.  Induction is the inferential processes that expand knowledge in the face of uncertainty. [1]  Without these processes we would cease to function in even the most rudimentary of ways.  We would not be able to understand the value of information, nor could we understand that information in context.  We use our expertise in recognition to constantly assist us with both simple and complex cognitive tasks.  However, even simple recognition tasks, such as recognizing a face, are found to contain a very high degree of complexity.  Fortunately, much of the core of the recognition process is managed by the automated subconscious.  “The great Russian physiologist Nikolai Bernstein (1896-1966) proposed an early solution (to the problem of infinite degrees of complexity).  One of his chief insights was to define the problem of coordinated action, which is analogous to the cognitive process at issue here.  The problem is that of mastering the many redundant degrees of freedom in a movement; that is, of reducing the number of independent variables to be controlled.  For Bernstein, the large number of potential degrees of freedom precluded the possibility that each is controlled individually at every point in time.” [2]  The same can be said of how the brain processes information.  How can recognition of a face or fingerprint impression be effected within the context of excess and variable information?  Recognition is based on the evaluation of information and informational relationships (linkages) via default hierarchies and relevancy.  This process can inflate with a cascade of information to a point where we have sufficient information to make a recognition.  That is, we understand something in context.  Often, we understand its meaning as well.  This inflation or cascade is the result of increasing returns on the information gathered from smaller, less relevant, and often abbreviated individual recognition processes.  Essentially, small bits of information accumulate and stimulate the process.

The very key to this recognition concept, as applied to forensic science, was illustrated by the German philosopher and mathematician Gottfried Wilhelm Leibniz (1646-1716).  His insight was that all natural objects could be differentiated if examined in sufficient detail.  This “detail” is information about the subject.  Accordingly, sufficient information would be needed to ensure effectiveness in an analysis.  The same cognitive processes are used whether you are examining objects for their differentiation or to recognize that they are one and the same (identical).  The Belgian statistician Adolphe Quetelet (1796-1874) also recognized the value of randomness with his observation, “Nature exhibits an infinite variety of forms.”  The difference between objects is information about the subject’s uniqueness.  This is called the Principle of Individualization.  It is interesting and relevant that the same cognitive inductive inference processes used to recognize natural objects are also used in problem solving, wherein information is compared within relative contexts.  Understanding all the applications of the cognitive inductive processes will help scientists better understand specific applications within forensic science.  It will also help scientists understand the limits of such processes.
 
            Unfortunately, due to the infinite complexities of the induction process, there is no current theory that explains the process in its entirety.  However, there are numerous models that describe many of the minutiae involved.  A theory of induction would have two main goals.  The first is the possibility of prediction.  Prediction is the statement of expected results from information not yet analyzed.  For fingerprint identification, the various levels of detail would be predictable in their comparative spatial relationships between the exemplars and an unknown-source impression.  Secondly, direct comparisons with observations can be made. [3]  The information must be comparable in some manner in order to have meaning.  The fact that predictions can be made illustrates that the comparisons have merit.  “While formal theory is a powerful tool, it typically encounters formidable barriers in complex domains such as the study of induction.” [4]
 
            It is argued that “confirmation of a hypothesis is isotropic… meaning that the facts relevant to confirmation of a hypothesis may be drawn from anywhere in the field’s previously confirmed truth and that the degree of confirmation of the hypothesis is sensitive to properties of the whole system.”  This is a principal point when one attempts statistical analysis of the recognition process or any other such inference-based issue.  “Such holism does indeed pose problems for the philosophy of induction and for the cognitive sciences.”  “It is impossible to put prior restraints on what might turn out to be useful in solving a problem or in making a scientific discovery.” [5]  In relation to the inductive forensic sciences, we now understand the limitation of standardized thresholds; however, the key words are “confirmed truth,” or more realistically, practical truth.
 
We find that with inferential processes the value of the correlated information grows until a point is reached at which one can conclude that sufficient information is available to support a hypothesis.  Any artificial threshold applied to the individualization process has the potential to limit the usefulness and extent of the information available.  Likewise, we must be keenly aware that non-relevant information can be detrimental to the recognition process.  This is the kind of information that can lead to confirmation bias.  Confirmation bias rests on information that cannot be used to support recognition, as it offers nothing directly related to the process of recognition itself.  Essentially, information that cannot support prediction with direct comparison cannot be used in hypotheses of individualization.  Thus, confirmation bias amounts to false assumptions that specific information is relevant.  While circumstantial evidence may be relevant for other investigative issues, it would not be relevant regarding the point of individualization.  Regarding artificial thresholds, limits set on a holistic problem may indeed encourage a deficient analysis of the process when one mistakenly assumes sufficient information is present due to the set threshold itself.
 
            The testing of a hypothesis of identification is the process of verification.  Here, another qualified examiner re-examines the available information according to established methodology.  The repetition of the verification process is called sequential analysis.  The methodology itself is scientifically designed to allow for repeatable application of the process with the highest accuracy possible.  In general, “scientific laws are general rules.  Scientific ideas are concepts that organize laws and other rules into useful bundles.  Scientific theories are complexes of rules that function together computationally by virtue of the concepts that connect them.  Moreover, these concepts are often of a special sort, referring to entities and processes not directly observable by scientific instruments of the day.” [6]  Many of the processes involved in cognitive induction, such as recognition, involve mental functions based on relevancy and knowledge.  These aspects are truly difficult to quantify.  Accordingly, comparative forensic science that utilizes pattern recognition deals in practical proofs, where the term “practical” refers to the mathematical and logistical limitations of a science-based application.  “…There will never be a formal method of determining for every mathematical proposition (in our case prediction and comparison) whether it is absolutely true, any more than there is a way to determine whether a theory in physics is absolutely true.” [6a]  Furthermore, the mathematician Kurt Gödel “showed that provability is a weaker notion than truth, no matter what axiomatic system is involved.” [6b]
 
Fortunately, we do not need to know all the available information in order to understand the general probability that our conclusion of recognition is correct.  We will reach a point in our analysis of the matter when we can say we have recognized a particular fact.  What we are often blind to is that we rely on related experience to understand, or make sense of, the recognition process.  It is also fortunate that when we do not have enough information to complete a specific recognition, we understand that we lack the needed information to complete the task, or that we do not understand all the information presented.  Accordingly, both recognition and non-recognition are based on the analysis of available information.  For example, when we see a friend, we can recognize them while viewing only a few percent of the total available information.  We see only one side of a person at a time; we do not always need to see their opposite side as well. [7]  Likewise, we can often make a recognition while viewing only a person’s face or part of their face.  Accordingly, if we do not get a good look at the person, we are confronted with the possibility of not being able to recognize them due to insufficient information.  Specifically, regarding the recognition process, it is not necessary to completely understand all the information before proceeding with an analysis.  “Early selection is the idea that a stimulus need not be completely perceptually analyzed and encoded as semantic (meaningfully linked information) or categorical information before it can be either selected for further processing or rejected as irrelevant.” [8]  However, it is thought that the rejected information is not always completely ignored, but rather reduced to a lesser role in the cognitive analysis.
 
A phenomenon called hysteresis means “that when a system parameter changes direction, the behavior may stay where it was, delaying its return to some previous state.  This is another way of saying that several behavioral states may actually coexist for the same parameter value; which state you see depends on the direction of parameter change.  Where you come from affects where you stay or where you go, just as history would have it.” [9]  With recognition, the information can be continuously variable.  Distortion and variability of information are not exceptions to the rule; they are, in fact, the norm.  Pictorial scenes may be familiar throughout different lighting conditions and seasons.  Yet much of the information found in a pictorial scene is not distorted to the point that it can’t be recognized for what it is.  In most cases we would not expect to see the same exact scene again, even though we may expect to recognize the same person, place, or item.  Our next glance will most likely come from a new perspective, at a different hour, and possibly in a different season.  Thus, while color, contrast, and perspective may be variable, there remains the possibility of recognition provided sufficient information remains and that the information falls within acceptable probability-modeling expectations or experience.  This is sufficiency of information.  Accordingly, if we consider an enhancement process such as filtering, we can begin to understand that the process of enhancement only needs to be specific relative to the information that is to be enhanced.  Non-relevant information can be safely discarded if it does not interfere with the value of the information being analyzed.  It is also important to note that specific enhancements need not be exact or standardized.  Again, sufficiency is the key.  There is no scientific basis for exact duplication of enhancement processes.  “Human subjects… can easily adopt a variety of strategies to solve a task.” [10]
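To make the hysteresis idea concrete, here is a small illustrative sketch (mine, not from the cited source): a two-threshold switch whose output depends on the direction from which the input parameter approaches, so the same parameter value can correspond to either state.

```python
def hysteresis_switch(values, on_threshold=0.7, off_threshold=0.3):
    """Two-threshold switch: the state at a given value depends on the path taken to reach it."""
    state = False
    history = []
    for v in values:
        if not state and v >= on_threshold:
            state = True            # switch on only after crossing the upper threshold
        elif state and v <= off_threshold:
            state = False           # switch off only after falling below the lower threshold
        history.append((v, state))
    return history

# Rising through 0.5 leaves the switch off; falling back through 0.5 leaves it on.
print(hysteresis_switch([0.1, 0.5, 0.8, 0.5, 0.2]))
# [(0.1, False), (0.5, False), (0.8, True), (0.5, True), (0.2, False)]
```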
 
            While many forms of cognition are based on analogy [11], recognition furthers this basic thought by adding comparative detail analysis with both the conscious and subconscious mind.  Recognition starts and ends in the brain.  The conscious and subconscious minds work together, utilizing as much data as they can effectively process.  This information is drawn from experience as well as from current evaluations of the environment.  The actual moment of recognition can be described as the moment of positive recognition, or MPR.  MPR is defined as an affirmative decision of identification based on the accumulation of contextually compared and experience-based information that falls within predictable and logical statistical parameters.  The information above and beyond this point of sufficiency is simply additional supporting data that may not be needed for further practical use.  This additional information is not necessarily needed to further support the recognition, yet we often make ourselves aware of it.  It stands to reason that the more information available for the recognition process, the lower the consequence of uncertainty.  Yet it is important to understand that information must be paired with other preexisting information to have meaning.  In the case of friction skin identification, this additional information is often analyzed to some degree to ensure the accuracy of a hypothesis of individualization or exclusion, wherein additional predictions and comparisons are made of the remaining information.  This additional information is also available for sequential analysis.  See figure 1.  In cases of recognition, the same exact information is never used twice to effect individualization.  Holistically derived inferential information cannot be run through a standardized formula.  I would consider this a principle of non-specificity in the process, which is akin to Claude Shannon’s noise in communication theory.  The use of specific information is not necessary; rather, the information must be relevant and sufficient for the purpose.  The reason that the same exact information cannot be duplicated is the infinite variability in where the potential information can come from, and in what portions of that information are used and how.  It is the result that counts.  The information needed to sustain recognition (or individualization) may be found anywhere in the supporting information data set.  This principle of non-specificity explains how each analysis is unique and thus why one person’s comparison results cannot be exactly duplicated.
 

While we may assume we are talking about the same data, the data is found to never be [exactly] the same.  With fingerprint science, even a single Galton characteristic will be understood within its relative context slightly differently among examiners.  Interestingly, two different fingerprint examiners could individualize the same friction skin print and not even compare any of the same data.  Each examiner could be shown a different section.  This would be analogous to two separate persons recognizing the same individual from opposite directions.  Each person would see different data yet reach the same conclusion.  Uniqueness can be thought of as a collection of unique [yet non-specific] relevant data sets.  The examination of any set, or blend of sets, may be sufficient for individualization based on that person’s knowledge.  This could be considered a sufficient set.  Verification would be considered another sufficient set.

 

Accordingly, we must realize that recognition, or forensic comparison, is not dependent on repetitious analysis of specific data, but rather on sufficient relevant data.  Perhaps when cognitively dealing with the concept of uniqueness, and ultimately understanding its place in the infinite world, we will realize that we also draw from the same breadth.  The relevant information that supports the understanding of recognition, and ultimately formalized individualization, is an information warehouse to be mentally rummaged through whenever needed.
 
[Figure 1: sufficiency of information for recognition]
Sufficiency of information for recognition is highlighted in figure 1.  A point is reached in the processing of information at which a competent and experienced person understands that the value of the information is within reasonable statistical expectations to establish a practical, although ultimately non-provable, hypothesis.  Thus, the moment of positive recognition is realized and a phase transition is crossed.  Most of our cognitive processes rely on this informal, rapid assessment of information sets.  This, in turn, allows us to build our knowledge base [experience] and iterate that updated knowledge into a newly formed question that keeps our cognitive awareness and problem-solving process moving forward. [11a]  As the graph illustrates, all available information need not be analyzed to accomplish the task.  It is not necessary to know all there is to know; it simply must be sufficient within established expectations.  Keep in mind that, apart from non-tangible informational rules such as mathematical formalism (mathematical reality vs. reality), all forms and applications of science rely on some sort of interpretation of information.  With interpretation comes uncertainty.  Accordingly, the goal is accuracy within practical applications of the process, and practical means within limits of statistical acceptability generally known as common sense.  It is interesting to note that even within mathematics, many mathematical truths cannot be proven true or are proven to be ultimately incomplete within themselves.  Furthermore, we must ask ourselves whether mathematics, used as a descriptive tool, can operate without the use of language as a guide, or whether statistical evidence can have any meaning within the notion of infinity.  The point is: how can we be sure of anything if all else has not been ruled out?  Uncertainty is a universal rule that dictates a mandate of practicality.  So, in a sense, not only does this lower the ultimate claim of mathematics to scientific perfection in particular, but it also reminds us of our mistaken supposition that everything scientific must be perfectly described and proven.  Hence, forensic science’s pattern recognition disciplines fall easily into this improved version of scientific reality.  In essence, we should not underestimate the value of common sense and our learned experiences, nor should we forget that objectiveness is often a formal process organized subjectively.
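As a rough, purely illustrative model of the sufficiency curve described above (my own sketch with made-up weights, not the author’s figure), information can be pictured as accumulating until it crosses a sufficiency threshold, at which point the moment of positive recognition is declared:

```python
def moment_of_positive_recognition(detail_weights, sufficiency_threshold=10.0):
    """Accumulate the weight of correlated details until sufficiency is reached (illustrative only)."""
    total = 0.0
    for count, weight in enumerate(detail_weights, start=1):
        total += weight
        if total >= sufficiency_threshold:
            return count, total   # MPR reached; remaining details are surplus supporting data
    return None, total            # insufficient information: no recognition (inconclusive)

# Hypothetical weights reflecting the quality and relevance of each correlated detail.
print(moment_of_positive_recognition([2.5, 1.0, 3.0, 2.0, 2.5, 1.5, 0.5]))  # -> (5, 11.0)
```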
 
            The brain is an amazing organ.  It has the capacity to process immense amounts of information simultaneously.  The brain can also put that information into a usable and understandable context for later reference.  The process of recognition is formed wholly inside the brain, utilizing information that is further deduced from related information that was analyzed regarding a particular issue, as well as from relevant information based on experience.  The brain has specialized areas noted for particular functions.  “The use of triggering conditions obviates random search through the inexhaustible space of possible hypotheses.”  “Triggering conditions activate the induction inference process allowing problem solving on that information which is different or non-expected.  Thus, a cognitive process can direct its inductions according to its current problem situation, generating rules that are likely to be useful to it at the moment and hence possibly useful in the future as well.” [12]
 
“Attention is a cognitive brain mechanism that enables one to process relevant inputs, thoughts, or actions while ignoring irrelevant or distracting ones…”  Fingerprint identification and other forms of formal recognition use voluntary (endogenous) attention, wherein an examiner’s attention is purposefully directed.  This contrasts with reflexive (exogenous) attention, in which a sensory event captures our attention with new information. [13]  Interestingly and importantly, the neuroscientific study of attention has three main goals, one of which is relevant here.  This goal is to understand how attention enables and influences the detection, perception, and encoding of stimulus events, as well as the generation of actions based on the stimuli. [14]  The concepts relevant to many forensic analysis processes also include the aspect of spatial attention.  Spatial attention is endogenous attention directed to some location(s) while ignoring or devaluing others. [15]  With the task of recognition, certain parts of the brain become very active.  This increased and localized activity can be studied and monitored.  One researcher excitedly stated: “Your whole hippocampus is screaming!”  [Much activity was noted in] a structure adjacent to the hippocampus known as the fusiform gyrus; this, too, was not a surprise, ... Recent research on face recognition has identified this as the key area in the brain for the specialized task of perceiving faces.  What was surprising was that the most excited sector in the brain as it viewed familiar faces was, once again, the “story telling area.” [16]  On the other hand, researcher Jim Haxby has found that in addition to localized activity, object recognition may rely on multiple areas of the brain. [16a]  But how does recognition work?  Did Einstein or Newton have an enlarged or well-exercised fusiform gyrus?
 
            We all remember the story of Newton and the falling apple.  Albert Einstein imagined what it would be like if he were riding on a light wave and recognized that the speed of light is the same for every observer: if you shine a light out of a moving train, the speeds do not simply add; c + 60 mph = c.  Prior to that, in 1858, Alfred R. Wallace, while sweltering in a fever on an island in the Moluccas, recalled that “there suddenly flashed upon me the idea of the survival of the fittest...then, considering the variation continually occurring in every fresh generation of animals or plants, and the changes of climate, of food, of enemies always in progress, the whole method of specific modification became clear to me....” [17]  This, in turn, fueled the fire of the evolutionary theory Charles Darwin had been developing since 1842.  In about 250 B.C., Archimedes was pondering a problem posed by the Greek king Hieron II about measuring the content of gold in a crown.  He realized that copper has a density of about 8.92 g/cm³ and gold about double that, in Archimedes’ equivalents.  Archimedes thought there must be a solution to the problem, even though the mathematics of the time did not allow for such complex calculations.  Regardless, Archimedes took his thoughts to a public bath for some relaxation.  The bath ran over its edges as Archimedes displaced the water.  “And as he ran (naked), Archimedes shouted over and over, ‘I’ve got it! I’ve got it!’  Of course, knowing no English, he was compelled to shout it in Greek, so it came out, ‘Eureka! Eureka!’” [18]  The MPR had been reached.  The physicist “Roger Penrose looks to ties between the known laws of quantum mechanics and special relativity for clues to human consciousness.  According to Penrose, it will be possible to elucidate consciousness once this linkage has been established and formulated into a new theory…  For Penrose, consciousness, together with other attributes such as inspiration, insight, and originality, is non-algorithmic and hence non-computable.”  When one “experiences an ‘aha’ (recognition) insight, this cannot, he thinks, be due to some complicated computation, but rather to direct contact with Plato’s world of mathematical concepts.” [19]  What this means is that we cannot program a computer with “consciousness.”  Thus, true recognition processes can only be mimicked by computers.  Once the quality of the information used in the process is degraded in some fashion, so that the information is no longer ideal, the limits of the computer’s usefulness become apparent.  The human element is essential for the detailed interpretation of the computerized results of a recognition process.  Whether or not human evaluation or oversight is implemented is beside the point.
 
Artificial intelligence is an effort to duplicate experience-based learning via a neural network computer simulation.  However, even the best simulations are rudimentary in comparison to the human equivalent.  Computers (and programmers) cannot grasp the nuances and subtleties we derive from the river of information we use in everyday decision-making.  We are not static; we are non-equilibrium learning machines.  Furthermore, computers cannot be programmed to draw relevant information from all “available” sources, nor can they constantly learn in real time.  Computers are good at algorithms we prebuild for a specific application and desired output.  We, on the other hand, are good at compiling, organizing, and evaluating incomplete information based on relevant experience that we constantly improve.  Rational consciousness working on dual levels, including the subconscious, continually evaluates relevant information for a particular solution.  That solution has the potential to be solved or to go unsolved.  Considering variable goals and real-time processing combined with experience-based contextual information, rational consciousness draws from the past, the present, and the future via prediction, within the infinite possibilities of the parallel recognition / problem-solving process.  Computers have the simple task of everything less!  Can computers eventually be intelligent, or will they simply remain very clever tools?  Is the fact that human cognition is “analog gray” as opposed to “binary digital” the very reason we can filter large quantities of spontaneous information, reason about undefined problems, develop useful insights, and build fast, efficient computers to mimic our work?  Artificial intelligence is an answer that doesn’t understand the problem, yet its extensive ability to do work will prove its worth and revolutionize our lives on both large and small scales.
                  
            Some characteristics of recognition can be studied by focusing on aspects of the cognitive process.  “Visual perception plays a crucial role in mental functioning.  How we see helps determine what we think and what we do.  ...Denis G. Pelli, a professor of psychology and neural science at New York University, has had the happy idea of enlisting the visual arts.”  Study in the area of the visual arts has “disproved the popular assumption that shape perception is size-independent.” [20]  Of course, this too is relative; when viewed at extremes, shape is size-dependent from a human perspective.  Aristotle noted that shape perception could be independent of size only for sizes that are neither so huge as to exceed our visual field, nor so tiny as to exceed our visual acuity. [20]  Size and shape are forms of information.  Thus, we can assume that all information, including that used for recognition, is also relative.  Other extremes are also noticed, in that too much non-correlated information cannot be processed effectively, and too little information will not yield sufficient relationships for a useful comparative analysis.  In that case, recognition cannot be supported.  The less information that is available for the recognition process, the more time and cognitive effort must be applied to the evaluation of the information.  Eventually a point will be reached at which, relative to one’s ability, further analysis will not result in a positive recognition.  This is why “hindsight is 20/20”: after the fact, “more contextual information is usually present,” making the relationships of relevant information additionally distinct… and obvious.
 
            The actual point of recognition (MPR) is not definable in everyday terms.  Sufficiency cannot be clinically quantified as to what is taking place and when it is taking place, yet we know when we have enough information about a particular object or event… depending on our needs at that time.  MPR would be very difficult to probability model due to the infinite number of variables and overt complexities.  All we can hope for is a rough statistical model.  This is why there is no current theory for the induction process.  Unlike DNA, with its firm grasp on statistical modeling, most of the forensic sciences, ultimately based on the inferential recognition process, are too complex and variable to explain with precise numbers.  This means that it is impossible to address them precisely.
 
We gather and process information, streaming in time, via input from our seven senses plus our ability to process these inputs via abstract thinking.  Each ‘thing’ we interface with is a sum of those inputs in various ratios representing aspects of the information.  For familiar ‘things’ (collections of atoms, molecules, etc.) we can often infer and predict cross-sensory input and/or recall it from memory due to past experience.  Similar things can be inferred in their interaction with us, whereas unfamiliar things may require more sensory interface to be sufficiently understood.  We can often understand a ‘thing,’ and a ‘thing’ in context, with a single sensory input, or, say, a single input and the recognized lack of other inputs: a quiet cat versus a noisy dog.  We already know it is a cat from past sensory inputs recalled and recognized; we have built a very large comparative knowledge base.  Essentially, the negative sensory input, no sound, is also valuable information when properly understood in context amid ambient sensory noise.  More time spent with the cat does not help us better recognize this cat, but more information can help us better understand this particular cat, its behavior, its uniqueness beyond simple categorization.  Think of a story told by an acquaintance that you fully understand after just a few words, yet you must endure the lengthy, unnecessary, and highly detailed telling.  Our daily information interface with the universe wishes to avoid this gross inefficiency.  We don’t need all the available information… ever.  In fact, we have survived millions of years rapidly processing and utilizing just very small fractions of the available information, and we can gather more information when needed or desired.  This is sufficient.  The MPR is just one step of our informal grayscale categorization process, and it is variably dependent on our experience and our ability to leverage that experience.
 
An old analogy for our inability to find exactness in most of nature is Zeno’s arrow, one of Zeno’s paradoxes.  This paradox involves a bow, an arrow, a measuring system, and a target.  In imagining the arrow shot at a target, our measurement-theory question becomes: when exactly will the arrow hit the target?  Can we not ‘perfectly’ predict when the arrow will impact?  Even if we know the speed, drag, and distance, we must contend with the notion that, in our chosen measurement system, the arrow always has only half of the remaining distance to the target left to cover.  If it is always covering half of the remaining distance, when do you run out of “1/2” of the finite distance, so the paradox goes.  On the surface it seems the issue of time does indeed become a paradox.  Of course, we can get a suitable answer utilizing calculus, a different measurement system that leverages the fuzziness of infinity.  Yet ultimately we must ask ourselves which measurement system, including the limitations on our application of that system, we will use, and how accurate we need to be for our application.  Thus, even the simple task of measuring the time of impact for an arrow at its target can be frustratingly complex and imperfect unless we are allowed to make some real-world assumptions and approximations for the purpose of practical simplification.  “Beneath the sophistry of these contradictions [of the paradox] lie subtle and elusive concepts of limits and infinity.” [20a]  “The big problem with Zeno’s Paradoxes is that the infinite subdivision of a finite period of time… does not correspond to a sequence of physically distinct operations.” [20b]  Thus, classical mechanics demands that we round our numbers (and probabilities), and this in turn allows us to make practical, real-world explanations.  Accordingly, measurement theory employs needed axioms and considers inherent error.  This brings us back to the question of what information is necessary or sufficient to allow for a “practical” or usable conclusion.  Do we really need absoluteness, exactness, and perfect predictions to understand the reality or science of a situation?  Fractals are a prime example: we often prefer to round off the edges of that which we wish to measure; even when we know there is detail there, it simply may not matter for our goal.
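For concreteness, here is the standard resolution worked out as an equation (my addition, not part of the original post): if the arrow covers distance d at constant speed v, the times needed for the successive halves form a geometric series that sums to a finite total, which is how calculus dissolves the paradox.

\[
t_{\text{total}} = \frac{d}{2v} + \frac{d}{4v} + \frac{d}{8v} + \cdots = \frac{d}{v}\sum_{n=1}^{\infty} \frac{1}{2^{n}} = \frac{d}{v}
\]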
 “Predictions are nice, if you can make them.  But the essence of science lies in explanation, laying bare the fundamental mechanisms of nature.” [21]  Most of these explanations arrive in the form of scientific models, according to the scientist John von Neumann. [22]  Apparently, we are forced to work with less, and our estimates, rounding, and rough predictions are required for us to make daily progress in understanding the world around us.  We must remember that holistic and overly complex processes cannot be reduced to a mathematical formula for critical proofing.  This is the reality of the recognition process; this is the reality of inference logic.
 
            We now understand that our ability to measure things accurately is simply relative, no matter what type of measuring device we use, whether it be a meter, a second, or statistics.  This apparent failure is due to the loose tolerances required to make everyday issues solvable in a practical sense.  Accordingly, we must accept a certain amount of tolerance in the answers to our questions.  Recognition also follows this path, as we are not given the opportunity to exactly define the MPR.  This is due to the variables in the information and in how that information is analyzed.  There is no specific order in which a person must analyze information during the recognition process.  Some paths of information correlation may offer the MPR sooner than if another, alternative path is taken.  This includes the analysis of the distortion of information as well.  Distortion is always present to some degree, whether in recognizing your cat or in the analysis of a latent fingerprint.  Omnipresent distortion of naturally repeated information is a law of nature, yet we seem to deal with this aspect just fine.  This is another aspect of the individualization principle.  The main reason for this is that we always receive information differently and that natural patterns cannot be perfectly duplicated or replicated; otherwise, the very premise of the individualization principle would be violated.  We are used to understanding information presented in an infinite number of ways.  We understand that each time we evaluate “something,” at least some of the previously available information will be different.  Hence, we are experts at recognizing distortion and variability in the recognition process.  We can often disregard information that falls into extreme categories simply because the influence it may have is often insignificant, or because other sufficient information is present.  Of course, it follows that unintelligible information is not used in the recognition process.  Accordingly, if insufficient information is present, then recognition is not possible.
 
            In many cases, as with distortion, it is only when we wish to view the component items of a problem that we see them clearly.  It seems that, for the most part, our cognitive world is based on generalizations, analogies, and our ability for recognition.  The boundary of classical reality is a boundary of informational usefulness and practicality.  The rounding of numbers is part of that practicality.  We need to understand the variables in their correct context to understand what recognition is, let alone to attempt to measure it.  Within the forensic sciences, we must understand the limits of information to understand its usefulness.  Distortion can also obscure information, thus preventing recognition.  Distortions of friction ridge information can be found in a wide variety of forms.  What is realized is that information is found embedded within other information, and distortions of this information may only partially obscure specific details.  However, we are all experts at the recognition process.  We also have considerable experience dealing with various levels of distortion.  Experts who practice specialized forms of recognition, such as friction skin identification, shoe print identification, etc., can also be effective and accurate in that specific recognition process if they are sufficiently skilled in the applicable areas.  Essentially, they must be as comparatively skilled as a parent recognizing their children, for each aspect is a specialized and learned skill that requires considerable experience.  “Special aptitudes for guessing right” reflects the philosopher C. S. Peirce’s belief that constraints on induction are based on innate knowledge. [23]  Of course, knowledge is the product of learned experience that is understood in context.
 
            "Sometimes the variability of a class of objects is not as immediately salient….  If variability is underestimated, people will be inclined to generalize too strongly from the available evidence.  Similarly, if people do not recognize the role of chance factors in producing their observations, they are also likely to generalize too strongly." [24]  Distortion of impression evidence is an example of variability; the distortion must be understood in terms of the degree to which it destroys underlying information.  Experience again shows its importance.  "…People's assessment of variability, and hence their propensity to generalize, depend on their prior knowledge and experience." [25]  "Perhaps the most obvious source of differences in knowledge that would be expected to affect generalizations is the degree of expertise in the relevant domain.  Experts – people who are highly knowledgeable about events in a specific domain – could be expected to assess the statistical aspects of events differently from non-experts.  Experts should be more aware of the degree of variability of events in the domain and should be more cognizant of the role played by chance in the production of such events." [26]
 
            In the realm of all possibility, recognition is not always absolute.  However, within the context of the human population, recognition can, in many forms, be absolute in a practical statistical sense.  For example, when properly effected (recognized), friction skin identification is far from an extreme case and thus can be valid for practical applications.  The inherent uniqueness of friction skin supports the quantity of information needed for recognition.  Provided that sufficient information is available in an impression of that friction skin, individualization can occur, as recognition can be supported with prediction and comparison of the analyzed information.
 
            Fundamentally, there is no significant difference among degrees of recognition.  With a computer, however, preset thresholds and a lack of comparable experience prevent true recognition.   Specifically, this relates to the recognition or individualization of a person.  Computers cannot truly recognize individuals, whether using fingerprint or facial recognition programs.  They simply compare preexisting data to new data without the ability to understand information within an experience-based context, nor can a computer "recognize" distortion.  Statistical independence values must be tallied from quantifiable information, not from holistically gathered, experience-based information.  In fact, computers must use artificially high thresholds because they lack these capacities.  Without such a limit, computers would have little value to offer the recognition process due to a high number of false positives.  Computers are effective at sorting large quantities of simplified, known data against other data, but there is no concept of recognition here.  Another limitation of a computerized recognition system is that a computer cannot effectively verify its own product.   If the computer is in error, then running the same program with the same data would produce the same error.  This illustrates the need for human intervention, and with respect to recognition, it is also where experience-based analysis is beneficial.  The ACE-V methodology used by friction ridge and many other forensic examiners follows the general outline of the scientific method, complete with verification.  Verification is required under scientific parameters for testing procedures, and the verification step is also necessary for other specialized applications in forensic science.   We still need to verify recognition to make it valid in a scientific sense, as provisional recognitions apply the process or methodology only informally.  While subjectiveness plays a role, it does not necessarily influence the verification or proof of recognition; it mainly affects the investigative aspects.  Regardless, a certain amount of subjectiveness is inevitable in most scientific endeavors and is simply a byproduct of the analysis of complex issues found throughout nature.
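The role of preset thresholds can be illustrated with a minimal, hypothetical sketch; the scores, names, and threshold value below are invented, and the point is only that the computer's "decision" is arithmetic against a threshold rather than recognition.

    # Hypothetical sketch: a computer "match" is a score compared to a preset threshold.
    def computer_match(similarity_score: float, threshold: float = 0.95) -> bool:
        """Return True when the score clears the preset threshold; no context, no experience."""
        return similarity_score >= threshold

    candidate_scores = {"candidate_A": 0.97, "candidate_B": 0.91, "candidate_C": 0.63}
    for name, score in candidate_scores.items():
        print(name, computer_match(score))
    # A high threshold suppresses false positives at the cost of more false negatives;
    # the decision is arithmetic, not recognition.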
 
Regarding fingerprint identification, computers can, within very tight informational parameters, highlight probable match candidates during a database search.  It can even be said that, assuming the parameters are tight enough and the information is of sufficiently high quality, a computer could provisionally identify a person within certain acceptable and practical probabilities.  Friction ridge examiners use these computerized databases to sort through billions of bits of data, yet when the quality and quantity of the information fall below a certain threshold, the usefulness of the computer is reduced.   Again, this is due to the simplified comparative analysis model a computer uses.   Humans have the advantage of default hierarchies.  "Default hierarchies can represent both the uniformities and the variability that exist in the environment.  This representation serves to guide the kinds of inductive change that systems are allowed to make in the face of unexpected events."  An example of this can be illustrated with a simple form of recognition.  Imagine that you recognize your cat sitting on the windowsill of your home.  Without realizing it, the myriad of information about that cat is automatically evaluated by you in the normal process of recognition.  
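Before returning to the cat on the windowsill, the idea of a default hierarchy can be sketched in a hedged way: a few general default expectations that more specific, learned exceptions can override, so that both uniformity and variability are represented.  The attributes and values below are invented purely for illustration.

    # Hypothetical sketch of a default hierarchy: general defaults overridden by
    # more specific knowledge, representing both uniformity and variability.
    default_cat = {"legs": 4, "fur": True, "location": "windowsill"}   # general expectations
    my_cat_exceptions = {"fur": False}                                  # a specific, learned exception

    def expectation(attribute: str):
        """More specific knowledge wins; otherwise fall back to the general default."""
        return my_cat_exceptions.get(attribute, default_cat.get(attribute))

    print(expectation("legs"))   # 4     -> inherited default
    print(expectation("fur"))    # False -> the specific exception overrides the default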
 
How much information do you need to evaluate before you know that the cat is, indeed, yours?  We find that with the average "everyday" form of recognition, a considerable quantity of the information is processed automatically by our subconscious.  The rest of the available information is ignored; it is simply not needed.   Much of the information is contextual.  If we take away just one part of it, such as "location," we may run into considerable difficulty with our recognition process.  This is because much of the information we process is embedded in relative informational relationships.  Consider, for example, the same cat now located halfway around the world.  If someone shows you a recent photograph of a cat on the island of Fiji, you cannot always recognize the cat as yours 'at a glance,' unless you consider it probable that your cat could have arrived there on vacation without you!  We tend to assume a missing variable in our understanding: perhaps this cat is merely very similar in appearance, or perhaps some sort of hoax is at play, outside the reality of our expected probabilities.  We need more information.  There is a significant amount of embedded information, including associated probabilities, related to the simple aspect of "location."   This illustrates some of the complexity of the recognition process.  It does not mean that recognition cannot be made; it simply elucidates that specific information may need to be present to reach the MPR.   One must also consider that other, less relevant supporting recognitions will have to be made before a main issue of recognition is resolved.  If these less relevant, supporting recognition processes cannot be sustained with information, then it follows that the main issue is threatened with the possibility of non-recognition, at least until further research can be conducted.  "Inherent in the notion of a default hierarchy is a representation of the uncertainty that exists for any system that operates in a world having a realistic degree of complexity."  [27]
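The role the "location" context plays can be made concrete with a hedged, worked Bayes'-rule sketch; the probabilities below are invented purely for illustration.  An otherwise convincing visual match cannot carry the recognition when the contextual prior for "my cat is in Fiji" is minuscule.

    # Hypothetical Bayes'-rule sketch: a tiny contextual prior outweighs a good visual match.
    def posterior(prior: float, p_match_if_mine: float, p_match_if_not: float) -> float:
        """P(my cat | looks like my cat) via Bayes' rule."""
        evidence = p_match_if_mine * prior + p_match_if_not * (1.0 - prior)
        return (p_match_if_mine * prior) / evidence

    # On my windowsill the prior is high; in a photo from Fiji it is minuscule (made-up numbers).
    print(posterior(prior=0.90, p_match_if_mine=0.95, p_match_if_not=0.01))  # ~0.999
    print(posterior(prior=1e-6, p_match_if_mine=0.95, p_match_if_not=0.01))  # ~0.0001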
 
            Interestingly, failing to recognize the cat or other item is not necessarily an error until such a time as the recognition has been proven positive, such as through some form of detailed analysis.  When that initial interpretation is shown to have been incorrect, a false negative error has taken place.  We must remember that different people will use different paths in their recognition processes, and a given path may not be complete or effective at reaching a correct MPR answer under a particular set of circumstances.  This contrasts with a false positive, in which a person recognizes a different cat as being their cat.  A false positive is due to an error in the interpretation of the information itself, not in its value.  However, this does not mean the information was incorrect or distorted beyond usefulness, nor does it invalidate the recognition process.  This again speaks to the fact that the investigative phase of the recognition process is subjective and relies on holistic, experience-based knowledge, as outlined earlier in the discussion of the non-specificity principle.  This subjectiveness arises because much of the recognition process involves a mental investigation of the informational facts.  This can be summed up with the statement:  investigation is an art.  It is when the comparative analysis deals with practical proofs relating to the evidence that the recognition process completes its evolution into a science.  This proof could take the form of forensic analysis or of other practical, probability-modeled forms of detail analysis.  The proof of recognition deals with the product itself rather than the process by which the recognition was made.  Yet even the process itself can be fine-tuned to allow for rigorous evaluation of information.  By structuring the analysis process in alignment with the scientific method, accuracy in the interpretation of the information at issue is improved.  Therefore, forensic examiners must be trained to competency, including experience in the accepted process or methodology to be used.
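The two error types just described can be made concrete with a small, hypothetical tally; the outcomes below are invented and serve only to show how false negatives and false positives are counted once the truth has later been established.

    # Hypothetical sketch of the two error types, using invented outcomes.
    # Each record pairs the conclusion reached with the later-proven ground truth.
    outcomes = [
        ("recognized", "same"),          # correct recognition
        ("not_recognized", "same"),      # false negative: it really was the same cat / source
        ("recognized", "different"),     # false positive: a different cat was called "mine"
        ("not_recognized", "different"), # correct non-recognition
    ]

    false_negatives = sum(1 for call, truth in outcomes if call == "not_recognized" and truth == "same")
    false_positives = sum(1 for call, truth in outcomes if call == "recognized" and truth == "different")
    print("false negatives:", false_negatives, "| false positives:", false_positives)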
 
            The recognition process experiences phase transitions as information is evaluated at different levels of specificity within a network of interrelationships.  These levels are informational relationships structured like Metcalfe's Law.  These relationships also include the many related yet minor supporting recognitions.  Phase transitions are a common feature of complex physical systems, as components of a system are influenced variably at different levels.  The physicist Philip Anderson explained this trend in a 1972 paper:  "At each level of complexity, entirely new properties appear.  And at each stage, entirely new laws, concepts, and generalizations are necessary, requiring inspiration and creativity to just as great a degree as in the previous one.  Psychology is not applied biology, nor is biology applied chemistry." [28]  We see a similar effect with recognition as the process transforms from general, experience-based information correlation, to minor supporting recognitions, and then on to a main-issue MPR.  The verification process of forensic science is, in essence, a phase transition.  For verification, two individual hypotheses are compared.  The actual comparison of the existing information creates new data, which may or may not support an individual hypothesis.  Each phase of the process must be treated differently, as each phase is indeed different and replete with unique information.  In essence, these phase transitions are the complexity of the induction process. [29]  "The theory of self-organized pattern formation in nonequilibrium systems also puts creativity and 'aha' (recognition) experiences in a new light.  The breakup of old ideas and the sudden creation of something new is hypothesized to take the form of a phase transition."  "The brain here is… fundamentally a pattern-forming self-organized system governed by nonlinear dynamical laws." [30]  This is quite unlike a computer.
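The Metcalfe-style structure of these informational relationships can be made concrete: with n items of information, the number of distinct pairwise relationships is n(n-1)/2, so the web of relationships grows far faster than the information itself.  A minimal, illustrative sketch (the item counts are arbitrary):

    # Illustrative only: Metcalfe-style growth of pairwise informational relationships.
    def pairwise_relationships(n: int) -> int:
        """Number of distinct pairs among n items of information: n(n-1)/2."""
        return n * (n - 1) // 2

    for n in (5, 10, 50):
        print(n, "items ->", pairwise_relationships(n), "pairwise relationships")
    # 5 -> 10, 10 -> 45, 50 -> 1225: relationships grow roughly with the square of the information.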
 
            Eventually, the main question relating to a scientific inquiry is asked of the recognition process:  Can recognition be proven?  Many forms of recognition can be proven positive through verification that allows prediction and comparison of details.  This can often be achieved even though embedded information is often difficult to separate and quantify.  These interrelated packets of information are processed by the brain in a comparative and contextual fashion; the hippocampus and the fusiform gyrus are doing their work.  Proof that this work has proceeded accurately comes from the study of uniqueness as well as the other supporting data as they relate to the recognition process.  This, of course, depends on the details of the recognition itself, the analysis of the information.  If the recognition is of a solution to a problem, then the data must be supported accordingly, and predictions and comparisons would be possible.  If the recognition is of an individual item, person, or place, then the data relevant to that issue must also be supported.  If the recognition process has sufficient detail available for study, the product of the recognition can possibly be proven.  Again, the key is sufficiency.  "Sufficiency" is a relative and variable quantity that establishes facts within specific probability models.  
 
            Uniqueness and information availability lay the foundation for the possibility of recognition.  Recognition relies on the fact that nature is designed and built in a random fashion and that differences in objects and people allow for a separation of that information into meaningful and identifiably distinct groups.  This concept is illustrated in the previously mentioned work by Leibniz and Quetelet.  These groups of information are further compared chronologically and spatially against expected statistical results that are, in turn, based on experience.  With friction ridge identification, the scope of recognition is somewhat isolated from the high quantities of extraneous information the average person deals with daily.  The boundaries within which the information is evaluated are more easily understood within the forensic sciences.   Also, the packets or groups of information are more distinct and easier to quantify in practical terms, which assists with the verification aspect.
 
            Ultimately, the underlying dilemma of recognition is how such complexity can be quantified for statistical evaluation.  Of course, it cannot be precisely quantified; only the simplest of informational subsets can be quantified with any definitiveness.   When dealing with these complex issues, one must concentrate on the effectiveness of the methods as well as the rates of error when the method has been applied.  It would not be correct to cite statistical rates for the method's individual components.  Except for verification, the component parts do not function independently with regard to recognition; accordingly, they have no individual rates of error.  Likewise, recognition is a process with an effectively infinite number of component parts or parameters.  Regarding the process of recognition as it relates to friction ridge identification, these components have been grouped into related fields: analysis, comparison, and evaluation.   Regarding parameter dynamics, "the central idea is that understanding at any level of organization starts with the knowledge of basically three things:  the parameters acting on the system… (boundary conditions), the interaction elements themselves…, and the emerging patterns or modes (cooperativities) to which they give rise." [31]  
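The point about method-level rather than component-level error rates can be illustrated with a hedged sketch of a "black box" evaluation: the method is tested as a whole, and only an overall error rate, with an approximate interval, is reported.  The study numbers and the normal-approximation interval below are assumptions chosen only for illustration.

    # Hypothetical black-box evaluation: the method as a whole, not its components.
    import math

    def error_rate_with_interval(errors: int, trials: int, z: float = 1.96):
        """Observed error rate with an approximate 95% normal-approximation interval."""
        p = errors / trials
        half_width = z * math.sqrt(p * (1.0 - p) / trials)
        return p, max(0.0, p - half_width), p + half_width

    rate, low, high = error_rate_with_interval(errors=3, trials=1000)   # invented study numbers
    print(f"observed error rate: {rate:.3f}  (~95% interval: {low:.3f} to {high:.3f})")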
 
            Friction ridge identification, like the other forensic sciences that utilize the recognition process, does so in a formal rather than an informal manner.  This difference is what lends reliability, accuracy, and verification to the process.  A formal form of recognition requires that the information (evidence) be interpreted correctly and in the proper context.  This is achieved via sufficient training, proper methodology, and experience.  This, in part, is the separation between the art and the science of the recognition process.  Verification, or the attempt to prove recognition, promotes the process to a formal one.  With a formal process of recognition, we are fully cognizant of the process itself.  The process is controlled in a manner that follows the scientific method to allow for the minimization of subjectiveness.  Compare this to an informal form of recognition and we begin to understand how subjectiveness can be minimized and a recognition proven.
 
            All forms of recognition require experience, because it is the "experience factor" that most influences the contextual issues in the recognition process.   With the proper faculties in place, the underlying science of the recognition process can be liberated.  With facial recognition, this equates to the difference in accuracy between recognizing your mother and trying to recognize a person whom you have only seen in passing.  With one, you can be certain; with the other, well… you need more information.  The question, then, is how an inductive process works.  One hypothesis of inductive problem solving is called means-end analysis.  Even though general inductive problem solving cannot be contained within a simple means-end analysis, this outline offers some interesting insight.  A. Newell outlined this approach in his 1969 publication "Progress in Operations Research." [32]  His outline consists of four steps that, incidentally, closely model dactyloscopy. 
 
            Means-End Analysis
1.     Compare the current state to the goal state and identify differences.
2.     Select an operator relevant to reducing the difference.
3.     Apply the operator if possible.  If it cannot be applied, establish a sub-goal…
4.     Iterate the procedure until all differences have been eliminated or until some failure criterion is exceeded.
 
Of course, means-end analysis relies on experience-based decision-making.  The interesting point about induction-based analysis is that there is no single correct or set way to perform inductive inferences.  Likewise, each instance or comparison would be approached differently, and experience plays a major role in how each issue is approached.  Accordingly, all we can say is that there are efficient and inefficient approaches to problem solving.  However, even the most efficient method cannot guarantee that the problem can be solved or that the latent print can be identified.
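A minimal sketch of the four-step outline above, written as a simple loop; the "state," "goal," and "operators" below are toy stand-ins (not an actual friction ridge comparison algorithm), and the sub-goaling in step 3 is omitted for brevity.

    # A minimal, hypothetical sketch of the means-end analysis loop outlined above.
    def means_end_analysis(state: set, goal: set, operators: dict, max_steps: int = 20):
        for _ in range(max_steps):
            differences = goal - state                 # 1. compare current state to goal state
            if not differences:
                return state                           # all differences eliminated
            target = next(iter(differences))
            operator = operators.get(target)           # 2. select an operator relevant to one difference
            if operator is None:
                return None                            # 3. cannot apply; sub-goaling omitted here
            state = state | operator(state)            # 3. apply the operator
        return None                                    # 4. failure criterion exceeded

    # Toy usage: each "operator" contributes one observed correspondence toward the goal.
    ops = {"ridge_detail": lambda s: {"ridge_detail"}, "pattern": lambda s: {"pattern"}}
    print(means_end_analysis(set(), {"pattern", "ridge_detail"}, ops))

The loop itself is trivial; what the sketch cannot capture is the experience-based choice of which operator to try, which is exactly where induction departs from any fixed procedure.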
 
            Summary:
            Recognition is a complex, non-linear inferential process that allows us to understand the information we encounter.  Induction comprises "the inferential processes that expand knowledge in the face of uncertainty." [1]  This is recognition.  The basic process of recognition is similar in both its formal and informal varieties.  The main difference is that the formal process requires a standardized methodology and subsequent verification.  Verification of recognition requires a scientific analysis of the hypothesized product.  
 
Narrowing the recognition process to a facsimile of the scientific method allows for uniformity in the process, thus minimizing false positives and false negatives.  Different levels of the recognition process must be treated differently.  The informal process can simply and effectively rely on experienced evaluation and probabilistic context alone.  This contrasts with the formal process of recognition, which structures the process in order to reduce subjectiveness, improve accuracy, and provide verification and/or practical proof of the recognition.
 
            The recognition process allows for three main results.  These include:
                        A.  Recognition
                        B.  Non-recognition
                        C.  Experience
 
Recognition’s product is the identification and individualization of information’s relevancies and values in context.  (Interestingly, the holistic information cannot be described as within a set.)  Likewise, non-recognition is the analysis of the information without its ultimate identification. Lastly, while not generally a goal in the recognition process, experience or familiarity with the contextual information can be considered a useful byproduct.  This experience byproduct is a key aspect of the process itself.  The simple analysis of information during the recognition process creates experience-based knowledge.  This allows for increasing returns on that information.  Increasing returns is a compounding network of informational facts that is subsequently used in future applications of the process.  However, this positive feedback or increasing return allows for both recognition and non-recognition.  
The product of the recognition process is dependent on the total information available as well as how well that information is understood.  Experience is necessary to understand information and to place that information in context.   This "context" is also an understanding of its value and its probability.  The additional accumulation of experience-based memory, including information gathered from the minor, issue-related recognitions and non-recognitions, further promotes efficiency in the process.   Increasing returns on our understanding of information drive the recognition process.  These returns are the information understood contextually and meaningfully within multifarious constraints.  The moment of positive recognition is founded on this relationship.   
 
            "It is worth noting that the issue of constraints arises not only with respect to inductive inference but with respect to deductive inference as well.  Deduction is typically distinguished from induction by the fact that only for the former is the truth of an inference guaranteed by the truth of the premises on which it is based…" [33]  However, researchers should also take a reverse view on the issues of induction.  A more thorough understanding of the recognition process will allow a deductive approach to research in the field of friction skin identification.  A more thorough understanding will also shed needed light on the undesirable phenomenon of confirmation bias.  Traditional approaches have involved inductive processes that describe the relevant levels of the recognition process directly related to the verification aspect of recognition.  New research into the ACE-V methodology, used in the comparative analysis of friction skin and some other forensic applications, attempts to further the understanding of the recognition process, including the phase transitions of information.  Induction is the study of how knowledge is modified through its use. [34]  Experience is that modified knowledge.  "Perception and recognition do not appear to be unitary phenomena but are manifest in many guises.  This problem is one of the core issues in cognitive neuroscience." [35]  However, while it is unlikely that consciousness will be "well" replicated in a future learning computer program, we can expect it will be approximated.  If consciousness and its recognition process are not fully computable, and uncertainty is always to some degree pervasive, then ultimately we must focus on the simple task of maintaining sufficient accuracy in our results through applied error reduction.  
            
Craig A. Coppock
20080827   Updated 20240306
 
 
References:
1.         Holland, J.; Holyoak, K.; Nisbett, R.; Thagard, P. (1986) Induction: Processes of Inference, Learning, and Discovery. MIT Press, Cambridge. p. 1

2.         Kelso, J. A. Scott (1999) Dynamic Patterns: The Self-Organization of Brain and Behavior. Bradford Books, Cambridge. p. 38

3.         Holland, J.; Holyoak, K.; Nisbett, R.; Thagard, P. (1986) Induction: Processes of Inference, Learning, and Discovery. MIT Press, Cambridge. p. 347

4.         Holland, J.; Holyoak, K.; Nisbett, R.; Thagard, P. (1986) Induction: Processes of Inference, Learning, and Discovery. MIT Press, Cambridge. p. 349

5.         Holland, J.; Holyoak, K.; Nisbett, R.; Thagard, P. (1986) Induction: Processes of Inference, Learning, and Discovery. MIT Press, Cambridge. p. 349

6.         Holland, J.; Holyoak, K.; Nisbett, R.; Thagard, P. (1986) Induction: Processes of Inference, Learning, and Discovery. MIT Press, Cambridge. p. 322

6a.        Livio, M. (2003) The Golden Ratio. Broadway Books, New York. p. 240

6b.        Hofstadter, Douglas R. (1989) Gödel, Escher, Bach: An Eternal Golden Braid. Vintage Books, New York. p. 19

7.         Coppock, Craig (2003) Differential Randomness and Individualization.

8.         Gazzaniga, M.; Ivry, R.; Mangun, G. (2002) Cognitive Neuroscience: The Biology of the Mind. 2nd Ed. W.W. Norton & Co., New York. p. 250

9.         Kelso, J. A. Scott (1999) Dynamic Patterns: The Self-Organization of Brain and Behavior. Bradford Books, Cambridge. p. 21

10.        Gazzaniga, M.; Ivry, R.; Mangun, G. (2002) Cognitive Neuroscience: The Biology of the Mind. 2nd Ed. W.W. Norton & Co., New York. p. 201

11.        Holland, J.; Holyoak, K.; Nisbett, R.; Thagard, P. (1986) Induction: Processes of Inference, Learning, and Discovery. MIT Press, Cambridge. p. 10

11a.       Coppock, Craig (2016) Information Theory's Foundation to the Scientific Method. Academic.edu and ResearchGate.com

12.        Holland, J.; Holyoak, K.; Nisbett, R.; Thagard, P. (1986) Induction: Processes of Inference, Learning, and Discovery. MIT Press, Cambridge. p. 9

13.        Gazzaniga, M.; Ivry, R.; Mangun, G. (2002) Cognitive Neuroscience: The Biology of the Mind. 2nd Ed. W.W. Norton & Co., New York. p. 246

14.        Gazzaniga, M.; Ivry, R.; Mangun, G. (2002) Cognitive Neuroscience: The Biology of the Mind. 2nd Ed. W.W. Norton & Co., New York. p. 247

15.        Gazzaniga, M.; Ivry, R.; Mangun, G. (2002) Cognitive Neuroscience: The Biology of the Mind. 2nd Ed. W.W. Norton & Co., New York. p. 251

16.        Pelli, Denis G. (2000) Close Encounters: An Artist Shows That Size Affects Shape. The Best American Science Writing 2000, Ecco Press, New York.

16a.       Gazzaniga, M.; Ivry, R.; Mangun, G. (2002) Cognitive Neuroscience: The Biology of the Mind. 2nd Ed. W.W. Norton & Co., New York. p. 532

17.        Wallace, Alfred Russel (1898) The Wonderful Century: Its Successes and Its Failures.

18.        Asimov, Isaac (1974) The Left Hand of the Electron. Dell, New York. p. 190

19.        Kelso, J. A. Scott (1999) Dynamic Patterns: The Self-Organization of Brain and Behavior. Bradford Books, Cambridge. p. 25

20.        Pelli, Denis G. (2000) Close Encounters: An Artist Shows That Size Affects Shape. The Best American Science Writing 2000, Ecco Press, New York.

20a.       Encyclopedia Britannica 2003 Deluxe Edition CD-ROM (Macintosh Edition), "Paradoxes and Fallacies: Paradoxes of Zeno."

20b.       Barrow, John D. (2005) The Infinite Book. Pantheon, New York. p. 241

21.        Waldrop, M. Mitchell (1992) Complexity: The Emerging Science at the Edge of Order and Chaos. Penguin Books, New York. p. 39

22.        Gleick, James (1987) Chaos: Making a New Science. Penguin Books, New York.

23.        Holland, J.; Holyoak, K.; Nisbett, R.; Thagard, P. (1986) Induction: Processes of Inference, Learning, and Discovery. MIT Press, Cambridge. p. 4

24.        Holland, J.; Holyoak, K.; Nisbett, R.; Thagard, P. (1986) Induction: Processes of Inference, Learning, and Discovery. MIT Press, Cambridge. p. 243

25.        Holland, J.; Holyoak, K.; Nisbett, R.; Thagard, P. (1986) Induction: Processes of Inference, Learning, and Discovery. MIT Press, Cambridge. p. 249

26.        Holland, J.; Holyoak, K.; Nisbett, R.; Thagard, P. (1986) Induction: Processes of Inference, Learning, and Discovery. MIT Press, Cambridge. p. 250

27.        Holland, J.; Holyoak, K.; Nisbett, R.; Thagard, P. (1986) Induction: Processes of Inference, Learning, and Discovery. MIT Press, Cambridge. pp. 19-20

28.        Waldrop, M. Mitchell (1992) Complexity: The Emerging Science at the Edge of Order and Chaos. Penguin Books, New York. p. 82

29.        Waldrop, M. Mitchell (1992) Complexity: The Emerging Science at the Edge of Order and Chaos. Penguin Books, New York. p. 230

30.        Kelso, J. A. Scott (1999) Dynamic Patterns: The Self-Organization of Brain and Behavior. Bradford Books, Cambridge. p. 26

31.        Kelso, J. A. Scott (1999) Dynamic Patterns: The Self-Organization of Brain and Behavior. Bradford Books, Cambridge. p. 18

32.        Hofstadter, Douglas (2000) Analogy as the Core of Cognition. The Best American Science Writing 2000, Ecco Press, New York.

33.        Holland, J.; Holyoak, K.; Nisbett, R.; Thagard, P. (1986) Induction: Processes of Inference, Learning, and Discovery. MIT Press, Cambridge. p. 4

34.        Holland, J.; Holyoak, K.; Nisbett, R.; Thagard, P. (1986) Induction: Processes of Inference, Learning, and Discovery. MIT Press, Cambridge. p. 5

35.        Gazzaniga, M.; Ivry, R.; Mangun, G. (2002) Cognitive Neuroscience: The Biology of the Mind. 2nd Ed. W.W. Norton & Co., New York. p. 193