The Encryption Forensics Trail Assessment

If someone uses the PKI infrastructure to send secure emails, what type of forensic evidence will be left behind?

If someone uses the Public Key Infrastructure (PKI) to send secure emails, professionals can obtain only digital evidence that reveals the encrypted traffic and the public key used by the two parties involved in the communication (Tubewar, 2010). However, the private key that would allow investigators to find out what information was transferred will not be obtained, as it is usually never sent from one person to another and remains hidden (Itfreetraining, 2013a). Still, this key is critical for the professionals, as it is the only thing that allows them to read what was written.

If someone accepts or associates a certificate to view a website in a browser, what type of forensic evidence will you find?

If someone accepts or associates a certificate to view a website in a browser, forensic investigators get a chance to gather a lot of helpful information. It becomes possible to find out which authority issued the certificate and who owns it. As a result, the professional can determine when it was used and whether it was used by the same person to whom it was issued. An expiry date can be found, as well as the public key and the digital signature contained in the certificate (Itfreetraining, 2013b).

Can a subject of an investigation be compelled to provide a password for an encrypted file?

A subject of an investigation may not be compelled, but may be asked by a professional to provide a password for an encrypted file. People’s private information is protected by the Fifth Amendment; however, the investigator can ask one to provide not only the passwords but also the keys during the interview. As no key disclosure laws exist in the USA, this information cannot be demanded, so a subject has an opportunity to “forget” to provide it. The court has a right to repeat the request and sometimes even to order one to provide the keys, threatening incarceration in case of refusal to obey. Still, such a situation does not appeal to the court, and efforts are often directed at “cracking” the code after the hearing (Wolfe, 2003).

How can a subject of an investigation avoid providing a password?

As mentioned previously, the subject of the investigation is not obliged to provide a password if one is not willing to. During an interview, one rarely refuses to give the information directly; instead, the person can change the topic of the conversation when asked about the password and simply never return to the question, as if the subject had forgotten being asked to reveal this data. The suspect can also claim not to remember the keys at all, or keep silent and withhold the information (Wolfe, 2003).

What are six alternatives (identified in the Wolfe paper) for obtaining a password without a subject’s cooperation?

In his paper, Wolfe (2003) states that the information can be obtained without the subject’s cooperation. First, forensic investigators can crack the code if it is weak enough. Second, they can run a dictionary search for the potential key, as the suspect’s system is very likely to contain it. Third, it can be advantageous for professionals to gather all available information about the subject: rather often, people build passwords from things familiar to them so as not to forget them, so a search for something like a badge number can be a good alternative. Because legislation differs across jurisdictions, investigators can also target encryption software from a location where doing so is entirely legal. Finally, backdoor access built into products can be used to save the situation.
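The dictionary-search alternative can be sketched in a few lines. The example below is illustrative only: it assumes the password is stored as an unsalted SHA-256 hash, and the wordlist and badge-number password are hypothetical; real forensic tools target the specific formats of encrypted containers.

```python
import hashlib

def dictionary_search(target_hash, wordlist):
    """Try each candidate word; return the one whose SHA-256 digest matches."""
    for word in wordlist:
        if hashlib.sha256(word.encode()).hexdigest() == target_hash:
            return word
    return None

# Candidate words harvested from the suspect's system (hypothetical data).
candidates = ["letmein", "badge4521", "password1"]
target = hashlib.sha256(b"badge4521").hexdigest()

print(dictionary_search(target, candidates))  # -> badge4521
```

The same loop generalizes to any verifier function; only the hash comparison would change for a real container format.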

What technique was used by the child pornographer discussed in the Wolfe article to finally obtain his password?

In the case discussed in Wolfe’s (2003) article, professionals initially considered cracking the code, but this decision was later altered. Instead, they successfully applied the third alternative method and discovered the password while gathering information about the suspect: it was his badge number.

STARR, DIRT, and Magic Lantern offer a disturbingly intrusive surveillance capability. Explain what that is, and offer an opinion on whether or not you think somebody from another country might be using a similar capability against you?

Various software programs and hardware tools provide a disturbingly intrusive surveillance capability, which means that they “enable the capture of passwords and/or encryption keys” (Wolfe, 2003, p. 390). They can be used to track and monitor one’s actions, for example by logging keystrokes and gathering the data received. Personally, I believe that people from other countries are not likely to use a similar capability against me, as such tools often require warrants and authorizations. If discovered, such an “investigator” may face legal issues, as the laws of both countries can be applied.

What is the purpose of KeyKatch and KeyGhost, and under what circumstances would they be used?

KeyKatch and KeyGhost are hardware devices that can be used for surveillance. They log keystrokes and are often used to capture passwords. Investigators can substitute the suspect’s keyboard with one that has a KeyKatch or KeyGhost attached in order to obtain the keys as they are typed. These tools can even be used in organizations during security assessments (Simpson, Backman, & Corley, 2013).

How is a smart card (CAC Card) used in the PKI?

The Common Access Card (CAC) is used for security purposes. Its usage proves that the individual both has the card and knows the PIN, so the process of authentication is completed rather quickly. The card holds up to three PKI certificates. Generally, the one needed for identification is used; still, depending on the task, the signature and encryption certificates can be needed as well (Technology Assistance Center, 2007).

References

Itfreetraining. (2013a). . Web.

Itfreetraining. (2013b). . Web.

Tubewar. (2010). . Web.

Simpson, M., Backman, K., & Corley, J. (2013). Hands-on ethical hacking and network defense. Boston, MA: Cengage Learning.

Technology Assistance Center. (2007). CAC/PKI user’s guide. Web.

Wolfe, H. (2003). Encountering encryption. Computers & Security, 22(6), 388-391.

The Fields Data Recovery: Forensics Analysis

The description of the company

I would like to start a company that would specialize in such branches of computer forensics as data recovery and prevention of data loss. I have chosen this specific activity because these services may be required by both governmental and private organizations. On the whole, data loss is a problem faced by many potential customers, and many of them may ask for our assistance. Data recovery is essential for business continuity, and enterprises which have suffered from this problem spare no costs in order to restore valuable information. The company I intend to start will help clients restore information that has been damaged, erased, corrupted, or made inaccessible in any way. We will work with various data storage media such as hard-disk drives, digital versatile discs, compact discs, flash cards, and even floppy disks, and we will do both hardware and non-hardware recovery. At this point, I can refer to Fields Data Recovery, which is a good example of the firm that I would like to open or supervise.

The type of work it does

This company has worked in this field for eighteen years; they recover data from a large variety of storage media, for example, desktop and laptop hard drives, CDs, DVDs, backup tapes, MP3 players, NAS, SAN, and so forth (Fields Data Recovery, 2010, unpaged). The offices of this firm are located across the United States and in different regions of the world, in particular in France, the Middle East (Dubai), and the United Kingdom (Fields Data Recovery, 2010, unpaged). They address the needs of various customers: 1) governmental and federal agencies; 2) private businesses; 3) healthcare organizations; and 4) financial institutions. Furthermore, it should be mentioned that the employees of this data recovery company are very knowledgeable about storage media produced by various manufacturers such as IBM, Hewlett-Packard, Cisco, Seagate, and so forth (Fields Data Recovery, 2010, unpaged). In fact, these corporations are the key clients of Fields Data Recovery, and the management of this company has established long-term partnerships with them. Fields Data Recovery guarantees complete confidentiality of inside information such as IP addresses, telephone numbers, or emails. Thus, this is the type of company which I would like to run. The major success factors for this type of organization are speed, efficiency, and confidentiality.

Equipment required for the lab

At this point it is necessary to determine what kind of technologies will be required for these processes. In this context, the word “technologies” means both equipment and software. In the majority of cases, data recovery companies do not provide information about the technologies which they use, and Fields Data Recovery is not an exception to this rule. The main reason for such secrecy is that they do not want to lose their competitive advantage over other firms. Therefore, one can only deduce what kind of tools they apply.

Speaking of recovery tools, we need to mention EnCase and FTK, which are suitable for operating systems such as Windows (Casey, 2004, p. 264). In turn, Unix-based recovery relies on tools such as the Sleuth Kit and SMART (Casey, 2004, p. 264). They are equally suitable for both recovery and analysis of information; as a rule, such toolkits are called hardware-software complexes. The equipment also includes portable disk imaging devices, which allow creating a single file that contains the entire contents of a hard-disk drive or any other storage medium (Casey, 2004). If the hard-disk drive has been physically damaged, the technicians must use devices such as magnetometers and soldering irons in order to retrieve files that can no longer be restored by software solutions alone. This is the equipment that they should always have close at hand.
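The core idea behind disk imaging, reading every byte of a source into a single file while computing a verification hash, can be sketched as follows. This is a simplified illustration under my own assumptions; real imaging hardware also handles bad sectors and enforces write-blocking:

```python
import hashlib

def image_and_hash(source_path, image_path, chunk_size=1 << 20):
    """Copy a source device/file into a single image file, hashing it on
    the fly so the copy can later be verified against the original."""
    digest = hashlib.sha256()
    with open(source_path, "rb") as src, open(image_path, "wb") as dst:
        while True:
            chunk = src.read(chunk_size)
            if not chunk:
                break
            digest.update(chunk)
            dst.write(chunk)
    return digest.hexdigest()
```

Recording the digest at acquisition time is what lets an examiner later prove the image still matches the original evidence.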

However, one should bear in mind that data recovery companies should also procure various spare parts for storage media. Very often, they have to remove hard-disk platters or read/write heads, so these organizations must have a regular supply of these components to be able to do hardware recovery as quickly as possible. This is one of the reasons why they need to establish continuous relations with the leading manufacturers of storage media such as IBM, Seagate, or Transcend. Moreover, they need to know which of these products are more prone to physical or logical failure. This illustrates the idea that in order to estimate the technological needs of a data recovery company, one has to possess information about the mainstream IT manufacturers and their quality standards. Fields Data Recovery keeps track of the quality standards set in various companies; this information is partially disclosed on their official website (Fields Data Recovery, 2010).

Software solutions

As far as software solutions are concerned, we can list a large number of programs that may be of great use to companies like Fields Data Recovery. They are as follows: 1) undelete tools; 2) recycle bin replacements; 3) CD rollers; and 4) image recall tools (Cross & Shindler, 2008). The employees of such firms use bootable software, such as boot disks or live USBs, which is particularly beneficial when the operating system has failed (Cross & Shindler, 2008, p. 314). They also need to apply consistency checkers that are compatible with different operating systems like Windows, Unix, or Mac OS. Data recovery companies must also possess a large variety of file recovery and repair programs. Finally, their functioning is impossible without file carving software that enables reassembling damaged files from separate fragments (Casey, 2004, p. 306). Thus, it is possible to argue that the founders or supervisors of a data recovery company need to purchase a great number of software solutions if they want to address their customers’ problems as quickly as possible.
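At its simplest, file carving reduces to scanning raw bytes for known file signatures. The sketch below carves JPEG images out of a raw byte stream by locating start-of-image and end-of-image markers; it is a deliberate simplification of what dedicated carving tools do (no fragmentation handling, and the sample data is hypothetical):

```python
def carve_jpegs(data):
    """Return byte ranges between JPEG start (FF D8 FF) and end (FF D9)
    markers -- a simplified form of signature-based file carving."""
    SOI, EOI = b"\xff\xd8\xff", b"\xff\xd9"
    carved = []
    pos = 0
    while True:
        start = data.find(SOI, pos)
        if start == -1:
            break
        end = data.find(EOI, start + len(SOI))
        if end == -1:
            break
        carved.append(data[start:end + len(EOI)])
        pos = end + len(EOI)
    return carved

# Hypothetical raw disk image: a JPEG fragment surrounded by unrelated bytes.
raw = b"\x00junk\x00" + b"\xff\xd8\xff\xe0payload\xff\xd9" + b"\x00tail"
print(len(carve_jpegs(raw)))  # -> 1
```

The same pattern, with different magic numbers, applies to most signature-carvable formats.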

One should bear in mind that the development of web-based technologies has enabled data recovery companies to operate online. In such a scenario, the technician does not actually need to gain physical access to the hard-disk drive (Cross & Shindler, 2008). While discussing software solutions, I should say that Fields Data Recovery only restores damaged information, and it seems to me that such a company may also provide services for the prevention of data loss. For example, they can install DLP networks, host-based DLP systems, battery backups, and journaling file systems. This type of software may also be required by such firms.

This discussion shows that a data recovery company can provide a great variety of services to a large number of customers, with whom it may establish long-term relations. Nonetheless, in order to open such a business, one has to carefully analyze the technological needs of such organizations.

Reference List

Casey, E. (2004). Digital evidence and computer crime: Forensic science, computers and the Internet. NY: Academic Press.

Cross, M., & Shindler, D. (2008). Scene of the cybercrime. London: Syngress.

Fields Data Recovery. (2010). The official website. Web.

Tech: Database Forensics

Abstract

Data processing involves several activities designed to transform raw data into a more sophisticated and usable form. Among these activities is data storage. In the process of data storage, there are many places where parts of the data can be kept, and while stored, the data may be manipulated or tampered with, losing the very meaning of its being stored.

Introduction

Database forensics is a specialty analogous to computer forensics. Like computer forensics, the discipline follows the typical forensic procedures and applies exploratory practices to the database. In today’s society, information has been claimed as one of the most essential commodities, and there are endless questions about what a society without information would look like. The majority of this information is collected and processed in databases. This piece of writing will provide an in-depth analysis of database forensics, its role in database analysis, and other major results associated with it.

Forensic outcome and result

According to Lucy (2005), almost every single institution is in one way or another connected with data that may involve client or patient information. Lucy further remarks that these data are usually entrusted to the relevant data protection arrangements for safekeeping. The outcome and information obtained from database forensics can be used for several purposes, according to Brinkmann & Carracedo (2003).

Brinkmann & Carracedo (2003) observe that one way in which the outcome of database forensics can be of great importance to a company is that the company can determine whether the integrity of its data has been compromised, or whether the users’ privacy has in one way or another been violated. Lucy notes that these two concerns are at the core of any successful company, because the end users’ information usually forms the pillar on which the success of any company is built. By carrying out this analysis, the company verifies whether its database has been tampered with.

Butler (2009) observes that the outcome of database forensics will equally go a long way toward developing a valid file system suited to database forensics. Butler recognizes that the emergence of modern file formats that are not common in database forensics has made the whole exercise very difficult to analyze, as most of the database forensic tools used do not support them. Mozayani (2010), in a similar response, echoes the same sentiments as Lucy and points out that by embracing database forensics, the company will also be able to comprehend and analyze any attack that may have been orchestrated against its database.

These results will therefore, to a great extent, equip the company with expertise on the vulnerabilities exploited and, in the process, raise the issue of escalating deterrent countermeasures. Mozayani (2010), in addition, holds that to analyze data it is pressing to know and understand in depth how the database is structured. The author remarks that this will simplify the work of analyzing the data and therefore reduce the time the forensic team spends on the work.

A forensic analysis of the database may center on recognizing operations within a database system that point to evidence of wrongdoing, for instance the occurrence of fraud (Graham 2010). In this regard, Graham observes that if the practice is carried out expertly, it will bring to light several fraud cases within a company and in the process either avert the potential occurrence of such cases or reveal the extent of the damage caused by such deception.

Shoester (2006) points out that certain programs can be used to process and analyze data. The software also provides audit logging capabilities that produce acknowledged substantiation of what role or analysis a forensic analyst carried out on the database. According to Shenoi & Peterson (2009), the use of these programs to verify the extent of analysis carried out on the database by a forensic expert acts as a means of validating that database forensics was indeed performed.

Saitoh & Franke (2009) note that most of the software and forensic tools used nowadays are not reliable and precise enough to be used in forensic work. These authors attribute this to the little effort shown in researching the topic, an issue that has left very few books on the subject published. Similarly, Saitoh & Franke (2009) insist on the need for more research into the topic of forensics, and mostly that of data.

Conclusion

Database forensics has been reflected in the above article as an exercise that is still in development. From the way the exercise is carried out on databases to identify instances of tampering, to the use of its outcomes and results in making important decisions about the safety of the company’s database, the exercise is a significant one. From the above information, one cannot fail to observe the relationship between database forensics and the development of technology. One realizes that, at the rate at which technology is advancing, there is still much to be done if database forensics is to remain useful.

Reference list

Brinkmann, B & Carracedo, A (2003) Progress in forensic genetics 9.

Butler, J (2009) Fundamentals of Forensic DNA, Academic Press: Washington.

Graham, I (2010) Forensic Technology, Evans Brothers Publishers: London.

Lucy, D (2005) Introduction to statistics for forensic scientists, John Wiley and Sons: New York.

Mozayani, A (2010) The Forensic Laboratory Handbook: Procedures and Practice, Springer: New York.

Proceedings from the 19th International ISFG Congress held in Munster, Germany. 2001, Elsevier Health Sciences publishers: New York.

Sako, H, Franke, K & Saitoh, S (2011) Computational Forensics: 4th International Workshop, IWCF 2010 Tokyo, Japan 2010, Springer: New York.

Shenoi, S & Peterson, G (2009) Advances in Digital Forensics V: Fifth IFIP WG 11.9 International Conferences on Digital Forensics, Orlando, Florida, USA, 2009, Springer Publishers: New York.

Shoester, M (2006) Forensics in Law Enforcement, Nova Publishers: Durban.

Forensic Investigation of Oil-Contaminated Concrete Structures

Oil spills and leaks are frequent occurrences in civil engineering projects, particularly in structures that deal with the storage, transportation, and processing of oil products. Oil contamination can occur as a result of a car accident, a failure of oil-containing equipment, or deliberate actions by others. This paper will analyze the effects of oil contamination on concrete, outline the tools for a forensic investigation of the material, and describe several repair and remediation methods.

Effects of Oil Contamination on Concrete

Different studies vary in their accounts of how much oil contamination affects a concrete element. Błaszczyński (1996) states that the majority of available sources classify the effects of crude oil and its products as either non-harmful or mildly harmful. Diab (2011) disagrees, saying that oil spillage underneath machinery is the primary reason for the deterioration of reinforced concrete platforms. The accounts of the precise effects of oil spilled onto concrete platforms or contaminating the mix during production vary: some researchers found that in some cases the effects can even be beneficial, as oil-contaminated concrete can be more resistant to freezing and thawing (Diab, 2011). At the same time, contamination of the mix with 5% or more oil makes concrete lose up to 50% of its compressive strength (Diab, 2011). Błaszczyński’s (1996) account of the long-term effects of oil contamination on the durability of concrete states that, in general, oil-contaminated concrete behaves like a new material with reduced stress-strain parameters.

Forensic Investigation of Contaminated Concrete Elements

Visual examination of concrete may reveal places of contamination due to the fact that most oils tend to get into the pores of the material and are increasingly hard to remove (Francois et al., 2017). This method, however, reveals little about the extent of the damage beyond the fact that oil contamination has occurred. There are several methods for inspecting oil contamination, depending on the age of the element. If the contamination happens during the construction of the concrete element, it can be detected by testing a sample cube obtained during construction, which is normally done to control the compressive strength of the material. If the mix has been contaminated with oil, the compressive strength will be reduced.

For structures that have already been built and become contaminated, the standard procedure involves drilling and extracting concrete cores to use for testing (Francois et al., 2017). This destructive method is used to show the depth of corrosion and its effects on concrete strength. A non-destructive evaluation method that can be used on-site is the Ultrasonic Pulse Velocity (UPV) method, which can detect various kinds of damage in a structural component (Francois et al., 2017). Since oil penetration may reduce compressive strength, the concrete element becomes more susceptible to damage. However, the method does not provide exact compressive strength parameters and can be affected by rebar, voids, and cracks.
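The arithmetic behind a UPV reading is simple: velocity is the straight-line path length through the element divided by the pulse transit time, and the result is compared against empirical quality bands. The sketch below uses illustrative thresholds similar to those in common codes of practice; the band limits are my assumptions, not values from the cited sources.

```python
def pulse_velocity_km_s(path_length_m, transit_time_us):
    """UPV = path length / transit time, converted from m/us to km/s."""
    return path_length_m / transit_time_us * 1000.0

def concrete_quality(velocity_km_s):
    """Illustrative quality bands for concrete based on pulse velocity."""
    if velocity_km_s > 4.5:
        return "excellent"
    if velocity_km_s > 3.5:
        return "good"
    if velocity_km_s > 3.0:
        return "medium"
    return "doubtful"

# A 0.30 m path traversed in 75 microseconds gives 4.0 km/s.
v = pulse_velocity_km_s(0.30, 75.0)
print(round(v, 3), concrete_quality(v))  # -> 4.0 good
```

A reading well below the expected band on an oil-exposed element would motivate extracting cores at that location.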

Repair and Remediation Methods

Repair and remediation methods depend on the depth of the contamination and the length of exposure. The standard protocol for spillage contamination includes cleaning and sealing the contaminated elements to prevent floor failures (Francois et al., 2017). Chemical solutions are applied to mechanically prepared concrete surfaces, which are then cleaned under high pressure and temperature. This is done to bind the hydrocarbons with water and remove them with vacuum suction. In cases where contamination has gone too deep, an entire surface layer may have to be removed (Francois et al., 2017). Should the contamination have happened during the creation of the mix, it may be grounds for removal and replacement of the entire structure.

Conclusions

Though oil spillage was once considered harmless to concrete, new evidence shows detrimental effects of oil contamination on hardened concrete and concrete mixes. When oil is introduced into a concrete mix, the compressive strength of the resulting concrete is reduced, although in some instances the addition of oil may improve resistance to freezing and thawing. Test methods include visual examination, sample cube tests, concrete core examination, and the UPV method. Restoration methods include chemical cleaning under high temperature and pressure or full replacement of the contaminated area.

References

Błaszczyński, T. Z. (1996). Concrete in contact with crude oil products. Statyba, 2(6), 13-17.

Diab, H. (2011). Effect of mineral oil on reinforced concrete structures Part I. Deterioration of compressive strength. JES. Journal of Engineering Sciences, 39(6), 1321-1333.

François, R., Laurens, S., & Deby, F. (2018). Corrosion and its consequences for reinforced concrete structures. Elsevier.

Chemical Spills in Forensic Setting

A chemical leak is characterized by the discharge of chemical compounds. It commonly occurs during the haulage, storage, handling, and disposal of chemical substances. Spilling of chemicals is a frequent incident in many labs.

Taking the necessary steps to clean a spill is extremely imperative due to its perilous nature. Chemical spills may pose an immediate peril to the life and wellbeing of a person (St. Clair, St. Clair & Given 231). Reasonable steps of spill prevention and availing sufficient resources to clean up unintentional spills are indeed critical.

The chemicals may be combustible, noxious, corrosive, reactive or volatile, and may lead to breathing difficulties, skin problems and blindness. Necessary measures to stop chemical spills should always be in place so that the staff know proper work practices while handling chemicals (Horswell 32). The containers must be made of unbreakable materials, and chemical wastes should be disposed of separately, preferably through incineration.

There are certain procedures outlined in laboratories to counter the spill of chemicals. Everybody in the laboratory should wear protective goggles, gloves and lab coats after being informed of the spill (St. Clair, St. Clair & Given 248). The spill must be confined to a small region and cleaned right away. If it is combustible, all sources of heat should be instantly turned off; furthermore, materials used in the clean-up should then be disposed of properly and labeled as hazardous.

Labs work with a range of chemicals in different conditions. The storage, decanting, moving and discarding of chemicals therefore require strict handling procedures. Chemical containers should be easy to retrieve and should not be stored above eye level. Containers should be inspected for leaks, and old ones regularly replaced.

They should be transported in secure cans which must be fixed firmly. Laboratory devices should always be inspected for cracks before they are used in handling chemicals (St. Clair, St. Clair & Given 248). It should not be forgotten that the position of the chemical spill kit should always be known before one uses chemical compounds. Incompatible wastes must never be mixed together when disposing of chemical materials.

OSHA protocol

The OSHA protocol requires new and established employees to be trained on health and safety in their careers. While the elimination of risk is impossible, necessary risk-reduction measures must be taken. There are specific terminologies to identify lab employees who have met certain specific training requirements. Pictorial representation is preferred over written work, and simple guidelines are given to encourage cooperative, deliberate safety and health programs in the lab (Moran 9).

In forensic labs, the rudiments of environmental wellbeing and safety should be strictly considered. Proper coordination in the laboratory is important because the mistake of one individual may compromise the safety of others (St. Clair, St. Clair & Given 229). Productivity should never be put before the safety of the individuals in the laboratory. Only the required quantity of chemicals should be used at crime scenes, and any excess spillage must be cleaned.

The analysis of drugs and chemicals which may be found at diverse crime scenes is handled in the forensic science laboratory (Horswell 32). Samples of chemical supplies recovered should be placed in labeled envelopes which are resistant to corrosion. Forensics involves labs bursting with digital activity, high-tech apparatus and visits to crime sites. The safety regulations observed in these labs are not different from those followed in typical labs (Petraco and Kubic 230).

Care must always be taken while in the laboratory to shelter everybody from harm. Necessary lab gear should always be in place, and equipment to deal with chemical spills must be at hand. The main priority is ensuring individual safety and, most importantly, that of other users of the laboratory. The OSHA protocol emphasizes the training which every employer must provide to its workforce in order to ensure both wellbeing and security in the lab.

Works Cited

Horswell, John. The Practice of Crime Scene Investigation. London: CRC Press. 32-34.

Petraco, Nicholas, & Kubic, Thomas. Forensic Science Laboratory Manual and Workbook. London: CRC Press, 2005. 220-240.

St. Clair, Jami, St. Clair, Michael & Given, Jo. Crime Laboratory Management. London: Academic Press, 2003. 231-260.

Moran, Mark. The OSHA Training Answer Book (2nd edn). Florida: Safety Certified Inc, 2008. 9-140.

Forensic Applications of Electron Spectroscopy for Chemical Analysis

Background

Spectroscopy is an example of a forensic technique used to detect and, in many cases, determine the composition and quantity of a certain compound in a sample (Siegel 1997). Spectroscopy is defined as the measurement of the absorption, emission, or scattering of electromagnetic radiation from a surface by atoms or molecules (Siegel 2004, p. 2326). Absorption is the transfer of electromagnetic energy to the atoms or molecules; emission is the transition of atoms or molecules from one energy level to another resulting in photon emission; and scattering is the redirection of light when it interacts with matter (Skoog, Holler, & Nieman 1998).

Principle

When a primary X-ray beam of precisely known energy impinges on sample atoms, inner-shell electrons are ejected and the energy of the ejected electrons is measured. The difference between the energy of the impinging X-ray and that of the ejected electron gives the binding energy (Eb) of the electron to the atom. Since this binding energy depends on the electronic orbit and the element, it can be used to identify the element involved. Further, the chemical form or environment of the atom affects the binding energy to a considerable extent, giving rise to a chemical shift, which can be used to identify the valence state of the atom and its exact chemical form. This technique is mostly referred to as Electron Spectroscopy for Chemical Analysis (ESCA). ESCA is a well-established surface spectroscopy technique that provides effective chemical analysis of surfaces in given specimens. The method is of valuable use when elements of low atomic number and surface impurities are considered.
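The relation described above, binding energy as the difference between the known photon energy and the measured kinetic energy (optionally corrected by the spectrometer work function), can be written out directly. The kinetic-energy reading below is a hypothetical example, not a value from the cited sources:

```python
def binding_energy_ev(photon_ev, kinetic_ev, work_function_ev=0.0):
    """Eb = h*nu - KE - phi; set the work-function term to 0 to ignore it."""
    return photon_ev - kinetic_ev - work_function_ev

# An Al K-alpha photon (1486.7 eV) ejecting an electron measured at a
# hypothetical 1202.0 eV implies Eb of about 284.7 eV, close to the C 1s
# line, which would point to carbon at the surface.
print(round(binding_energy_ev(1486.7, 1202.0), 1))  # -> 284.7
```

Chemical shifts then appear as small (up to a few eV) offsets from the elemental line, which is what allows the valence state to be read off the spectrum.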

The method is complicated and requires ultra-high vacuum (UHV) conditions, typically below 10^-9 mbar. Atmospheric pressure is about 1 bar, so under UHV conditions the number of atoms per unit volume is roughly 1/1,000,000,000,000 that of air. This condition is essential for surface analysis, because molecules in the air would otherwise land on the surface and change its properties: at higher pressures, atoms attach to the surface within about 3 seconds, which is not enough time to run the experiment. UHV conditions provide the several hours needed for the experiment to be completed. The UHV chamber is first pumped down to 10^-2 mbar using a rotary pump; a turbomolecular pump then brings it to 10^-6 mbar. The vacuum chamber is enclosed in a tight oven and heated to about 180 degrees Celsius, and this baking proceeds for two days to remove any gas inside the chamber, after which it is allowed to cool down, attaining UHV at room temperature. The output (results) is analyzed by identifying characteristic peaks as well as analyzing the peak areas, with the instruments linked to computers to assist in data analysis. Adhesion failures, for instance, depend on surface condition and composition, and ESCA (or XPS) has provided forensic scientists with an important tool to recognize the causes and location of failure.

Introduction

Electron spectroscopy for chemical analysis (ESCA), also known as X-ray photoelectron spectroscopy (XPS), rests on the photoelectric effect, first explained in 1905 by Albert Einstein (Wagner et al. 1979). Einstein demonstrated that the energy of an electron ejected in photoemission equals the difference between the energy of the incident photon and the electron’s binding energy. The rapid development of ESCA came when researchers measured the binding energies of core electrons by X-ray photoemission and realised that these binding energies could vary by as much as 6 eV depending on the chemical state of the atom. The technique thus provides important information about chemical effects at surfaces, making it an extremely surface-sensitive method (Sibilia 1988). Other spectroscopic techniques, such as atomic emission spectroscopy and atomic absorption spectroscopy, interpret absorption or emission as a function of energy, whereas ESCA measures the kinetic energy of the electrons ejected by X-ray radiation (Seah & Briggs 1992).

ESCA can be performed with a synchrotron-based light source combined with a custom-designed electron analyser (Siegel 2000), but the most widely used laboratory systems employ 20–200 micrometre beams of monochromatic (aluminium K-alpha) or polychromatic (magnesium K-alpha) X-rays. The monochromatic beam is produced by diffracting and focusing a polychromatic beam of X-rays with a thin disc of naturally occurring crystalline quartz cut along the <1010> lattice plane. The resulting wavelength of 0.8339 nm corresponds to a photon energy of 1486.7 eV. These monochromatic X-rays have an intrinsic line width of 0.16 eV, and such an ESCA system achieves a high energy resolution of about 0.25 eV. The polychromatic magnesium X-rays, by contrast, have a wavelength of 0.99 nm, corresponding to a photon energy of about 1254 eV, and a line width of approximately 0.70 eV, which sets the energy resolution of the system when this beam is used. In addition, no crystal is used to diffract these X-rays, so all the primary X-rays and high-energy Bremsstrahlung X-rays reach the surface (Ewing 1985).
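The wavelength and photon-energy pairs quoted above follow from the relation E[eV] ≈ 1239.84/λ[nm]. The short sketch below is only a consistency check of those numbers, not part of any ESCA software.

```python
# Photon energy (eV) from wavelength (nm): E = h*c / lambda,
# with h*c ≈ 1239.84 eV·nm.
HC_EV_NM = 1239.84

def photon_energy_ev(wavelength_nm):
    """Convert an X-ray wavelength in nanometres to photon energy in eV."""
    return HC_EV_NM / wavelength_nm

al_kalpha = photon_energy_ev(0.8339)   # aluminium K-alpha: ~1486.8 eV
mg_kalpha = photon_energy_ev(0.99)     # magnesium K-alpha: ~1252 eV

print(round(al_kalpha, 1), round(mg_kalpha, 1))
```

The aluminium value reproduces the quoted 1486.7 eV almost exactly; the magnesium value comes out near 1252 eV, close to the ~1254 eV figure in the text (the accepted Mg K-alpha energy is 1253.6 eV).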

ESCA is widely used in forensic laboratories. It is mainly used for surface analysis, especially in the qualitative identification of the elements in a sample. Based on the chemical shifts, the chemical environment around the atoms can also be estimated. This measurement is useful in determining the valence states of the atoms present in various moieties in a sample. Quantitative measurements can be made by determining the intensity of the ESCA lines of each element.

Instrumentation

The ESCA instrument consists of a radiation source for the primary X-rays, a monochromator, an energy analyser (to resolve the electrons generated from the sample by energy) and a detector to measure the intensity of the resolved electrons. The analysis is carried out in high vacuum (Kuwana 1980; Ingle & Crouch 1988). Figure 1 below shows a schematic diagram of the ESCA system.

Figure 1: Instrumentation.

The ESCA system is routinely used to analyse a wide variety of samples, such as inorganic compounds, polymers, elements, make-up, teeth and bones (Schroder, Muller & Arndt 1989). The surface composition, usually of the top 1 to 10 nm, is analysed to establish the empirical formula and the elements contaminating the surface; line profiling examines the uniformity of elemental composition across the top surface, and depth profiling examines it as a function of depth (Levy & Lemeshow 1991). The photoelectrons emitted from the sample are counted at each kinetic-energy value, and a spectrum is generated using equation 1. Since the binding energy of an electron is characteristic of the specific element and orbital, ESCA is also used to determine bonding states and surface concentrations.

Methods

XPS, also known as ESCA, uses the photoelectric effect to determine the chemical composition of a surface. Some special ESCA systems have been designed to analyse volatile gases or liquids, samples at different temperatures, or samples at a vacuum of only about 1 torr. Because the energy of a particular X-ray wavelength is known, the binding energy (BE) of each emitted electron can be determined using the following equation, based on the work of Rutherford:

Ebinding = Ephoton − Ekinetic − φ (1)

where Ebinding is the binding energy of the electron emitted from a given energy level within the atom, Ephoton is the energy of the photon used, Ekinetic is the kinetic energy of the emitted electron as measured by the instrument, and φ is the work function of the instrument. The work function accounts for the extra energy needed to transfer the electron from the specimen surface into the vacuum; each instrument has its own predetermined value.
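Equation (1) is simple to apply numerically. In the sketch below, the photon energy is the Al K-alpha value quoted earlier, while the kinetic energy and the 4.5 eV work function are hypothetical placeholder values chosen purely for illustration.

```python
# Equation (1): E_binding = E_photon - E_kinetic - phi
def binding_energy_ev(e_photon, e_kinetic, work_function):
    """Electron binding energy from the measured photoelectron kinetic energy."""
    return e_photon - e_kinetic - work_function

# Illustrative (hypothetical) values: Al K-alpha excitation, an assumed
# measured kinetic energy of 1197.2 eV, and an assumed spectrometer
# work function of 4.5 eV.
e_b = binding_energy_ev(e_photon=1486.7, e_kinetic=1197.2, work_function=4.5)
print(round(e_b, 1))   # ~285 eV, in the region of the C 1s line
```

With these assumed inputs the result lands near 285 eV, the region where carbon 1s photoelectrons are typically observed.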

Fundamentally, three stages are involved in the analysis of a sample. First, the sample is collected from the source. This is followed by preparation of the sample, which may include techniques such as culturing, cleaning or soaking, and third by analysis with appropriate methods. These stages are independent of each other, but each can affect the others. Errors are always possible during analysis, and the analyst should be alert to identify and avoid them. The chosen method of analysis should provide the required accuracy, sensitivity and dependability, so the scientist should evaluate the method at hand and decide whether it is acceptable.

Sample preparation

The solid samples to be analysed should be 0.1–4.0 cm² in area and have very low vapour pressures (Yacobi, Holt & Kazmerski 1994). Before analysis, roughly the top 20 Å of the surface is removed by light sputtering (known as depth profiling). This guarantees that the surface is not contaminated by organic or incidental impurities. In forensic analysis, a quality assurance programme must monitor the testing procedure and make certain that the testing techniques applied and the results reported pass proficiency and auditing tests. The relevant guidelines are found in the trace evidence quality assurance guidelines (TEQAG).

Sample analysis

In ESCA, the sample is irradiated with a monochromatic beam of X-rays, and core electrons are emitted from the sample via the photoelectric effect: when a surface is irradiated with light of sufficient energy, electrons are ejected. The emitted electrons are detected with a cylindrical mirror analyser (CMA). The measured kinetic energies are then converted to binding energies, allowing identification of the elements present, and the resulting spectra are given as binding energy versus intensity. The peak intensities can give valuable quantitative information about the elemental surface composition if sensitivity factors are employed. ESCA (XPS) is widely used in forensic laboratories to analyse adhesive joints and obtain compositional information on specimen surfaces. This is particularly useful for identifying contaminants that may have caused a failure, non-bonded areas, and a lack of adhesive, and for checking whether the failure occurred in the adhesive, in the original material, or at the interface.
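The quantification with sensitivity factors mentioned above typically amounts to dividing each peak area by its relative sensitivity factor and normalising. The sketch below illustrates the arithmetic; the peak areas and sensitivity factors are invented for the example and do not come from any real spectrum.

```python
# Atomic fraction of element i from ESCA/XPS peak areas:
#   x_i = (I_i / S_i) / sum_j (I_j / S_j)
# where I is the measured peak area and S the relative sensitivity factor.

def atomic_fractions(peaks):
    """peaks: {label: (area, sensitivity_factor)} -> {label: atomic fraction}"""
    corrected = {label: area / s for label, (area, s) in peaks.items()}
    total = sum(corrected.values())
    return {label: value / total for label, value in corrected.items()}

# Invented example values, for illustration only.
example = {"C 1s": (1000.0, 1.0), "O 1s": (1320.0, 2.2)}
fractions = atomic_fractions(example)
print({label: round(f, 3) for label, f in fractions.items()})
```

In this toy case the sensitivity-corrected areas are 1000 and 600, giving atomic fractions of 0.625 carbon and 0.375 oxygen.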

Results

The ESCA spectrum provides information about the surface of the sample as well as the oxidation state (chemical environment) of each element. The strength with which an electron is bound is influenced by its chemical environment, so atoms in dissimilar chemical environments generate peaks at different binding energies; the difference is referred to as a chemical shift. Moreover, separate chemical states can be disentangled using peak-fitting programs to give the percentage composition of each state. When reporting laboratory results, uniformity and consistency should be observed: the format, accepted units of measurement and calculations should be clearly documented in the laboratory manuals. Everyone involved in the investigation should be able to understand the results and interpret their significance clearly. In addition, laboratory results should comply with ISO (International Organization for Standardization) requirements, which demand that reported information be concise, clear, highly accurate and instantly recognisable in the presentation of results.

Advantages

Both ESCA and Auger electron spectroscopy (AES) are surface-analysis techniques; a CMA detector is used to analyse the ejected electrons in both, and an ESCA scan provides similar or comparable information to AES. However, ESCA has a number of advantages over AES, which have recently made it a good complementary method to AES in the forensic analysis of biomolecules. For example, the electron bombardment used in AES can be destructive to some materials that are unaffected by the monochromatic X-ray beam of ESCA. Further, electron bombardment in AES tends to charge insulating specimens, which leads to poor analysis, whereas charging is far less of a problem with the neutral X-ray excitation used in ESCA.

The energy resolution of ESCA is typically better than that of AES. This is particularly useful for detecting the binding energies of atoms in a molecular structure, so that information on chemical bonding is obtained as well as elemental composition. ESCA has been widely applied in the semiconductor industry to study the surfaces of organic films, oxides and polymers, and has also been employed in the development of plasma-etching techniques.

Limitations

The ESCA technique cannot detect hydrogen or helium. It is also not possible to make the X-ray beam diameter as small as that of an electron beam, and this inadequacy has left the spatial resolution of ESCA poorer than that of AES. This inherent weakness has led to the development of a newer technique known as imaging XPS (iXPS). Modern spectrometers with this feature offer parallel imaging: positional information is obtained from the dispersion characteristics of the hemispherical analyser, generating images with high spatial resolution. In a recent development of iXPS, a magnetic objective lens has been introduced, which improves the sensitivity achievable at a given spatial resolution and has expanded the use of the technique. In addition, the accuracy of quantitative analysis is limited to approximately 4 per cent. High vacuum is also necessary to prevent the low-energy electrons from colliding with residual gas molecules, which would reduce sensitivity, and it is not possible to detect impurities at the ppm or ppb levels (Crist 2004).

List of References

Crist, B. V. 2004, Handbooks of Monochromatic XPS Spectra, Volumes 1-5, XPS International LLC, Mountain View, CA, USA.

Ewing, G. W. 1985, Instrumental Methods of Chemical Analysis, 5th edn, McGraw-Hill, New York.

Forensic Guidelines, 2000, Scientific Working Group on Materials Analysis (SWGMAT).

Garfield, F. M. 1991, Quality Assurance Principles. Association of Official Analytical Chemists, Arlington, Virginia.

Grant, J. & Briggs, D. (eds) 2003, Surface Analysis by Auger and X-ray Photoelectron Spectroscopy, Chichester, UK.

Ingle, J. D. & Crouch, S. R. 1988, Spectrochemical Analysis, Prentice-Hall Int., New Jersey.

Kuwana, T. 1980, Physical Methods in Modern Chemical Analysis, Vol. 2, Academic Press, London.

Levy, P. S. & Lemeshow, S. 1991, Sampling of Populations. John Wiley and Sons, New York.

Moulder, F., Stickle, W., Sobol, P. & Bomben D. 1992, Handbook of X-ray Photoelectron Spectroscopy, Perkin-Elmer Corp, USA.

Schroder, E., Muller G. & Arndt, K. 1989, Polymer Characterization, Hanser Publishers, Munich.

Seah, P. & Briggs, D. (eds) 1992, Practical Surface Analysis by Auger and X- ray Photoelectron Spectroscopy, Wiley & Sons, UK.

Sibilia, J. P. 1988, Materials Characterization and Chemical Analysis, 2nd edn, VCH Publishers, New York.

Siegel, J.A. 1997, ‘Forensic Chemistry’ in Macmillan Encyclopedia of Chemistry, Macmillan Company, New York.

Siegel, J.A. 2000, Collection and Chain of Evidence, Academic Press Limited, London.

Siegel, J.A. 2004, ‘Accreditation of Undergraduate and Graduate Forensic Science Education Programs’, Forensic Magazine, Vol. 1, No. 3, pp. 2326

Skoog, D. A., Holler, J. & Nieman, T. A. 1998, Principles of Instrumental Analysis, 5th edn, Thomson Learning, Crawfordsville.

Surface Chemical Analysis — Vocabulary, ISO 18115: 2001, International Organization for Standardization (ISO), TC/201, Switzerland

Wagner, C. D., Riggs, W., Davis, L. & Moulder, G. 1979, Handbook of X-ray Photoelectron Spectroscopy, Perkin-Elmer Corp., Eden Prairie, MN, USA.

Yacobi, B., Holt, D. B. & Kazmerski, L. 1994, Microanalysis of Solids, Plenum Press, New York.

Forensic Anthropology in Criminal and Civil Law Context

Introduction

Forensic anthropology is a branch of anthropology that applies its specific scientific methods in criminal and civil law contexts. The scope of the discipline is the extraction of data necessary for legal processes by examining human remains. Forensic anthropologists usually work in a team with other specialists, such as medical examiners and law-enforcement agents. The main aim of this applied science is to establish the identity of the deceased, as well as the cause of death and, if possible, its accompanying circumstances. Sometimes the specialists are involved in processes requiring the identification of living persons; their methods are also helpful in examining historical mass disasters, for example when working with mass burials.

Forensic anthropology brought essential changes to crime solving. By examining physical human remains, it provided evidentiary material that was previously unattainable. The discipline was institutionalised in 1972 with the founding of the Physical Anthropology Section of the American Academy of Forensic Sciences, and since that time it has undergone considerable development. It is defined as an applied branch of biological anthropology (Kottak 2017, 13). The traditional methods of identification have been supplemented by new scientific techniques, which have extended the capacities of the science. In this paper, several methods used in forensic anthropology, such as facial approximation, DNA sampling, and decomposition research (the body farm), will be examined.

Facial Approximation

Identification of individuals requires the determination of their unique physical characteristics, such as face and body shape. Forensic anthropologists are usually involved in legal processes where the recognition of the dead person is impossible because of injuries, dismemberment, or postmortem changes after an extended period of time since death. Facial approximation is the construction of a possible facial image of a person from the configuration of the skull.

It is usually combined with other methods, such as craniofacial photographic superimposition and the interpretation of surveillance imagery (Ubelaker 2018, 277). In other words, it aims to reach the public for further identification through other methods, such as comparison with photographs and recognition by other persons.

Applying the method of facial approximation often requires a specialist who combines the skills of an anthropologist with the knowledge and talent of a visual artist, or the involvement of two separate specialists. It was first attempted in Europe in the late nineteenth century, when the teamwork of anatomists and a sculptor produced an approximate facial image using tissue-depth measurements (Langley and Tersigni-Tarrant 2017). Another way of reconstructing the face is the anatomical method introduced in the 1920s. However, the latter is more time-consuming and requires specialised anatomical knowledge; thus, at present, forensic anthropologists prefer to rely on the former for identifying the deceased.

As Langley and Tersigni-Tarrant (2017) state, the method of facial approximation has its limitations, as there are no credible methods for estimating distinctive features of the human face, such as eye colour, the shape of the ears, lips and nose, and the colour and length of the hair. However, the techniques of tissue measurement are gradually being developed, increasing the possibilities of the method.

DNA Sampling

During the last few decades, the development of genetic-engineering technologies has revolutionised the applied techniques of forensic anthropology: in the modern era, DNA sampling was introduced as a way of identifying the individual. In this field, the discipline utilises the findings of molecular anthropology, which analyses genomes and the whole of an organism’s DNA. Molecular analysis is able to provide “highly accurate information regarding the sex of the individual represented and provide positive identification” (Ubelaker 2018, 75). Since DNA sampling is an expensive, time-consuming, and destructive technique, it is usually applied when there is no other possibility of identifying the individual. However, as it provides factual evidence, this method is highly valuable and can influence the outcome of a legal case.

Decomposition Research (Body Farm)

The term “body farm” was popularised in 1994 by the crime novelist Patricia Cornwell, whose novel of that name was set at the University of Tennessee’s decomposition research facility. The study of the decomposition of human remains demonstrated that a number of factors influence the process of decay: temperature, humidity, insect activity, the place of burial, and the presence or absence of clothing. Further, the effect of all these variables on soft-tissue decomposition and hard-tissue alteration was investigated.

As noted above, the key factors are temperature and location (ground, surface, water); among the secondary elements are soil conditions, moisture, funerary treatment, and others. This information is necessary to “properly assess time since death (post-mortem interval) and post-mortem events related to criminal activity” (Ubelaker 2018, 276). The field still needs further investigation of the impact of various conditions on the alteration of human remains.

Conclusion

In summary, the use of anthropological methods in forensic science is highly effective and helpful. Moreover, in the light of the discoveries of contemporary science, the application of new techniques brings new possibilities to this field. In line with modern tendencies, forensic anthropology illustrates the need for interdisciplinary collaboration and demonstrates its efficiency.

References

Kottak, Conrad. 2017. Window on Humanity: A Concise Introduction to General Anthropology. 8th ed. New York: McGraw-Hill Education.

Langley, Natalie R., and MariaTeresa A. Tersigni-Tarrant, eds. 2017. Forensic Anthropology: A Comprehensive Introduction. 2nd ed. Boca Raton, FL: CRC Press.

Ubelaker, Douglas H. 2018. “Recent Advances in Forensic Anthropology.” Forensic Sciences Research 3 (4): 275–277. Web.

Forensic Procedures: Hairs and Fibres

Crime levels have risen over time. Continued cases of robbery, murder and rape have been countered by the adoption of various forensic procedures to bring the perpetrators to book. These methods include DNA analysis, face recognition, forensic odontology, forensic pathology, the use of fingerprints, hairs and fibres, criminal profiling and serology (Jackson, 2002). This paper outlines how hairs and fibres play a crucial role in the investigation of crime.

Biologists have long associated hair with dead cells. Hair, primarily made up of keratin, is characterised by differences in growth rates (Ramstard, 2006). For instance, head hair grows at a slower rate than the beard but at a faster rate than body hair. The fact that hair does not readily decompose gives it an edge over other forensic materials, and this durability has been applied to investigate criminal cases dating back several years.

It has also been established that hair is absorbent, which makes it possible to analyse levels of arsenic poisoning (Ditton, 2002). It is usually possible to ascertain whether the hair obtained fell off freely or was pulled out in a tussle between the victim and the suspect. Hairs can be subjected to DNA profiling on condition that some root structure is available (Ramstard, 2006). The presence of hair on the surfaces of vehicles is indicative of an accident, and it is likewise reasonable to conclude that a weapon bearing a victim’s hair was used against him or her. It is this principle that is used to correlate certain events with crimes under investigation (Kelly & Phillip, 2004).

Fibres are commonly regarded as the basic units of clothing. Fibres may differ in length, strength and pliability depending on the type of fabric desired, and both artificial and natural fibres are common. Natural fibres such as wool, cotton and silk have been in existence for many years, but synthetic fibres have gained acceptance with changes in tastes and preferences; these include nylon and polyester (Reffer & Welzel, 2007).

Structural fibres, such as asbestos and glass fibres, have also been introduced lately. The identification and analysis of natural fibres used to be simple, relying on observation and touch. Synthetic fibres are usually harder to identify because designers combine several varieties to achieve the desired results (Reffer & Welzel, 2007). However, it is possible to identify such fabrics through chemical and burning tests.

The discovery that hairs and fibres could be useful in crime investigation came in the 19th century. It was based on the premise that any struggle between a victim and an attacker leads to the transfer of hairs and fabric between them (Saferstein, 2000). The early 20th century saw the establishment of microscopic examination of hair, and the development of this procedure has been used over the years to provide leads for crime investigators. Previous cases showing that hairs can be tools of crime investigation are well documented. A good example involves Mable Tattershaw, who was strangled in 1951 (Saferstein, 2000).

A thorough examination of her clothes revealed some hair traces. The hair was subjected to a forensic laboratory examination, which revealed that it was identical to the head hair of Leonard Mills, the chief suspect. It is important to note that hair examination under a microscope can reveal a great deal of information about a person: the race, sex and age of the owner can be ascertained (Ramstard, 2006).

The most striking feature, however, is that hair is not easily destroyed except by burning. Cloth fibres also become attached to surfaces where a crime has been committed, and pieces of cloth that fit into each other may be found. In most cases, tiny fibres are collected at crime scenes, and what remains to be understood is how these fibres become attached even to smooth surfaces after a crime has been committed.

The collection of hairs and fibres is usually the first step in crime investigation (Ditton, 2002). Only once hairs and fibres have been collected can events be reconstructed, and where and when the samples were collected is an important consideration. The state of the hair can reveal whether or not force was used in the crime: the presence of root structures is indicative of a forceful crime (Jackson, 2002). This factor can be used to characterise the nature of the crime on the basis of the venue or the actions performed. Collection of hairs and fibres is achieved with clear tape, which is applied to a surface and later removed carefully.

Various tapes are used, depending on the stickiness desired; stickier tapes recover fibres more efficiently. The fibres are normally extracted from the tape with a liquid (Ditton, 2002).

The identification of hairs and fibres is vital, and it is important that the type is known before the examination process begins, as this helps to reduce cases of mismatch. The type of hair is not usually difficult to differentiate: human, dog and cat hair all differ in colour and texture. The presence of dog hair at a crime scene will automatically rule out testing it as human hair in the laboratory. The identification of fibres is equally vital in crime-investigation procedures.

As stated earlier, the identification of natural fibres is usually easy; wool, for instance, can easily be differentiated from cotton by texture. It is, however, difficult to distinguish silk from polyester where both have been combined into a single fabric (Reffer & Welzel, 2007). Most synthetic fibres are produced through ester formation and cannot easily be differentiated by observation under the microscope. Infrared spectrophotometry is therefore used to distinguish synthetic fibres; the technology makes use of the absorption and reflection of electromagnetic radiation by the material (Kelly, 2007).

Absorption and reflection phenomena are used to characterise fibres on the basis of how they absorb and reflect various parts of the spectrum. Absorption bands appear where the material absorbs the light passed through it, and these bands form signatures that are easily detected by a spectrophotometer (Kelly, 2007). Fabrics may also absorb invisible wavelengths, such as ultraviolet (UV) and infrared (IR) radiation. The fact that infrared covers a wider wavelength range than the UV or visible bands makes it especially useful, since more complete substance signatures are provided.

The analysis of hairs and fibres is a critical procedure carried out by qualified forensic experts. Traditional hair analysis by examination under the microscope is still very common today; although DNA analysis is also carried out, most forensic examinations employ the traditional methods, since DNA analysis has been associated with questions of objectivity and reliability as well as high costs (Saferstein, 2000). Microscopic examination yields vital information regarding the type of hair (animal or human), race, fall condition, the species of animal, the body part of origin and how the hair was cut.

The value and direction of DNA analysis depend on the preliminary examination carried out on the hair: the lack of root material means that no DNA analysis can proceed (Saferstein, 2000). Dry mounting of hair on a glass slide allows comparative viewing under a microscope, while a wax block provides a medium for examining a hair in cross-section, giving a microscopic view of the medulla. Cellulose acetate is important in the preparation of casts of the cuticular scales.

Several tests may be conducted on dyed hair. The success of a forensic laboratory examination depends largely on the work conducted at the crime scene (Saferstein, 2000). It is a natural advantage that hair is shed and grows at different rates, and thorough examination of clothing can yield traces of hair. The scientist may opt to alter the appearance of the natural hair if he or she feels that matching can thereby be achieved appropriately (Ramstard, 2006).

The scientist’s opinion that the sample hair matches the suspect’s hair is proof that can be presented before a court of law. Murder investigations have succeeded courtesy of hair examination; a good example involves the buried remains of a murder victim (Kelly, 2007). A body buried for five weeks yielded several findings regarding the identity of the victim, the weapon of injury and the suspect: laboratory examination of head hairs found on a branch purported to have been used to kill the victim provided conclusive evidence.

Examination of fibres by IR spectrophotometry involves a series of scientific steps in which sodium chloride plays an important role. The fibre is mixed with the common salt and pressed into a disk (Reffer & Welzel, 2007); because the salt is transparent to IR rays, it is well suited to this exercise. The disk is then placed in the IR beam and observations are made: the fibre absorbs parts of the IR radiation and transmits or reflects the rest, with the chemicals present in the fibre responsible for these absorption and reflection tendencies. The radiation recorded is characterised as a spectrum showing differences in light intensity.

The varying light intensities are measured and plotted by the spectrophotometer, which provides an electronic and graphical visualisation of the resulting wave-like output. The peaks and troughs indicate the absorption bands, and a comparison of the signatures generated by these bands allows the scientist to correlate the observed characteristics with a substance. The quantity of a compound present in a substance can thus be established, and the origin, concentration and quality of the fibres can be inferred from spectrophotometry (Reffer & Welzel, 2007).
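The signature comparison described above can be illustrated with a simple similarity measure. The sketch below compares two absorption traces sampled at the same wavelengths using the Pearson correlation coefficient; the traces are invented toy data, and real forensic fibre comparison relies on far more than a single statistic.

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation between two equal-length absorption traces."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Toy absorption traces (invented): questioned fibre vs. two references.
questioned = [0.10, 0.80, 0.30, 0.05, 0.60, 0.20]
reference  = [0.12, 0.78, 0.33, 0.06, 0.58, 0.22]
unrelated  = [0.50, 0.10, 0.05, 0.70, 0.15, 0.65]

print(round(pearson(questioned, reference), 3))   # close to 1: similar bands
print(round(pearson(questioned, unrelated), 3))   # much lower: different bands
```

A near-perfect correlation suggests the questioned and reference fibres share the same absorption signature, while a low or negative value points to different materials.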

A sample of fabric can generate questions such as: what are the weaving patterns? How do the edges of the cloth physically fit the fabric obtained from the crime scene? The answers to these questions can be obtained. The weaving patterns may be twill, plain, satin or pipe weave, and the jigsaw fit of fabric elements may help to establish that the fitting parts were once one piece. Burning tests and chemical tests can also be conducted by forensic scientists to identify and classify fibres.

Trace evidence examination refers to forensic examination involving the collection and analysis of samples such as hairs, fibres, paints and glass (Kelly & Phillip, 2004). It is important that the evidence remains undisturbed if objective conclusions are to be made (Jackson, 2002); any form of interference renders the evidence useless before a court of law. Cross-contamination of hair and fibre samples may occur between the scene of the crime and the forensic laboratory.

A police officer involved might, for instance, pick up a victim’s hair and claim that it was found on the suspect’s clothes; it would be very difficult to present such evidence before a court of law. Utmost care is therefore required when handling, packaging and transferring hair and fibre evidence (Phillip & Bowen, 2010). Clear labelling of the evidence is paramount if good results are to be expected, as a mismatch of samples may ensue, especially where evidence is lost.

The chain-of-custody protocol is vital for trace evidence (Phillip & Bowen, 2010). The protocol ensures that evidence is handled professionally from the crime scene to the courtroom. Each item should be clearly labelled to avert any possibility of ambiguity, and every police officer and forensic scientist involved must sign for the evidence. Any form of contamination renders the evidence inappropriate for use in prosecution (Phillip & Bowen, 2010).

Despite the intricacies of hair and fibre evidence, the method has been used widely to unravel criminal mysteries that had remained concealed over time. The fact that hair is not easily destroyed makes it appropriate for handling crimes committed several years ago. The method also enjoys a range of advantages: for example, it provides detailed information such as the age, race and type of hair. This forensic procedure can therefore be of great value if due care is taken, and the effective incrimination of a suspect can be ensured if all the norms of evidence packaging, handling and transport are adhered to.

References

Ditton, J. (2002). Human Hair Fibers. Journal of Forensic Science, 41 (1).

Jackson, A. (2002). Forensic Science and Crime Investigations. Routledge Press.

Kelly, J. F. & Phillip, K. W. (2004). Tainting Evidence: Inside the Scandals at the FBI Crime Lab. New York: Free Press.

Kelly, P. (2007). Forensic Science Under Siege. New York: New York Press.

Phillip, J. H. & Bowen, J. K. (2010). Forensic Science and the Expert Witness. University of California, USA.

Ramstard, K. (2006). Microscopy of Hair. Forensic Science Communications Institute.

Reffer, J. & Welzel, D. (2007). Forensic Classification of Polyester Fibers by Infrared Recognition. Journal of Forensic Sciences, 45 (3).

Saferstein, R. (2000). Criminalistics: An Introduction to Forensic Science. New York: Prentice-Hall.

Analysing the Way Forensic Scientists Conduct Qualitative and Quantitative Analysis of Physical Evidence

The development of science and technology has undoubtedly yielded positive results in the detection and investigation of crimes. The importance of material evidence in criminal proceedings, and of expert opinions as the most important objective sources for building a reliable evidentiary base, is steadily increasing. The qualified, effective and timely use of forensic tools and methods is today an integral part of the investigation of murders, rapes, robberies, thefts, drug trafficking offenses and other crimes.

The Concept of Forensic Science

Forensic science is a process of cognizing objective reality carried out by applying the methods of various sciences. The study of physical evidence resembles scientific research, but it differs in certain features inherent in practical activity. It is scientifically grounded in the provisions of several disciplines, including legal, natural, technical and humanitarian ones. These provisions, scientific methods and means are used by forensic experts and expert institutions to solve the practical problem of establishing the truth in a civil or criminal case, or in a case of an administrative offense (Strom & Hickman, 2014).

The methods of forensic expert practice are based on appropriate scientific methods; they depend on the nature and properties of the object of research and draw on the experience of solving specific expert tasks, including algorithmic rules and rules developed by the expert himself (Rivers & Dahlem, 2013).

Special Methods of Physical Evidence Analysis

In expert and preliminary studies of physical evidence, special methods are used in addition to general scientific ones. Special methods can in turn be divided into general expert methods, used in most classes of forensic examinations and studies, and private expert methods, used only in a given particular type of forensic examination. The system of general expert methods of investigating material evidence includes methods of image analysis; methods of morphological analysis; methods of composition analysis; methods of structural analysis; and methods of studying physical, chemical and other properties.

Methods of image analysis are used to study traditional forensic objects – human traces, guns and tools, vehicles, documents, etc. Morphology refers to the external structure of the object, as well as the shape, dimensions and mutual arrangement of its constituent structural elements on the surface and within the volume under investigation. The most common methods of morphological analysis are those of optical microscopy – a set of methods of observation and research using an optical microscope, namely ultraviolet and infrared microscopy, stereoscopic microscopy, comparison microscopy, transmission electron microscopy and scanning electron microscopy.

Elemental analysis methods are used to establish elemental composition, namely the qualitative or quantitative content of certain chemical elements in a given substance or material. Their range is quite wide, but the most common in expert practice are emission spectral analysis, laser microspectral analysis and X-ray spectral analysis.

The molecular composition of an object is understood as the qualitative (and quantitative) content of simple and complex chemical substances in it, for the establishment of which molecular analysis methods are used. These include chemical analytical methods and microcrystalloscopy. However, the main methods for studying the molecular composition of physical evidence at present are molecular spectroscopy and chromatography.

Metallographic and X-ray analysis are used to study the crystal structure of objects. With the help of metallographic analysis, changes in the macro- and microstructure of metals and alloys are studied in connection with changes in their chemical composition and processing conditions. X-ray diffraction analysis makes it possible to determine the orientation and dimensions of crystals and their source of origin, and, from patterns of destruction, to establish the causes of a fire, an explosion or a road accident.

Qualitative and Quantitative Analysis of Physical Evidence

An integral part of forensic science is analytical chemistry. Its main task is to determine the qualitative and quantitative composition of a substance and to identify whether the matter is organic or inorganic. Qualitative analysis consists of determining the chemical elemental composition of the substance – which chemical elements (or groups of elements) it actually contains (Bowen, 2016).

In forensic science, identification is the establishment of a substance (object) by a combination of general and particular characteristics. Quantitative analysis consists of determining the quantitative content of chemical elements or their groups in the analyzed substance. These tasks are solved by acting on a substance and then registering its physicochemical or physical properties, which makes it possible to carry out qualitative and quantitative analysis and to establish the structure of a substance or conduct its identification.

In the analysis of physical evidence, both substances of biological origin and substances of inorganic origin are considered. Evidence of organic origin includes traces of blood, saliva, sweat, etc.; substances of inorganic origin left at the crime scene include dust, paint, fibers, etc. To identify physical evidence and ensure that accurate and reliable scientific results are obtained, the forensic scientist uses a combination of image analysis methods with chemical analytical methods and microcrystalloscopy. The forensic scientist defends his or her scientific conclusions and opinions in a court of law by providing evidence that was obtained legally and listed as case evidence. Forensic tools and methods are the basis of crime investigation and of the analysis of the evidentiary base that can be used by the prosecution.

References

Bowen, R. T. (2016). Ethics and the practice of forensic science. Boca Raton, Florida: CRC Press.

Rivers, D. B., & Dahlem, G. A. (2013). The science of forensic entomology. New York, USA: John Wiley & Sons.

Strom, K. J., & Hickman, M. J. (Eds.). (2014). Forensic science and the administration of justice: Critical issues and directions. Thousand Oaks, California: Sage Publications.

The Concept of Forensic Biometrics: Physiological and Behavioral Characteristics

Introduction

Biometrics is a concept which has attained significant popularity in recent years due to the prevalence and capabilities of technology. Biometrics is a form of access control which can accurately, efficiently, and uniquely identify humans. Biometric identifiers can be classified as either physiological or behavioral characteristics (Saferstein and Roy 153). The concept of forensic biometrics seeks to use automated and human-based systems to identify, analyze, and interpret biometric data for forensic and investigatory activities.

Origin and Development

Since biometrics comprises several elements, including iris, fingerprint, and facial recognition, alongside behavioral characteristics, each has a unique history of origin and development. The first attempt to use the iris for identification came in the 1950s from J.H. Doggart, who emphasized that the iris is unique to each individual, forming infinite patterns. By 1985, Dr. John Daugman had created an algorithmic computer system which could analyze and verify the human iris; it became known as the IrisCode and is the foundation of most modern iris scanners. Similarly, technology has also focused on scanning the retina of the eye, which likewise has a unique pattern of veins and capillaries but remains less consistent and reliable than iris biometrics (Saferstein and Roy 158).

Similarly, facial recognition is a non-intrusive and efficient mechanism of biometrics. Recognizing faces has been used in society for centuries (consider wanted posters), but the modern technology is attributed to Matthew Turk and Alex Pentland in the 1990s. They created the Eigenface technique, which uses matrices of human faces to identify and authenticate faces. It was an automated biometric solution which allowed a computer to track faces from a camera effectively, with little error or human intervention (Saferstein and Roy 161).

Fingerprints are the earliest known biometric system, with some accounts of their use as identifying marks dating back to 500 BC in the Babylonian Empire. In 1896, Sir Edward Henry created the Henry Classification System, which sorted fingerprints by physiological characteristics for quick searching; it became the first official system of identification used by law enforcement and later the basis for modern databases (Lee). Law enforcement currently maintains huge databases, transitioning from IAFIS to NGI, which store millions of fingerprints collected through Tenprint techniques, in addition to the NPPS with palm prints. These databases can use complex algorithms to match the fingerprints of most citizens in a matter of minutes (Saferstein and Roy 164).

The biometric boom occurred in the 1990s, when biometrics grew significantly as a field of research. With the onset of the digital age, computerized algorithms and rapid automation could be combined with the traditional techniques and databases that law enforcement agencies had been developing for decades. The gradual integration of biometrics into commercial technology has also significantly helped to boost the research and applications of forensic biometrics.

Physical Evidence

Biometric systems use biological traits (modalities) which are known in advance and used for person recognition. Recognition occurs in real time depending on the computational efficiency of biometric applications (Jain and Ross 4). Forensic biometrics relies on a mixture of physical and digital evidence. Some forensic concepts developed for physical evidence can be applied to digital evidence, but others cannot because of its nature. Biometric technologies can be used to process data from latent fingerprints and palmprints, as well as written documents, as purely physical evidence. Recovering latent fingerprints, including estimating their age, can be very important, particularly when conventional methods may not work effectively on surfaces such as metal or in unfavorable conditions. Tools such as high-resolution optical capturing devices or electromagnetic spectrum tools (infrared to X-ray) can capture potential biometrics even in a covert mode. These treatments, meant for latent fingerprint visualization, allow the physical evidence to be digitized and analyzed through numerous databases to search for offenders (Tistarelli et al. 157).

Generally, the physical or biological evidence potentially applicable in forensic biometric matching, such as fingerprints, sources of DNA, or audio or visual recordings, is collected, treated, and stored as it would be with traditional forensic techniques. This is a well-developed process of identification, documentation, collection, and preservation which follows scientific analysis and maintains the integrity of the evidence. In a forensic context, a sample obtained from the crime scene is compared against existing reference samples and needs to be of sufficiently good quality for the biometric system to perform as intended. Moreover, biometric traits on evidence need to be unique, distinctive, and robust to forensic conditions. The quality of a sample therefore depends on the integrity of evidence collection and the environmental conditions at the scene (Saini and Kapoor 3).

Tests

Although more common in military than in law enforcement contexts, one preliminary test for biometrics is Sensitive Site Exploitation (SSE). It can be defined as systematically searching for and collecting information and material from a designated location and analyzing them to answer information requirements, facilitating subsequent operations or supporting criminal prosecution (Balestrieri). Essentially, it is the use of special exploitation kits which can capture biometric data at the scene (retina, facial, DNA, and fingerprints), analyze personal documents and communications, and provide a limited but efficient overview of available intelligence. The confirmatory test, such as for fingerprints, can be considered AFIS searching. It is conducted in a secure and stable location after the evidence has been collected and processed. The fingerprint is submitted to the automated fingerprint identification system (AFIS), at which point algorithms automatically return a number of the most likely potential matches, and a human examiner has to confirm the match to avoid potential error (Kellman et al. 2).

Identification Processes and Applications

Biometrics at its core is a verification mechanism meant to identify an individual based on their physiological or behavioral traits. The expansion of biometrics can be observed in various forensic identification parameters such as face, fingerprint, iris, voice, handwriting, and others. A biometric system can operate in two modes, identification and verification. In identification mode, the system attempts to identify the individual by searching the templates stored in the database, conducting a one-to-many comparison in order to find an identity. In verification mode, the biometrics of an individual are compared to the biometric template stored in the system database, known as a one-to-one comparison (Saini and Kapoor 2).
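The difference between the two modes can be shown with a minimal sketch. The feature-vector representation, the Euclidean distance measure, and the threshold value below are illustrative assumptions for this example; real systems use modality-specific encodings (e.g. minutiae maps or IrisCodes) and far more sophisticated matchers:

```python
# Minimal sketch of the two biometric modes described above.
def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def verify(sample, enrolled_template, threshold=1.0):
    """Verification mode: one-to-one comparison against a claimed identity."""
    return distance(sample, enrolled_template) <= threshold

def identify(sample, database, threshold=1.0):
    """Identification mode: one-to-many search over the template database.
    Returns the best-scoring identity within the threshold, or None."""
    best_id, best_dist = None, float("inf")
    for identity, template in database.items():
        d = distance(sample, template)
        if d < best_dist:
            best_id, best_dist = identity, d
    return best_id if best_dist <= threshold else None

db = {"alice": [0.1, 0.9, 0.4], "bob": [0.8, 0.2, 0.7]}
print(verify([0.12, 0.88, 0.41], db["alice"]))  # one-to-one: True
print(identify([0.79, 0.22, 0.69], db))         # one-to-many: bob
```

Note that identification returns only the closest candidate within a threshold; as the next paragraph explains, operational systems present a shortlist of such candidates to a human examiner rather than treating the top match as conclusive.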

As discussed earlier, each type of biometric is inherently unique to an individual, ranging from fingerprint patterns to iris capillaries to distinctions in voice and handwriting. Modern technology uses complex sensors and analysis to identify and track these individual traits and compare them to databases in order to present the closest matches. Since these systems are still imperfect, algorithms usually err on the side of caution and require subsequent confirmation of matches by human examiners, who evaluate the parameters and final results of the algorithms.

Forensic biometrics plays a role in crime detection through a combination of applications. First, the techniques and modules of biometrics analyze evidence by overcoming human sensory, cognitive, or simply time and physical limitations, thus increasing both the efficiency and the effectiveness of investigations. Second, the methods provide a scientific and data-oriented basis for criminal investigation procedures, going beyond human capabilities by applying computer science, mathematics, and statistics to large data sets. Finally, the methods help to standardize the elements of evidence analysis and criminal identification, decreasing human biases and errors (Saini and Kapoor 2).

Case Studies

One example of forensic biometrics being utilized was the aftermath of the Boston Marathon bombing of 2013. At the behest of the FBI, tips poured in with photographs and videos. Each was meticulously analyzed, and face identifications were compared to databases of terrorists and persons of interest. However, the system at the time failed to recognize and identify the terrorists, the Tsarnaev brothers, even though government agencies had their up-to-date photographs on hand and one of them had been involved in a terrorism-related investigation, all of which was digitally recorded. The biometric system failed to recognize them clearly in several of the photographs due to the low resolution of the photos and one of the terrorists wearing sunglasses. It is a challenge that modern systems struggle with to this day, but with the implementation of the Next Generation Identification system there is potential for improvement (Saferstein and Roy). As technology improves, both in visual capture (including on commercial devices) and in the digital analytics algorithms meant to search and compare databases, the biometric system can potentially achieve greater success.

One noted benefit of the new technology and biometric databases is that it allows law enforcement to revisit cold cases. Recently, a 30-year-old murder case was resolved using the FBI's Integrated Automated Fingerprint Identification System (IAFIS). A 61-year-old man was brutally stabbed in 1978 in his apartment, after which the perpetrator stole the victim's car and escaped. Police were able to collect evidence, including latent fingerprints and palmprints, in the apartment and, later, in the recovered car, but no new leads could be found despite the fingerprints being manually compared to local and state files. In 2008, after receiving an inquiry on the case, an officer at the Omaha Police Department ran the latent fingerprints through IAFIS. After investigating possible matches, she came up with a positive identification of a known felon who was then serving prison time. An investigation was opened, placing the perpetrator in the vicinity of the crime, and subsequent DNA testing proved to be a match, allowing for the prosecution of the dangerous criminal and placing him in prison for life (“30-Year-Old Murder Solved”). The FBI has repeatedly noted that cold or contested cases can benefit significantly from modern biometric systems, as technology allows for a much wider and more efficient search of databases than ever before.

Conclusion

In modern society, the ability to identify individuals in real time, both reliably and effectively, is the foundation of various applications of biometrics in a highly networked world. Forensic biometrics seeks to take advantage of this, both in real-time and in post-event collection of evidence at a crime scene. Biometric modalities are algorithmically analyzed in complex databases which allow for the rapid identification of individuals and offenders. As technology and networks continue to improve and become prominent in society, biometric data will become central to various applications, including forensic investigations.

Works Cited

“30-Year-Old Murder Solved.” FBI, 2020. Web.

Balestrieri, Steve. “Spec Ops Forensics: Everything You Need To Know About Sensitive Site Exploitation (SSE).” SOFREP, 2019. Web.

Jain, Anil K., and Arun Ross. “Bridging the Gap: From Biometrics to Forensics.” Philosophical Transactions of the Royal Society B: Biological Sciences, vol. 370, no. 1674, 2015, pp. 1-10.

Kellman, Philip J., et al. “Forensic Comparison and Matching of Fingerprints: Using Quantitative Image Measures for Estimating Error Rates through Understanding and Predicting Difficulty.” PLoS ONE, vol. 9, no. 5, 2014, pp. 1-14.

Lee, Jason. “A Brief History of Biometrics.” Bioconnect Solutions, 2020, Web.

Saferstein, Richard, and Tiffany Roy. Criminalistics: An Introduction to Forensic Science. 13th ed., Pearson, 2020.

Saini, Monika, and Anup Kumar Kapoor. “Biometrics in Forensic Identification: Applications and Challenges.” Journal of Forensic Medicine, vol. 1, no. 2, 2016, Web.

Tistarelli, Massimo, et al. “Biometrics in Forensic Science: Challenges, Lessons and New Technologies.” Biometric Authentication, 2014, pp. 153–164.