Inspection and Testing
Inspection is an organized examination or formal evaluation exercise. In production, it involves applying measurements, tests, measuring gauges, and test equipment to certain characteristics of a material or a product. The results are normally compared with the specified requirements and standards to determine whether the material or the product is in line with these targets. Some inspection and testing methods are destructive; however, inspections are normally non-destructive.
Inspection and testing activity is a systematic and independent assessment activity in the organization. This activity provides timely, credible, and useful information, and the inspection and testing function is important for the success of the organization. Some organizations make a distinction between inspection work and testing work. The organization needs to have written, approved procedures for the inspection and testing activities to ensure that these activities comply with the quality standards as well as the applicable legislation, regulations, and standards.
Every organization is required to have inspection and testing procedures. The purpose of these procedures is to establish and define the process for inspection and testing activities which verify material or product conformance, and to verify that process inputs and outputs conform to specified requirements. Documented records of inspection and testing include evidence of conformity with the acceptance criteria and traceability to the person who carried out the inspection and testing of the material or product. These records are to be maintained.
Inspection and testing at different production stages allows for early detection of faults. The earlier a defect is found in the production process, the less expensive it is to fix. Defects are to be corrected so that yields at the inspection and testing stage can improve. Inspection systems can be visual or radiography based. Several inspection systems are automated. Automated inspection systems can be (i) sensor based, (ii) instrument based, or (iii) optical based.
Non-destructive examination (NDE) or non-destructive testing (NDT) uses a number of technologies which are used to analyze materials for either inherent flaws (such as fractures or cracks), or damage from use. Some common methods are visual, microscopy, liquid or dye penetrant inspection, magnetic particle inspection, eddy current testing, x-ray or radiographic testing, and ultrasonic testing.
Inspection and testing personnel are required to have the knowledge and skills for carrying out the inspection and testing of the materials. They are to be familiar with the requirements of the material specification. At times they are required to carry out the physical examination of the material and witness measurements in the absence of relevant standards and related test procedures. In such cases, the directions / procedures given in the purchase order / contract specifications are to be followed. General guidelines from the procedures, which are helpful in carrying out inspections, are also to be followed. Inspecting personnel are sometimes required to carry out stage inspection, witness all types of tests or prototype tests, or take part in specific joint inspections with other agencies. This requires sufficient knowledge of the subject and help from the relevant standards / guidelines / procedures.
The factors which influence inspection and testing activities are (i) market for the products, (ii) knowledge and experience of personnel, (iii) materials, (iv) availability of funds, (v) support from senior management, (vi) measuring and test equipment, (vii) organizational culture and motivation of employees, (viii) effective communication, (ix) adoption of modern techniques, (x) technology of the production processes, and (xi) organizational discipline. Fig 1 shows the activity map for inspection and testing.
Fig 1 Activity map for inspection and testing
For the inspection and testing personnel who are to perform the inspections and the tests, it is to be ensured that the individuals who perform an inspection and testing activity to verify conformance of an item to specified acceptance criteria are qualified. Further, it is to be ensured that the inspection and testing is performed by personnel other than those who performed or directly supervised the work being inspected and tested. It is also to be seen that the inspection and testing personnel do not report directly to the immediate supervisor responsible for the work being inspected and tested. The individuals who carry out the inspection and testing activity to verify conformance of the material or product to the specified acceptance criteria are to be qualified in accordance with the approved quality control plan.
Inspection and testing are done as per a procedure outlining the methods established to inspect and test finished product, work-in-progress, and / or raw materials to ensure that they meet the specification requirements. Inspection, testing, or analysis of the finished product is to be completed before delivery to a customer. Finished product testing can be defined by the supplier and the customer.
Procedures for inspection and testing activities are to include all the activities relating to the inspection and testing work. Inspection and testing personnel are to be free from personal, external, and organizational impairments to independence. Inspection and testing personnel and their department have a responsibility to maintain independence so that opinions, conclusions, judgments, and recommendations are impartial and are viewed as impartial by knowledgeable third parties. The independence criterion is to be applied to anyone in the organization who can directly influence the outcome of an inspection and testing activity.
Inspection and testing personnel are to be alert to possible impairments to independence and are to avoid such situations which can lead to reasonable doubt regarding their independence. Further, they are to adhere to the ‘standards for ethical conduct for employees of the organization’ and statutory regulations with respect to ‘conflict-of-interest’.
It is to be ensured that the measuring and test equipment is calibrated and is of the proper type, range, accuracy, and tolerance to accomplish the intended function. The control of measuring and test equipment is to ensure that (i) the measuring and test equipment is calibrated, adjusted, and maintained at prescribed intervals or before use, against reference calibration standards having traceability to nationally recognized standards, and in case no nationally recognized standards or physical constants exist, the basis for calibration is documented, (ii) the calibration standards have a greater accuracy than the required accuracy of the measuring and test equipment being calibrated, (iii) the method and interval of calibration for each equipment are defined, based on the type of equipment, stability characteristics, required accuracy, intended use, and other conditions affecting measurement control, and for measuring and test equipment used in one-time-only applications, the calibration is performed both before and after use, when practicable, (iv) the calibration is performed when the accuracy of calibrated measuring and test equipment is suspect, and (v) the calibrated measuring and test equipment is labelled, tagged, or suitably marked or documented to indicate a due date or interval of the next calibration and uniquely identified to provide traceability to its calibration data.
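The recall logic behind items (iii) and (v), i.e., tracking when each piece of measuring and test equipment falls due for calibration, can be sketched as follows. This is a minimal illustration; the class name, field names, and the 180-day interval are assumptions, not taken from any specific standard.

```python
from datetime import date, timedelta

# Hypothetical record for one measuring instrument; fields are
# illustrative, not a prescribed schema.
class Gauge:
    def __init__(self, gauge_id, last_calibrated, interval_days):
        self.gauge_id = gauge_id
        self.last_calibrated = last_calibrated
        self.interval_days = interval_days

    def next_due(self):
        # Next calibration due date from the prescribed interval;
        # this is the date the calibration label / tag should show
        return self.last_calibrated + timedelta(days=self.interval_days)

    def is_due(self, today):
        # True when the gauge is to be recalled for calibration
        return today >= self.next_due()

g = Gauge("MIC-007", date(2023, 1, 10), 180)
print(g.next_due())
print(g.is_due(date(2023, 8, 1)))
```

In practice, a gauge flagged by such a check would be withdrawn from use until recalibrated against a traceable reference standard.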
Terminology used for inspection and testing
Inspection – It is the activity such as witnessing the measurement, examination, testing, or gauging of one or more characteristics of an entity and comparing the results with specified requirements in order to establish whether conformity is achieved for each characteristic.
Accuracy – It is the degree of conformity of a measured or calculated value to its actual or specified value. Also, accuracy is the ‘correctness’ of a measurement, that is, how closely it matches the value being measured. The term ‘precision’ is not to be used for ‘accuracy’. As an example, consider two screw gauges ‘A’ and ‘B’ used for measuring the diameter of a wire rod having a diameter of 6.56678 millimetres (mm). Screw gauge ‘A’ can only display a reading of 6.56 mm, while screw gauge ‘B’ displays a reading of 6.49969 mm. Screw gauge ‘A’ is accurate (its measurement is more correct) but not precise, while screw gauge ‘B’ is precise (more detailed) but not accurate.
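The distinction can be made numerical with repeated readings: accuracy is the distance of the mean reading from the true value (bias), while precision is the scatter of the readings about their own mean. The sketch below uses the wire rod value from the example; the individual readings are illustrative assumptions.

```python
import statistics

# True diameter of the wire rod from the screw-gauge example (mm)
TRUE_VALUE = 6.56678

# Gauge A: readings close to the true value but coarsely resolved
# (accurate, less precise)
gauge_a = [6.56, 6.57, 6.56, 6.57, 6.56]
# Gauge B: finely resolved readings clustered away from the true value
# (precise, not accurate)
gauge_b = [6.49969, 6.49971, 6.49970, 6.49968, 6.49970]

def accuracy_error(readings):
    # Accuracy: how far the mean reading sits from the true value (bias)
    return abs(statistics.mean(readings) - TRUE_VALUE)

def precision_spread(readings):
    # Precision: spread of repeated readings about their own mean
    return statistics.stdev(readings)

print(accuracy_error(gauge_a), precision_spread(gauge_a))
print(accuracy_error(gauge_b), precision_spread(gauge_b))
```

Gauge A shows the smaller bias (better accuracy), while gauge B shows the smaller spread (better precision), matching the screw-gauge example above.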
Precision – Precision is a measure of how well and how detailed the result has been determined (without reference to a theoretical or true value), and of the reproducibility or reliability of the result. The fineness of scale of a measuring device normally affects the consistency of repeated measurements and, hence, the precision. The ISO (International Organization for Standardization) discourages the use of the term precision for describing scientific measuring instruments because of its several confusing everyday connotations.
Measurement – Measurement is a comparison to a standard.
Metrology – Metrology is the science and process of ensuring that a measurement meets specified degrees of both accuracy and precision. Metrology, then, is the process by which both accuracy and precision are tested and adjusted for (if necessary).
Standard – The testing of accuracy and precision of a measuring device is made through a hierarchical system in which each measuring device is compared against an external reference known as a ‘standard’. Each ‘standard’ is then tested against a higher level (more accurate and precise) ‘standard’, which is compared against an even higher ‘standard’, and so on.
Calibration – The process of comparison against a ‘standard’ and making the necessary adjustments is normally called calibration. Detailed records are maintained for each item which is calibrated to ensure ‘traceability’, and that the item meets clearly identified specifications for both accuracy and precision in all its operating parameters. Calibration refers to the process of setting the magnitude of the output (or response) of a measuring instrument to the magnitude of the input property or attribute within specified accuracy and precision.
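A common form of the adjustment step in calibration is a two-point linear correction: the instrument reads a low and a high reference standard, and a gain and offset are derived so that corrected readings match the standard values. The sketch below is illustrative; the reference values and readings are assumptions.

```python
# Two-point linear calibration sketch: derive gain/offset so that
# corrected readings agree with two reference standards.

def fit_two_point(reading_low, reading_high, std_low, std_high):
    # Gain and offset that map raw readings onto the standard values
    gain = (std_high - std_low) / (reading_high - reading_low)
    offset = std_low - gain * reading_low
    return gain, offset

def correct(raw, gain, offset):
    # Apply the calibration correction to a raw reading
    return gain * raw + offset

# Instrument reads 0.4 against a 0.0 standard and 100.6 against a
# 100.0 standard (illustrative values)
gain, offset = fit_two_point(0.4, 100.6, 0.0, 100.0)
print(correct(50.5, gain, offset))
```

After correction, the instrument reproduces both standard values exactly, and intermediate readings are corrected proportionally; the records of such an adjustment are what provide the traceability mentioned above.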
Tolerance – Tolerance is an allowance, given as a permissible range, in the nominal dimension or value specification of a manufactured item. The purpose of a tolerance is to specify the allowed leeway for imperfections in the production of the part or component. The tolerance can be specified as a factor or percentage of the nominal value, or a maximum deviation from a nominal value, or an explicit range of allowed values, or be specified by a note or published standard with this information, or be implied by the numeric accuracy of the nominal value. Tolerance can be symmetrical, as in 40 mm +/- 0.1 mm, or asymmetrical, such as 40 mm + 0.2 mm / -0.1 mm.
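A tolerance check is a simple range test, as the 40 mm examples above show. The sketch below handles both the symmetrical and asymmetrical cases by taking the allowed plus and minus deviations separately.

```python
# Tolerance check sketch for the 40 mm examples in the text.

def within_tolerance(measured, nominal, plus, minus):
    # plus / minus are the allowed deviations above and below the
    # nominal value (both given as non-negative numbers)
    return (nominal - minus) <= measured <= (nominal + plus)

# Symmetrical tolerance: 40 mm +/- 0.1 mm
print(within_tolerance(40.05, 40.0, 0.1, 0.1))   # True
# Asymmetrical tolerance: 40 mm +0.2 mm / -0.1 mm
print(within_tolerance(40.15, 40.0, 0.2, 0.1))   # True
print(within_tolerance(39.85, 40.0, 0.2, 0.1))   # False
```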
Uncertainty (of measurement) – It is the result of the evaluation aimed at characterizing the range within which the true value of a measurement is estimated to lie, normally with a given likelihood.
Measuring equipment – It consists of all of the measuring instruments, measurement standards, reference materials, auxiliary apparatus and instructions which are necessary to carry out a measurement. The term includes measuring instruments used in the course of inspection and testing, as well as those used in calibration.
Error (of measurement) – It is the result of a measurement minus a true value of the measurement. It can also be expressed as a percentage.
Limits of permissible error (of a measuring equipment) – The extreme values of an error permitted by specifications, regulations, etc. for a given measuring instrument.
Grade – Category or rank given to entities having the same functional use but different requirements for quality. Grade reflects a planned or recognized difference in requirement for quality. The emphasis is on the functional use and cost relationship. A high-grade entity can be of unsatisfactory quality and vice versa. Where grade is denoted numerically, the highest grade is normally designated as 1, with lower grades extending to 2, 3, 4, etc. Where grade is denoted by a point score, such as a number of star symbols, the lowest grade normally has the least points or stars.
Compatibility – It is the ability of entities to be used together under specific conditions to fulfill relevant requirements.
Conformity – It is the fulfillment of specified requirements.
Validation – It is the confirmation by examination and provision of objective evidence that the particular requirements for a specific intended use are fulfilled. It is also the process of establishing truth or soundness.
Accreditation – Accreditation is the process by which a facility becomes officially certified as providing services of a specified good quality so that the public can trust the quality of its service.
Traceability – Traceability refers to the completeness of the information about every step in a process chain. The term traceability is, for example, used to refer to an unbroken chain of measurements relating an instrument’s measurements to a known standard. As defined by NIST (The National Institute of Standards and Technology, USA), ‘Traceability requires the establishment of an unbroken chain of comparisons to stated references’. It is also defined as ‘The property of the result of a measurement whereby it can be related to appropriate standards, generally national or international standards, through an unbroken chain of comparisons’.
Stages of inspection and testing
Normally there are three stages of inspection and testing, which are (i) incoming material inspection and testing, (ii) in-process material inspection and testing at every stage of intermediate storage of the semi-product, and (iii) outgoing or final inspection and testing of the finished product. Quality control personnel carry out inspection and testing of each material for the parameters of its specification / standard. Hence, the quality control personnel who inspect the material are responsible for any discrepancies in the particular parameters of the material. The place of inspection largely depends upon the manufacturing conditions, circumstances, and plant layout. Normally, three types of locations are permitted for inspection and testing. They are (i) floor, (ii) centralized, and (iii) separate room.
Floor inspection and testing can be done at the shop floor / at the equipment itself. In continuous production processes where every operation is linked through conveyors, it is not advisable to carry the product to a separate place for inspection and testing. The advantages of shop floor inspection and testing are (i) it saves the transportation of material to the laboratory, (ii) it provides quick inspection and testing, and (iii) it is best suited for heavy and bulky products.
In case of a centralized / separate inspection and testing laboratory, the materials / products / samples are brought to a separate inspection and testing laboratory or centrally located laboratory for inspection and testing. The advantages of this practice are (i) inspection conditions are better since precision instruments can be used, (ii) results are more accurate because of the use of precision instruments, and (iii) there are fewer chances of the inspector being influenced.
Inspection and testing can be remedial or preventive. The major difference between the two is that preventive inspection and testing attempts prevention, while remedial inspection and testing focuses on cure. Preventive inspection and testing lays emphasis on removing assignable variables by paying special attention to the possibility of defects, so that waste is eliminated to the maximum possible extent. Preventive inspection and testing is also known as constructive inspection and testing and hence has a positive approach rather than the negative approach involved in remedial or corrective inspection and testing. Remedial inspection and testing detects parts which are defective, and hence it tries to discover defects which have already occurred. It tries to filter the good pieces from the bad ones.
Operative / stage inspection and testing is also known as key point inspection and testing. This inspection and testing takes place at each stage or at the end of some functional operations, and hence it automatically fixes the responsibility of the operator or operation which caused the defect. It almost eliminates the need for final inspection and testing, and the defective piece is nipped in the bud, thereby eliminating further wastage and the cost involved.
Incoming or receiving inspection and testing is concerned with the control of the quality of the raw material and purchased parts. It examines everything coming into the plant, e.g., materials, parts, assemblies, and equipment, etc. The received material is normally checked for (i) requirements laid down in the purchase order / contract, (ii) damages, corrosion, and cracks, etc., and (iii) suppliers’ test reports. In case of necessity, the quality control personnel inspect and test the materials at the supplier’s plant, before delivery or even when the material is in the process of production.
In-process inspection and testing examines the parts and semi-finished products in the plant at any stage of the production process. It is mainly used as a tool to anticipate and prevent subsequent production difficulties. The objectives of this type of inspection are (i) prevention of unnecessary hard work on the assembly floor, (ii) prevention of waste of large quantities of material by inspecting and testing mass production operations in the beginning as well as in subsequent operations, (iii) prevention of rework on spoiled semi-products, and (iv) ensuring against loss of semi-products while in transit from one process to another.
In-process inspection and testing are conducted to ensure that the product or process conforms to specified requirements. The inspections and tests are carried out by quality control personnel. All intermediate products are inspected and tested as per requirement of the quality plan and are controlled by documented procedures. The intermediate product is not released for further processing until the required inspection and testing have been completed and cleared for further processing.
In the final inspection and testing, the finished product is checked for appearance and specified parameters through visual examinations and the conduct of the appropriate tests. It is a sort of centralized inspection and makes use of special testing instruments. The types of inspection and testing which are conducted on the finished product are to be determined by the finished product specification.
In case of final inspection and testing, it is to be ensured that the finished products are inspected and tested for completeness, markings, calibration, adjustments, protection from damage, or other characteristics, as needed to verify the quality and conformance of the item to the specified requirements. The final inspection and testing are to include a review of the results and resolution of non-conformances identified by earlier inspections. If modifications, repairs, or replacements of items are performed subsequent to the final inspection and testing, then appropriate re-tests or re-inspections are to be performed.
The acceptance of an item is to be documented and approved by qualified and authorized personnel. The final inspection and testing documentation is to include (i) the item inspected and tested, the date of inspection and testing, and the name of the quality control person, or the quality control person’s unique identifier, who documented, evaluated, and determined acceptability, (ii) the name of the data recorder, as applicable, and the type of observation or method of inspection and testing, (iii) the inspection and testing criteria, sampling plan, or reference documents used to determine acceptance, (iv) results indicating acceptability of the characteristics inspected and tested, (v) the measuring and test equipment used during the inspection and testing, including the identification number and the most recent calibration date, and (vi) reference to information on actions taken in connection with non-conformance.
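The documentation items (i) to (vi) above can be mirrored in a simple record structure. This is a minimal sketch; the class name, field names, and sample values are assumptions for illustration, not a prescribed schema.

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative record mirroring the final inspection documentation
# items (i) to (vi); field names are assumptions.
@dataclass
class FinalInspectionRecord:
    item_id: str
    inspection_date: str
    inspector: str              # QC person name or unique identifier
    method: str                 # type of observation / method of testing
    acceptance_reference: str   # criteria, sampling plan, or reference doc
    results_acceptable: bool
    equipment_used: List[str] = field(default_factory=list)  # id + last cal. date
    nonconformance_refs: List[str] = field(default_factory=list)

rec = FinalInspectionRecord(
    item_id="HX-101", inspection_date="2023-05-04", inspector="QC-17",
    method="visual + dimensional", acceptance_reference="QCP-12 rev 3",
    results_acceptable=True,
    equipment_used=["CMM-02 (cal. 2023-03-01)"],
)
print(rec.results_acceptable)
```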
The inspection and testing control activities are to be conducted and documented in accordance with the approved quality control plan. The planning for inspection and testing includes (i) identification of documents to be developed to control and perform inspection and testing, (ii) identification of items to be inspected and tested, inspection and testing requirements, and acceptance limits, including required levels of precision and accuracy, (iii) identification of inspection and testing methods to be employed and instructions for performing the inspection and testing, (iv) identification of inspection and testing pre-requisites, addressing calibration of instrumentation, adequacy of testing equipment and instrumentation, qualifications of personnel, condition of testing equipment and the item to be tested, suitably controlled environmental conditions, and provisions for data acquisition, (v) identification of mandatory hold points and methods to record data and results, and (vi) selection and identification of the measuring and test equipment to be used to perform the test to ensure that the equipment is of the proper type, range, accuracy, and tolerance to accomplish the intended function.
The inspection and testing is to be performed in accordance with the quality control procedures and, as applicable, is to include (i) provisions for determining when a test is needed, describing how tests are performed, and ensuring that testing is conducted by trained and appropriately qualified personnel, (ii) test objectives and provisions for ensuring that prerequisites for the given test have been met, adequate calibrated instrumentation is available and used, necessary monitoring is performed, and suitable environmental conditions are maintained, (iii) test requirements and acceptance criteria provided or approved by the department responsible for the item to be tested, (iv) test requirements and acceptance criteria based on specified requirements contained in applicable design or other pertinent technical documents, and (v) potential sources of uncertainty and error. Other testing documents (e.g., national or international specifications, vendor manuals, or other related documents containing acceptance criteria) can be used instead of preparing special test procedures. If such other documents are used, then the information is to be incorporated directly into the approved test procedure, or incorporated by reference in the approved test procedure.
The test results are documented and their conformance with acceptance criteria evaluated by a qualified individual within the organization, to ensure that the test requirements have been satisfied. The test documentation includes (i) item or work product tested, date of test, names of tester and data recorders, type of observation, and method of testing, (ii) test criteria or reference documents used to determine acceptance, (iii) results and acceptability of the test, (iv) actions taken in connection with any non-conformances noted, (v) the individual evaluating the test results, and (vi) measuring and test equipment used during the test, including the identification number and the most recent calibration date.
Planning of inspection
During the planning for inspection the important things which are to be considered are (i) place of inspection, (ii) time of inspection, (iii) method of inspection, (iv) degree of inspection, (v) parameters of inspection, (vi) implements to be used for inspection, and (vii) qualification and experience of the quality control person to carry out the inspection.
The first step of the inspection activity is to decide what parameters are to be checked or inspected during the inspection. The parameters can be different for different kinds of materials. So, the quality control person carrying out the inspection is to know clearly the parameters which are to be checked, e.g., diameter, length, etc. These parameters constitute the variables to be studied in case of statistical analysis.
For the time of inspection of the product / material, there is no hard and fast rule. The practices which are normally followed are (i) inspection is done at each halt during the production process, and (ii) inspection is done after each operation in the process. These practices help in fixing the responsibility for any defective work. They also help in knowing where the quality requirements are not being followed.
The inspection activities are to be documented and controlled by written and approved instructions, procedures, drawings, check-lists, or other appropriate means. The documents for inspection include (i) identification of each work operation where inspection is necessary to ensure quality, (ii) identification of documents which are to be used to perform the inspections, (iii) identification of the characteristics for inspection and of when, during the work process, inspections are to be performed for those characteristics, (iv) identification of inspection or process-monitoring methods employed, (v) sufficient information from the final inspection to provide a conclusion regarding conformance of the item to specified requirements, (vi) identification of the functional-qualification level (category or class) of personnel performing inspections, (vii) identification of acceptance criteria, (viii) identification of sampling requirements, (ix) methods to record inspection results, and (x) selection and identification of the ‘measuring and test equipment’ to be used to perform the inspection.
Sampling in quality control is used for quality assurance. Quality assurance which relies primarily on inspection after production is called acceptance sampling. Acceptance sampling involves testing a random sample of items from a lot and deciding whether to accept or reject the entire lot based on the quality of the random sample. Quality assurance efforts which occur during production include statistical process control. Statistical process control involves testing a random sample of output from a process to determine whether the process is producing items within a pre-determined acceptable range.
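The statistical process control idea described above can be sketched with a Shewhart x-bar chart: control limits are set around the centre line of subgroup means, and a mean falling outside the limits signals an assignable cause. The subgroup data and the 3-sigma rule used below are illustrative assumptions for a minimal sketch (production charts normally derive limits from within-subgroup ranges or standard deviations).

```python
import statistics

# Subgroup means from a running process (illustrative values)
subgroup_means = [10.1, 9.9, 10.0, 10.2, 9.8, 10.0, 10.1, 9.9]

centre = statistics.mean(subgroup_means)
sigma = statistics.stdev(subgroup_means)
ucl = centre + 3 * sigma     # upper control limit
lcl = centre - 3 * sigma     # lower control limit

def in_control(mean_value):
    # A subgroup mean outside the control limits signals that the
    # process may no longer be within its pre-determined acceptable range
    return lcl <= mean_value <= ucl

print(in_control(10.0), in_control(10.9))
```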
Sampling is done by quality control personnel. The skills and knowledge of the quality control personnel are to be sufficient to provide a sound basis for their work, and to adopt an informed approach to applying the required quality sampling procedures. Further, quality control personnel are to be familiar with the safety precautions needed when working with the manufacturing equipment and when taking and testing the required samples. They are required to demonstrate safe working practices throughout the process of sampling. The process of sampling for confirming quality of materials under production involves (i) gathering samples at the appropriate level of frequency, and (ii) preparing the samples for inspection and testing.
Sampling is perhaps the most important step in the quality assessment of the material being inspected. Since a sample is just a small portion of the total material, the importance that the sample is representative of the material being inspected cannot be over-emphasized. Any test performed on the sample, regardless of how carefully and accurately performed, is worthless unless the sample is truly representative of the material being inspected.
Acceptance sampling is a statistical measure used in quality control. It allows the organization to determine the quality of a batch of products by selecting a specified number for testing. The quality of this designated sample is viewed as the quality level for the entire group of products.
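A common single-sampling plan makes this concrete: sample n items from the lot and accept the lot if at most c defectives are found. The plan values below (n = 50, c = 2) are illustrative assumptions; the probability of acceptance uses the binomial approximation for a large lot.

```python
from math import comb

# Single acceptance-sampling plan sketch (illustrative plan values)
N_SAMPLE = 50   # sample size n
C_ACCEPT = 2    # acceptance number c

def lot_decision(defects_in_sample):
    # Accept the whole lot if the sample has at most c defectives
    return "accept" if defects_in_sample <= C_ACCEPT else "reject"

def prob_accept(defect_rate):
    # Operating-characteristic value: P(at most c defectives in n draws),
    # binomial approximation for a large lot
    return sum(
        comb(N_SAMPLE, k) * defect_rate**k * (1 - defect_rate)**(N_SAMPLE - k)
        for k in range(C_ACCEPT + 1)
    )

print(lot_decision(1), lot_decision(4))
print(round(prob_accept(0.01), 3))   # good lots are almost always accepted
```

Evaluating `prob_accept` over a range of defect rates gives the operating-characteristic curve of the plan, which shows how sharply it separates good lots from bad ones.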
Visual inspection and testing
Visual inspection – It provides a means of detecting and examining a variety of surface flaws, such as corrosion, contamination, surface finish, and surface discontinuities on joints (for example, welds, seals, and solder connections). Visual inspection is also the most widely used method for detecting and examining surface cracks which are particularly important because of their relationship to structural failure mechanisms. Even when other inspection techniques are used to detect surface cracks, visual inspection frequently provides a useful supplement. For example, when the eddy current examination of process tubing is performed, visual inspection is frequently performed to verify and more closely examine the surface disturbance. In some cases, acid etching (macro-etching) can be used to reveal structures which are not visible to the naked eye, such as flow lines in Fig 2. The figure shows flow lines in closed die forged AISI 4140 (UNS G41400) alloy steel grade steering knuckle revealed by cold deep acid etching with 10 % aqueous HNO3 (0.5 ×) and enhanced with inking.
Fig 2 Flow lines revealed by cold deep acid etching
Given the wide variety of surface flaws which can be detected by visual examination, the use of visual inspection can encompass different techniques, depending on the product and the type of surface flaw being monitored. The methods of visual inspection involve a wide variety of equipment, ranging from examination with the naked eye to the use of interference microscopes for measuring the depth of scratches in the finish of finely polished or lapped surfaces.
Coordinate measuring machines (CMMs) – These machines are used to inspect the dimensions of a finished product. CMMs consist of the machine itself and its probes and moving arms for providing measurement input, a computer for making rapid calculations and comparisons based on the measurement input, and the computer software which controls the entire system. Fig 3 shows an example of a CMM probe taking measurements on a machined stiffener. Coordinate measuring machines are primarily characterized by their flexibility, being able to make several measurements without adding or changing tools.
Fig 3 Measurement with CMM probe
Historically, traditional measuring devices and CMMs have been largely used to collect inspection data on which to make the decision to accept or reject parts. Although CMMs continue to play this role, producers are placing new emphasis on using CMMs to capture data from several sources and bringing them together centrally where they can be used to control the production process more effectively and preventing defective products from being produced. In addition, CMMs are also being used in entirely new applications, for example, reverse engineering and computer-aided design, and manufacture (CAD / CAM) applications as well as innovative approaches to production, such as the flexible manufacturing systems, manufacturing cells, machining centres, and flexible transfer lines.
Machine vision – Machine vision emerged as an important new technique for industrial inspection and quality control in the early 1980s. When properly applied, machine vision can provide accurate and inexpensive inspection of workpieces, hence dramatically increasing product quality. Machine vision is also used as an in-process gauging tool for controlling the process and correcting trends which can lead to the production of defective parts. Some industries, such as the automotive and electronics industries, make heavy use of machine vision for automated high volume, labour intensive, and repetitive inspection operations. The ability to acquire an image, analyze it, and then make an appropriate decision is very useful in inspection and quality control applications.
Machine vision systems can be used for a variety of functions, including (i) identification of shapes, (ii) measurement of distances and ranges, (iii) gauging of sizes and dimensions, (iv) determining orientation of parts, (v) quantifying motion, and (vi) detecting surface shading. These capabilities allow users to employ machine vision systems for cost-effective and reliable 100 % inspection of workpieces.
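As a minimal illustration of the gauging function, the sketch below thresholds one scan line of a synthetic grayscale image and converts the count of bright pixels into a feature width; the pixel calibration and the threshold are assumed values chosen for the example.

```python
# Gauge a feature width from one row of a synthetic grayscale image.
# The mm-per-pixel calibration and the threshold are assumptions.

MM_PER_PIXEL = 0.05   # hypothetical calibration from a reference target
THRESHOLD = 128       # gray level separating the part from the background

# Synthetic scan line: dark background, bright part in the middle.
scan_line = [20] * 30 + [220] * 84 + [20] * 30

bright_pixels = sum(1 for p in scan_line if p >= THRESHOLD)
width_mm = bright_pixels * MM_PER_PIXEL   # 84 pixels * 0.05 mm/pixel
```

A real system would of course work on full two-dimensional images with calibrated optics, but the measure-compare-decide structure is the same.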
Hardness testing – Hardness testing is one of the simplest and most widely used inspection methods. It is a non-destructive method which can be used to predict the strength of metals. Fig 4 shows the correlation between tensile strength and hardness for steels, brass, and nodular cast iron. Heat-treated steels are subjected to hardness testing to verify that the heat treatment has produced the correct hardness and hence the correct strength.
Fig 4 Comparison of hardness with tensile strength
The most common types of hardness tests are indentation methods. These tests use a variety of indentation loads ranging from 1 gf (gram-force, micro-indentation) to 3,000 kgf (kilogram-force, Brinell). Low and high-powered microscopes (Brinell, Vickers, and micro-indentation) are used to measure the resulting indentation diagonals from which a hardness number is calculated using a formula. In the Rockwell test, the depth of indentation is measured and converted to a hardness number, which is inversely related to the depth.
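For the Brinell case, the hardness number follows directly from the applied load and the measured indentation diameter. The sketch below uses the standard Brinell formula; the load and indentation values are illustrative, and the 3.45 MPa-per-HB conversion to approximate tensile strength is a common rule of thumb for steels, not a substitute for an actual tensile test.

```python
import math

def brinell_hardness(load_kgf, ball_dia_mm, indent_dia_mm):
    """Standard Brinell formula: HB = 2F / (pi * D * (D - sqrt(D^2 - d^2)))."""
    D, d = ball_dia_mm, indent_dia_mm
    return (2 * load_kgf) / (math.pi * D * (D - math.sqrt(D * D - d * d)))

# Example: 3,000 kgf load, 10 mm ball, 4.0 mm indentation (illustrative values)
hb = brinell_hardness(3000, 10.0, 4.0)     # roughly 229 HB
uts_approx_mpa = 3.45 * hb                 # rule-of-thumb UTS estimate for steels
```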
Normally, the scale to use for a specified material is indicated on the engineering design drawings or in the test specifications. However, at times the scale is to be determined and selected to suit a given set of circumstances. Hardness testing has several applications in quality control, materials evaluation, and the prediction of properties. Since hardness testing is non-destructive and quick, it is a very useful tool for production and process control. For example, the most common application of the Rockwell test is testing steels which have been hardened and tempered. If a hardened and quenched steel piece is tempered by reheating at a controlled and relatively low temperature and then cooled at a controlled rate for a controlled time, it is possible to produce a wide range of desired hardness levels. By using a hardness test to monitor the end results, the operator is able to determine and control the ideal temperatures and times so that a specified hardness can be achieved.
When large populations of materials make testing each workpiece impractical and tighter control of a product is needed, statistical process control (SPC) is normally incorporated. Statistical control enables continuous production with minimum testing and a high level of quality. Since hardness tests can be done rapidly, they are well suited for use with SPC techniques. However, proper testing procedures are to be followed to ensure the high degree of accuracy necessary when using SPC.
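As a sketch of how SPC can be combined with rapid hardness testing, the example below computes simple three-sigma control limits on subgroup mean hardness. The Rockwell readings are invented for illustration, and production charts normally use tabulated control-chart constants rather than the plain three-sigma limits shown here.

```python
import statistics

# Hypothetical subgroups of Rockwell C readings taken during production.
subgroups = [
    [45.1, 44.8, 45.3, 45.0, 44.9],
    [45.2, 45.0, 44.7, 45.1, 45.3],
    [44.9, 45.4, 45.0, 44.8, 45.2],
    [45.0, 45.1, 44.9, 45.2, 45.0],
]

means = [statistics.mean(g) for g in subgroups]
grand_mean = statistics.mean(means)
sigma = statistics.stdev(means)       # spread of the subgroup means

ucl = grand_mean + 3 * sigma          # upper control limit
lcl = grand_mean - 3 * sigma          # lower control limit

# A subgroup mean outside [lcl, ucl] signals that the process may have shifted.
in_control = all(lcl <= m <= ucl for m in means)
```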
Tensile testing – The tensile test is the most common test used to evaluate the mechanical properties of the materials. Tensile testing is normally conducted by the material producer and the results are supplied to the user as part of the material certification sheet. Since the tensile test is a destructive test, it is not performed directly on the supplied material. For wrought materials, the test samples are taken from the same heat or lot of material which is supplied. In the case of castings, separate test bars are cast at the same time as the part casting and from the same material used to pour the part casting.
Although the tensile test is not normally conducted by the user of the metal product, it is important for the user to understand the test and its results. Unless the material specification needs an elevated temperature test, the tensile test is normally conducted at room temperature. Typical values reported on the material certification include the yield strength, the ultimate tensile strength, and the percent elongation. Since the modulus of elasticity is a structure-insensitive property and is not affected by processing, it is normally not reported.
The main advantages of the tensile test are (i) the stress state is well established, (ii) the test has been carefully standardized, and (iii) the test is relatively easy and inexpensive to perform. The tensile properties of a material are determined by applying a tension load to a sample and measuring the elongation or extension in a load frame. The load can be converted to the stress by dividing the load by the original cross-sectional area of the sample. The strain can be calculated by dividing the change in gauge length by the original gauge length.
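The load-to-stress and extension-to-strain conversions described above can be sketched in a few lines. The sample geometry and load values below are illustrative only.

```python
import math

# Illustrative tensile sample: 10 mm diameter round bar, 50 mm gauge length.
diameter_mm = 10.0
gauge_length_mm = 50.0
area_mm2 = math.pi * diameter_mm ** 2 / 4   # original cross-section, ~78.54 mm^2

load_n = 50_000.0        # applied tension load, N
extension_mm = 0.25      # measured change in gauge length

stress_mpa = load_n / area_mm2              # N / mm^2 = MPa
strain = extension_mm / gauge_length_mm     # dimensionless
```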
The shape and magnitude of the stress-strain curve of a metal depend on its composition, heat treatment, prior history of plastic deformation, and the strain rate, temperature, and state of stress imposed during the testing. The parameters used to describe the stress-strain curve of a metal are the tensile strength, yield strength or yield point, percent elongation, and percent reduction in area. The first two are strength parameters and the last two are indications of ductility. The yield strength (YS) is the stress needed to produce a small specified quantity of plastic deformation. The normal definition of this property is the offset yield strength, determined as the stress at which the stress-strain curve intersects a line offset from its initial straight-line portion by a specified strain. For metals without a definite yield point, the yield strength is determined by drawing a straight line parallel to the initial straight-line portion of the stress-strain curve, normally offset by a strain of 0.2 % (0.002). The ultimate tensile strength (UTS) is the maximum stress which occurs during the test.
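The 0.2 % offset construction can be carried out numerically on digitized stress-strain data: walk along the curve until it first falls below the offset elastic line, then interpolate between the bracketing points. The modulus and data points below are invented for the sketch.

```python
def offset_yield_strength(strains, stresses, modulus_mpa, offset=0.002):
    """Find the stress where the curve crosses the line
    stress = modulus * (strain - offset), by linear interpolation."""
    for i in range(1, len(strains)):
        d0 = stresses[i - 1] - modulus_mpa * (strains[i - 1] - offset)
        d1 = stresses[i] - modulus_mpa * (strains[i] - offset)
        if d0 > 0 >= d1:                 # curve drops below the offset line here
            t = d0 / (d0 - d1)
            return stresses[i - 1] + t * (stresses[i] - stresses[i - 1])
    return None                          # curve never crossed the offset line

# Hypothetical digitized curve for a steel, E = 200 GPa = 200,000 MPa.
strains  = [0.0, 0.001, 0.002, 0.004, 0.006, 0.010]
stresses = [0.0, 200.0, 380.0, 430.0, 450.0, 460.0]  # MPa
ys = offset_yield_strength(strains, stresses, 200_000.0)
```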
Although the tensile strength is the value most frequently listed from the results of tensile testing, it is not normally the value which is used in design. Static design of ductile metals is normally based on the yield strength, since most designs do not allow any plastic deformation. However, for brittle metals which do not display any appreciable plastic deformation, tensile strength is a valid design criterion.
Measures of ductility which are obtained from the tension test are the strain at fracture and the reduction of area at fracture. Both are normally expressed as percentages, with the strain at failure frequently reported as the percent elongation.
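Both ductility measures are simple ratios of the final and original sample dimensions. The gauge lengths and areas below are example values, not measured data.

```python
def percent_elongation(l0_mm, lf_mm):
    """Percent elongation from original and final gauge lengths."""
    return (lf_mm - l0_mm) / l0_mm * 100.0

def percent_reduction_in_area(a0_mm2, af_mm2):
    """Percent reduction in area from original and final cross-sections."""
    return (a0_mm2 - af_mm2) / a0_mm2 * 100.0

# Illustrative broken sample: 50 mm gauge stretched to 62 mm,
# 78.5 mm^2 section necked down to 45.0 mm^2 at the fracture.
el = percent_elongation(50.0, 62.0)           # 24.0 %
ra = percent_reduction_in_area(78.5, 45.0)    # ~42.7 %
```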
Chemical analysis – The overall chemical composition of metals and alloys is most commonly determined by x-ray fluorescence (XRF) and optical emission spectroscopy (OES). While these methods work well for most elements, they are not useful for dissolved gases and some non-metallic elements which can be present in metals as alloying or impurity elements. High temperature combustion and inert gas fusion methods are typically used to analyze for dissolved gases (oxygen, nitrogen, hydrogen) and, in some cases, carbon and sulphur in metals.
A number of methods can be used to obtain information about the chemistry of the first one to several atomic layers of samples of metals, as well as of other materials, such as semi-conductors and various types of thin films. Of these methods, the scanning Auger microprobe (SAM) is the most widely used.
Metallography – Metallography is the scientific discipline of examining and determining the constitution and the underlying structure of the constituents in metals and alloys. The objective of metallography is to accurately reveal material structure at the surface of a sample and / or from a cross-section of sample. For example, cross-sections cut from a component or sample can be macroscopically examined by light illumination in order to reveal different important macrostructural features (on the order of 1 mm to 1 m), such as (i) flow lines in wrought products, (ii) solidification structures in cast products, (iii) weld characteristics, including depth of penetration, (iv) fusion zone size and number of passes, (v) size of heat affected zone, and type and density of weld imperfections, (vi) normal size and distribution of large inclusions and stringers, (vii) fabrication imperfections, such as laps, cold welds, folds, and seams, in wrought products, (viii) gas and shrinkage porosity in cast products, and (ix) depth and uniformity of a hardened layer in a case hardened product.
Macroscopic examination of a component surface is also necessary in evaluating the condition of a material or the cause of failure. This can include (i) characterization of the macrostructural features of fracture surfaces to identify fracture initiation site and changes in crack propagation process, (ii) estimations of surface roughness, grinding patterns, and honing angles, and (iii) evaluation of coating integrity and uniformity.