Metal Alloy Analysis - Accurate, Reliable and Finally Rugged
Nate Newbury, Regional Sales Representative
Rigaku Analytical Devices
They say, “don’t fix what isn’t broken” — but what if the status quo gets the job done while leaving a lot of room for improvement? The truth is, legacy technologies for metal alloy analysis have hampered profitability and product quality while creating headaches and an overall lack of operational efficiency. These outdated modes of metal alloy analysis also fall short of the fast, accurate metal grade identification that industries need today.
In reality, many industries involved in the metals lifecycle struggle with incorrect identification of metal alloys. This misidentification can lead to slim profit margins, stalled revenue growth, and inefficient operations. Though a Material Test Report (MTR) can usually help identify metal alloys, MTR information is not always available, or cannot be associated with the correct batch of material. The resulting mix-ups and mislabeling lead to errors, inefficiencies, and increased costs.
Numerous industries require accurate metal alloy verification after a product or component is installed, especially when safety is mission-critical. Additional testing and confirmation are often required if the metal alloy materials being utilized have not been correctly verified, or have been in place for a period of time, in order to ensure that the specific elemental composition has not changed due to industrial processes or environmental factors. If the metal alloy materials they work with are not correctly verified, manufacturers and suppliers put themselves and their reputations at risk through regulatory violations or inadequate workplace safety measures. As a result, it is crucial to be able to quickly and accurately identify and classify the elemental composition of all metal-based materials, products, or components used in the production cycle.
As an experienced metallurgist working in the field with clients, I have worked diligently to modernize metal analysis solutions. Committed to helping the metal industry test metals faster and more safely with handheld laser-induced breakdown spectroscopy (LIBS) technology, I have been instrumental in guiding Rigaku’s development of its metals inspection and testing products and services.
Laser-induced breakdown spectroscopy is a form of atomic emission spectroscopy typically used for metal alloy identification. LIBS directs highly focused, energetic laser pulses at the sample surface, converting a small mass of material into plasma. The excited atoms and ions in the plasma emit their characteristic emission lines, allowing for fast and straightforward identification. Already in use for almost 20 years in laboratory and research environments, LIBS is better suited than handheld X-ray fluorescence (XRF) for the analysis of light elements such as lithium (Li), aluminum (Al), magnesium (Mg), and beryllium (Be). As a result, LIBS is the method of choice for identifying the most difficult alloy grades, including aluminum grades.
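The core of the identification step described above is matching observed spectral peaks to the known characteristic emission lines of each element. A minimal sketch of that matching logic is shown below; the wavelengths are approximate values for well-known strong atomic lines and serve only as an illustration — a real instrument relies on a full spectral database and far more sophisticated peak fitting.

```python
# Illustrative sketch: mapping LIBS spectral peaks to elements.
# Wavelengths (nm) are approximate values for strong atomic emission
# lines; they are examples, not a complete or instrument-grade database.
EMISSION_LINES_NM = {
    "Al": 396.15,  # strong Al resonance line
    "Mg": 285.21,
    "Cu": 324.75,
    "Li": 670.78,
}

def identify_elements(peak_wavelengths_nm, tolerance_nm=0.5):
    """Match observed peak wavelengths to known element lines
    within a small tolerance, returning the elements found."""
    found = set()
    for peak in peak_wavelengths_nm:
        for element, line in EMISSION_LINES_NM.items():
            if abs(peak - line) <= tolerance_nm:
                found.add(element)
    return sorted(found)

# Peaks from a hypothetical aluminum-alloy spectrum:
print(identify_elements([396.2, 285.3, 324.6]))  # -> ['Al', 'Cu', 'Mg']
```

The tolerance window stands in for the spectrometer’s resolution; tightening it reduces false matches between closely spaced lines.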
Typical handheld XRF units are delicate and expensive, with components placed within millimeters of the metal alloy they analyze. This leaves them susceptible to frequent damage, especially if the unit is dropped. LIBS provides several benefits for metal alloy analysis. More rugged than other handheld technologies, like XRF, LIBS can be used in the harshest of environments, leading to a lower cost of ownership over the life of the product. Because LIBS also utilizes a laser, radiation licensing is not required, as it is with a handheld XRF that utilizes an open beam X-ray source. Finally, LIBS is recognized by the American Petroleum Institute (API) as an acceptable technology used in the Recommended Practice 578 (API RP 578), 3rd edition, which provides guidelines for quality assurance in the oil and gas industry.
The Rigaku KT-100S handheld LIBS metal analyzer can be used for the detection and quantification of elemental composition for both heavy and light elements, providing alloy grade identification in seconds. The analysis allows for minimal sample preparation with the possibility of in situ or remote analysis.
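Grade identification, as opposed to raw elemental detection, means comparing the measured composition against the specification ranges of known alloy grades. The sketch below illustrates that comparison; the composition ranges are approximate published values for two common aluminum grades, included only as an example, and a commercial analyzer ships with a much larger grade library.

```python
# Illustrative grade-matching sketch: compare a measured composition
# (wt%) against nominal specification ranges. Ranges are approximate
# published values used only for illustration.
GRADE_SPECS = {
    "Al 7075": {"Zn": (5.1, 6.1), "Mg": (2.1, 2.9), "Cu": (1.2, 2.0)},
    "Al 6061": {"Mg": (0.8, 1.2), "Si": (0.4, 0.8), "Cu": (0.15, 0.40)},
}

def match_grade(composition):
    """Return every grade whose specified element ranges all
    contain the corresponding measured concentration."""
    matches = []
    for grade, spec in GRADE_SPECS.items():
        if all(lo <= composition.get(el, 0.0) <= hi
               for el, (lo, hi) in spec.items()):
            matches.append(grade)
    return matches

# Hypothetical measurement from a LIBS analysis:
measured = {"Zn": 5.6, "Mg": 2.5, "Cu": 1.6}
print(match_grade(measured))  # -> ['Al 7075']
```

Real grade libraries also account for maximum-only limits, balance elements, and measurement uncertainty, but the range-containment check above is the essential idea.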
The KT-100S is also specifically designed for rugged environments and the harsh reality of handheld testing, and it has successfully passed rigorous durability tests. To verify protection in demanding work environments, the KT-100S underwent strict testing to United States Military Standard MIL-STD-810G, including vibration, shock, and drop testing to evaluate its durability and performance under environmental stress. In addition, its IP54 rating and safety window protect against dusty and wet environments. As the first handheld analyzer to have passed these tests, it is truly optimized for rugged use.
The KT-100S changes the cost-of-ownership paradigm. For starters, the KT-100S requires minimal maintenance and is typically rated at several thousand hours of use (about five years). Additionally, the KT-100S does not require any recalibration other than periodic checks with Al 7075/SS 316 system check samples (though a full recalibration and recertification service is always available if needed). What do these specifications mean for the typical metal alloy user? Assuming normal wear and tear, repair costs for the KT-100S over five years are significantly lower than those for traditional handheld XRF.