INSTRUMENTATION TECHNICAL CONTENT

Heat Treat Radio #131: Beyond Calibration — Real-World Accuracy in Heat Treat Measurement


What does it really take to achieve accurate temperature measurement in real heat treat production? In this episode of Heat Treat Radio, host Heather Falcone sits down with Dr. Steve Offley, product marketing manager at PhoenixTM, to explore the science behind thru-process monitoring, thermal barriers, and data logger performance. From cold junction compensation to real-world shop floor challenges, they unpack why lab accuracy doesn’t always translate to production — and what heat treaters can do about it. Tune in to learn how to ensure your temperature data is as reliable as the parts you produce.

Below, you can watch the video, listen to the podcast by clicking on the audio play button, or read an edited transcript.




The following transcript has been edited for your reading enjoyment.

Introduction (00:04)

Heather Falcone: Today we are talking about a feature article coming up in this month’s magazine, Achieving Accurate Measurements in Real Heat Treat Production. Joining me today is author of this piece, Dr. Steve Offley from PhoenixTM, who is also our sponsor for today’s episode.

Steve is a product marketing manager at PhoenixTM with responsibility for both strategic product management and global marketing of the company’s thru-process temperature, optical profiling, and TUS system product range. Steve joined PhoenixTM in April of 2018 after 22 years of experience in the industrial temperature profiling market with another well-known company.

The Role of PhoenixTM (2:35)

Heather Falcone: Tell us about your role at PhoenixTM and the role of PhoenixTM, specifically how they provide solutions for the thermal processors out there.

Steve Offley: I am the product marketing manager or product manager for the range of temperature monitoring systems that we offer to the wider industrial space. We provide support for clients in a range of industries who are faced with the daily challenges of using thermal processing as part of their key manufacturing step. We offer unique solutions for those specific applications, because not every application is the same. Our goal is to allow the customer to monitor the temperature of their specific product in some form of heat treatment process.

For instance, we could be offering a solution for the coating market, where a client wants to monitor the thermal cure of a car body. They want to ensure that that car body, as it travels through the curing oven, is achieving the correct temperature, not just in the oven itself, but at the product level. So is each part of that car body achieving the right temperature for the right duration to cure the paint?

Another day we might be dealing with a food processor who, as you can imagine, when they’re dealing with food safety and HACCP requirements, they want to prove that the core of their product, which may be a chicken fillet in a deep fat fryer, is achieving the right temperature, to make sure that it’s safe and it’s an attractive product to eat. And of course, they want to be confident that the consumer is going to be healthy after consuming the product too.

In the buildings and ceramics industry, for instance, we can offer the same sort of solution for the manufacturers of bricks, building materials, tiles, etc., where the process may actually be up to three or four days long where they’re drying the products. But it’s still critical to know what the temperature is at the product level.

Much of our focus is on heat treatment of metals. We are trying to provide different solutions across the whole gamut of the heat treating industry — from primary production, such as slab heat treatment for steel and for aluminum, proving that the raw material has been processed correctly in the furnace, to the finished product.

We are talking about the formed metal product, making sure that that is achieving the right primary metallurgical properties. It needs to do the function it’s going to be used for, from a temperature profiling perspective and also possibly even a temperature uniformity survey (TUS), which is obviously critical in many of the automotive and aerospace sectors of the market where they’re trying to prove or validate the furnace performance.

What is Thru-Process Monitoring? (6:08)

Heather Falcone: Can you explain what thru-process is and how it influences the monitoring technology that you’re talking about?

Click the image above for an introduction to the thru-process concept and data logger basics.

Steve Offley: “Thru-process” is the term that is key to the type of solution we are trying to provide. When we’re talking about heat treatment, there are still many applications where the product may be heat treated in a static box furnace, in which case the traditional technology of using trailing thermocouples is probably as easy as any other, whereby you have your field test instrument external to the furnace chamber. The thermocouples are then trailed into the furnace, attached either to the product or, if you’re doing a temperature uniformity survey, to a test frame to locate the thermocouples at the desired coordinates within the working zone of the furnace. Then you are collecting the data externally.

I believe it was two episodes ago that you had Dennis from ECM talking about modular heat treatment. He was talking about the challenges, and the increased level of technology, associated with moving batches of products around the heat treating cycle in a modular approach.

When you have that type of setup, and even in situations where you may be heat treating in a continuous furnace, the use of a trailing thermocouple becomes difficult at best, impractical and problematic in terms of safety at worst. For the modular approach, you have thermocouples going into different chambers and moving around. There are seals and automated doors in which the thermocouples will be trapped. As such, it’s very difficult to actually monitor the whole sequence of events that may be occurring in the heat treatment sequence.

Traditional vs. Thru-Process Monitoring

This brings us back to the thru-process methodology. At Phoenix, we offer a system that is designed to travel as if it were part of the product basket through the process. The field test instrument, the data logger, has to travel with the product through the furnace.

A data logger in its own right is not capable of going through a furnace if you are measuring at 800°F to 1000°F. One of the key aspects of our system solution is what we refer to as the thermal barrier. It is an enclosure that is used to protect the data logger to allow it to travel through the process. Essentially you encase the data logger inside the barrier and then place it on the conveyor or in the product basket, with short thermocouples that are then routed to the product or to the test frame that is being moved with the whole monitoring system through the process.

The Importance of Thermal Barrier Design (9:38)

Heather Falcone: The thermal barrier design is really important then, because you’re going to see a variety of environments. How do you protect the data logger?

Steve Offley: That’s the crux of the technology that we’re trying to provide, in so much that there are many different forms of heat treatment or many different forms of thermal processing where we’re trying to provide the protection we need.

You may have, for instance, a low pressure carburizing process where you’re putting the system into a vacuum furnace, and then you may have a high pressure quench at the end. You have to protect the logger not just from the temperature inside the furnace, but from things like pressure changes, which can distort the equipment. That calls for one barrier design that gives additional protection against distortion or compressional damage to the barrier.

Click on the image above to find a real-world companion to Dr. Offley’s barrier design examples, covering oil and water quench protection in practice.

There may be some circumstances, like with a T6 aluminum process, where you send the system through the furnace and then into a water quench. In the thru-process principle, the equipment has to go through all aspects of the process. You therefore have to have a design in which the system can tolerate not only the heating process, but also the rapid cooling going into the water.

You may also have a situation where you have an endothermic carburizing furnace with an integrated oil quench. The same approach applies. You are going from a hot environment into rapid cooling. You are protecting the logger not only from the damage of the heat, but also from the quench media, like the oil or the water in those quench scenarios, so as not to damage the fairly sophisticated electronics of the data logger.

There is a lot of science and technology involved in designing unique solutions to meet the specific requirements of the applications. In most cases, we are working with the client from their working spec to develop unique solutions that will meet their unique requirements.

Protection and Accuracy (13:36)

Heather Falcone: When we’re talking about actually monitoring the surveys, what special measures go into the design of the data loggers to provide accuracy?

Steve Offley: By the very nature of sending the data logger through the furnace, we have to be careful that we are not only protecting the data logger from physical damage, which is possible if we do not get the thermal barrier design correct, but also guaranteeing that we are achieving the accurate data we need to make sense of the profile information we are getting. At the end of the process, you either have the thermal fingerprint of your process or, if you’re doing a temperature uniformity survey, the readings at the respective test levels. According to the CQI-9 and AMS2750 standards, the accuracy of the reading from the field test instrument has to be within ±1.0°F.

The purpose of the barrier is to not only protect the data logger from damage, but keep the data logger at a working temperature that allows the accuracy of the reading that conforms to the standards that you are working to.

There are different designs of thermal barriers that we can offer. The basic design is what we refer to as microporous insulation technology. This is basically a dry barrier whereby the insulation slows down the penetration of heat to the core of the barrier where the data logger is. At the center of that barrier, there will be a device that we refer to as a heat sink. There’s a eutectic salt inside the heat sink, which will change its physical state from a solid to a liquid at a nominal temperature. The transfer occurs at 58°C (136°F), and the phase change holds the data logger at that working temperature.

For longer processes, you may want to use a thermal barrier that uses what we call a phased evaporation protection methodology. In simple terms, it involves the use of water, which is able to absorb very large amounts of energy and heat, and obviously will boil at 212°F (100°C). While it maintains that boiling state, it maintains the temperature of the thermal barrier and the data logger inside it. So we can actually offer a high temperature data logger that is capable of operating safely at 212°F for long periods of time and still be protected.

Thermocouple Use (17:00)

Steve Offley: As long as the barrier provides us with that thermal protection and the logger is working within its operating range, we are fairly safe. That being said, we have to be a little bit careful when we consider the technology of the thermocouple, because there’s some fairly serious restrictions on thermocouple use, which many people may or may not be aware of.

Many people know that thermocouple technology was developed by Thomas Seebeck back in about 1821. He was a German physicist who discovered that if you had two dissimilar metals connected at a junction, at a particular temperature those two dissimilar metals would create a millivolt reading, and that millivolt reading would be proportional to the actual temperature that those two dissimilar metals were experiencing. Hence the theory of the thermocouple.

Dr. Steve Offley showing the Alumel and the Chromel leg of a type K thermocouple.

Most people are fully aware of what a thermocouple looks like, but it’s important to note that this is a type of thermocouple we’d use for a coating application. It has PFA-insulated sleeving on it. You would not use this in many heat treatment applications, but what I want to show you is that in the core of the thermocouple, there are two wires, two dissimilar metals. This is a type K thermocouple. We have the Alumel and the Chromel leg of the thermocouple. These are the unique materials that are used to generate the millivolt reading.

The way the thermocouple works, then, is that the millivolt reading can be cross-referenced to a calibration table, or voltage table, to determine the temperature reading at the sensor. The very tip of the thermocouple is what we refer to as the hot junction. It’s critical that that point is where you want the measurement to be made.

What is often missed is the fact that with a thermocouple, although the hot junction is critical, there is another junction that is even possibly more critical and sometimes overlooked — the cold junction. The thermocouple does not actually record an absolute reading, it’s a ratio between the hot junction and the cold junction. The cold junction of a thermocouple is where the actual thermocouple materials, the two dissimilar metals, join what we refer to as the copper connection. This tends to be where, in the data logger or the field test instrument, the electronics make the physical measurement or process the actual reading from the thermocouple.

If you have a fixed data logger like we have at Phoenix, whereby you would designate the type of thermocouple you were plugging into the data logger — this is a data logger with 20 channels and it’s a type K — I would plug my thermocouple simply into that connector. The cold junction is not in this case at the point where I’m making the connection on the data logger. It is actually inside the logger because there is another wire that goes from the socket to the PCB board where the measurement is actually taken.

Inside the data logger, there is a connector block where the thermocouple wires from the thermocouple sockets will all join the PCB board where the measurement is taken. That’s the location where the cold junction measurement is taken. So, we have our hot junction at the end of the thermocouple, and we have our cold junction inside the data logger.

A side-by-side comparison of the two critical measurement points in any thermocouple circuit: hot junction vs. cold junction.

For some data loggers, that connector or that cold junction may actually be on the outside of the data logger, if it’s a universal connector. So it’s important that you understand where that cold junction sits within your technology. The importance of the reading is the fact that you have a ratio between the hot junction, where you are measuring the product, and the cold junction, where that physical measurement is being referenced inside the data logger.

You can imagine, therefore, if the data logger temperature changes, that change in data logger temperature can actually affect the reading you are taking inside your process. That is why it is important to either understand that and make changes so it’s prevented or do what we refer to as cold junction compensation.

Cold Junction Compensation (22:35)

Steve Offley: Inside the data logger, if you are going to compensate for that temperature difference, you have to recognize that although the data logger is protected up to a physical temperature, its temperature is still going to change. That cold junction is going to change as the logger travels through the process in the way that we do our measurements for thru-process monitoring. The logger will rise in temperature. Therefore, we have to compensate for that.

In the center of the data logger, where the connection is made from the copper of the thermocouple cable to the copper-copper connection, we have a temperature sensor, a thermistor, which is accurate to 0.18°F. It measures the actual cold junction temperature of the logger and will then compensate automatically for it. Therefore, you can guarantee that even when your data logger changes temperature, it is compensating for that. There will be no drift in the temperature that you are measuring at the hot junction at the product level.

Data logger temperature change over process time, with and without cold junction compensation, measuring a stable process temperature of 1470°F | Image Credit: PhoenixTM, taken from Achieving Accurate Measurements in Real Heat Treat Production in the March print edition of Heat Treat Today.
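The cold junction compensation Dr. Offley describes can be sketched in a few lines. This is an illustrative toy, not PhoenixTM firmware: it substitutes a single linear Seebeck coefficient for type K in place of the real ITS-90 reference polynomials, and all temperatures and function names are hypothetical.

```python
# Illustrative cold junction compensation (CJC) for a type K thermocouple.
# Real instruments use the published ITS-90 reference functions; a simplified
# linear sensitivity (~41 uV/degC near ambient) stands in here so the
# arithmetic is easy to follow.

SEEBECK_UV_PER_C = 41.0  # approximate type K sensitivity (illustrative only)

def temp_to_mv(temp_c: float) -> float:
    """Temperature to thermocouple millivolts (linear approximation)."""
    return temp_c * SEEBECK_UV_PER_C / 1000.0

def mv_to_temp(mv: float) -> float:
    """Thermocouple millivolts back to temperature (linear approximation)."""
    return mv * 1000.0 / SEEBECK_UV_PER_C

def compensated_reading(measured_mv: float, cold_junction_c: float) -> float:
    """The thermocouple reports only the *difference* between hot and cold
    junctions, so the thermistor's cold junction reading is converted to an
    equivalent voltage and added back before converting to temperature."""
    return mv_to_temp(measured_mv + temp_to_mv(cold_junction_c))

# A hot junction at 800 C with the logger (cold junction) at 25 C produces
# the voltage of a 775 C *difference*; CJC restores the true 800 C reading.
raw_mv = temp_to_mv(800.0) - temp_to_mv(25.0)
print(round(compensated_reading(raw_mv, 25.0), 1))  # 800.0
# Assuming a 0 C cold junction instead (no live compensation) reads low:
print(round(compensated_reading(raw_mv, 0.0), 1))   # 775.0
```

The second print shows exactly the failure mode discussed above: if the compensation temperature does not track the logger’s real internal temperature, the error at the cold junction appears directly in the product-level reading.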

Heather Falcone: That was the first time that I had read about the cold junction compensation and why it’s so critical, especially when we’re doing TUS activities.

Steve Offley: With TUS, the accuracy of both the field test instrument (the data logger) and the thermocouple is critical to the quality of the test data that you are collecting and, obviously, to complying with the very stringent requirements of the AMS and CQI-9 standards.

In our case, where we are going through the furnace, we have a worst-case scenario because the data logger is naturally going to change in temperature. But even if we take the scenario to the shop floor, where we are doing an external temperature uniformity survey with the data logger sitting outside the furnace, cold junction compensation is still critical, because within a working day the shop floor temperature is generally going to be changing.

Events on the shop floor, like opening the furnace, are going to change that temperature. I’ve been in many plants where seasonal changes can make a significant difference to the environment in which you are taking the measurement. It won’t be the first time, I’m sure, that people have taken equipment out of a car after having traveled for many miles in the early hours of the morning, only to realize that the data logger may not be at room temperature. You have to be very careful that you have a stable piece of equipment and that the cold junction is working correctly.

It’s important to read the user manual because there’s often a very critical step to make sure that you are either calibrating the equipment in a real-life environment where the temperature change may occur, or ensuring that your system has cold junction compensation. Otherwise, what you believe is a true and accurate measurement may only reflect calibration laboratory accuracy, where the temperature is controlled very, very strictly. In a real-life situation, you may not see exactly the same results.

Heather Falcone: It is really important to consider because there are specific accuracy considerations for AMS2750 and CQI-9.

Steve Offley: I often make an analogy to racing. Formula One racing cars are tuned up to perform highly in a racetrack environment. They can do 200 miles an hour, having been finely tuned, and they work well. If you take that same racing car off the racetrack and into the countryside, it is not going to work quite as effectively.

The same can be said for data logger technology. A data logger that works well in a calibration laboratory and under fairly safe conditions may or may not be working as effectively on the shop floor, particularly when you consider the variation and the challenges that that environment will bring to a measuring system like this.

Linear Interpolation Correction Factor Method (27:22)

Heather Falcone: Can you explain how you use the linear interpolation correction factor method, because that’s one of the only methods allowed by AMS2750, and why it is beneficial to your data quality?

Steve Offley: We discussed the nominal requirements for the data logger’s measurement accuracy, but for AMS2750, logger correction factors and also thermocouple correction factors can be applied to the test data that you are collecting with your monitoring system.

Firstly, for the data logger, we can create a data logger correction factor file, which basically shows the correction factors that need to be applied to each of the separate channels of the data logger for the data that you are collecting. Inside the data logger, we store the calibration information that was gleaned in the calibration laboratory. That can then generate a calibration template, which is automatically applied to each one of the channels on the data logger as part of the test routine.

The last thing we want to do is to make some error by transferring raw data manually from a spreadsheet into a piece of software. So, the nice thing about that is that it’s automatically applying the pre-programmed offsets from the calibration routine in the laboratory itself.

Secondly, with the thermocouple, we can take a calibrated thermocouple where there will be nominal readings at the two ends of the thermocouple range, and you then get the average correction factor. In some circumstances, people will apply a thermocouple correction factor of one nominal temperature below the test level that they are applying. At Phoenix, we calibrate the thermocouples across the complete temperature range of the data logger. Then, we apply what we call the linear interpolation method. What that means is that between each pair of calibration points, we can calculate, using linear interpolation, the true correction factor at any temperature over the measurement range of the device itself.

The linear interpolation schematic showing thermocouple correction factors across the full calibration range | Image credit: PhoenixTM, taken from Achieving Accurate Measurements in Real Heat Treat Production in the March print edition of Heat Treat Today.

It cannot go beyond the bounds of the upper and lower limit, as extrapolation is not allowed by the standard. But within the upper and lower bounds, we can interpolate linearly between each data point. With calibration points spaced a tight 140°F apart, we can ensure that we are applying the correct correction factor at each temperature from start to finish, not a nominal value over the whole range. In our view, that gives a far more accurate interpretation of the corrected data over the complete working range of the system as opposed to a single nominal value.
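The linear interpolation method described above can be sketched as follows. The calibration temperatures and correction factors are hypothetical example values (spaced 140°F apart, as mentioned); this is not PhoenixTM’s actual implementation, just the arithmetic the method implies.

```python
from bisect import bisect_right

# Hypothetical calibration data: (calibration temperature degF, correction
# factor degF), spaced 140 degF apart as in the method described above.
CAL_POINTS = [(300.0, 0.4), (440.0, 0.6), (580.0, 0.5), (720.0, 0.9),
              (860.0, 1.1), (1000.0, 1.0), (1140.0, 1.3), (1280.0, 1.2)]

def correction_factor(temp_f: float) -> float:
    """Linearly interpolate the correction factor between calibration points.

    Extrapolation beyond the calibrated range is not permitted by the
    standard, so out-of-range temperatures raise an error rather than guess.
    """
    temps = [t for t, _ in CAL_POINTS]
    if not temps[0] <= temp_f <= temps[-1]:
        raise ValueError(f"{temp_f} degF is outside the calibrated range")
    i = bisect_right(temps, temp_f)
    if i == len(temps):               # exactly at the upper bound
        return CAL_POINTS[-1][1]
    (t0, c0), (t1, c1) = CAL_POINTS[i - 1], CAL_POINTS[i]
    return c0 + (temp_f - t0) / (t1 - t0) * (c1 - c0)

# Correction at 930 degF, midway between the 860 and 1000 degF points:
print(round(correction_factor(930.0), 2))  # 1.05
```

Contrast this with applying a single nominal correction over the whole range: here every survey temperature gets the factor the calibration data actually supports at that temperature.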

Final Thoughts (31:09)

Heather Falcone: We have talked about a variety of topics: thru-process monitoring, thermal protection of the data logger, the benefits of applying cold junction compensation, and the specific accuracy considerations that we have to bundle in all together. What is the big takeaway you want to leave us with?

Steve Offley: Be careful you do not assume that the conditions of operation in the calibration laboratory are going to be reproduced on the shop floor, because the conditions are very different. This comes back to the argument for the importance of cold junction compensation. If you are using a data logger, check the manual for what cold junction compensation should be applied and whether there are any steps you need to take to ensure it is applied correctly on the shop floor. If you do not, there is a high risk that what you think is accurate data is not, particularly if your data logger temperature is varying with time, either in process or even on the shop floor with changing environmental conditions.

Heather Falcone: In the end, we want to make sure that we’re making good parts, and this sounds like a great system for making sure your data is as accurate as possible.

Steve Offley: Quality data at the end of the day is essential for you to understand what your process is doing. It’s no good relying on data you cannot trust. Take that extra time to investigate and put steps in place to make sure that you are measuring what you think you are measuring at the hot junction and that the cold junction is being considered as part of that measurement process.​


About the Guest

Dr. Steve Offley
Product Marketing Manager
PhoenixTM Ltd.

Dr. Steve Offley, aka “Dr. O,” is a product marketing manager with PhoenixTM Ltd. with 30 years of experience in temperature monitoring in the industrial thermal processing market.

For more information: Contact Steve Offley at steve.offley@phoenixtm.com.


Achieving Accurate Measurements in Real Heat Treat Production

Following tight standards might not mean your heat treat process is truly accurate if your instrumentation does not see the full picture. In this Technical Tuesday installment, Dr. Steve Offley, product marketing manager with PhoenixTM Ltd., discusses how combining accurate data loggers, high-quality thermocouples, and linear interpolation of correction factors ensures consistent compliance with AMS2750H and delivers trustworthy survey results. The article further explores how thermocouple behavior and real-world processing conditions necessitate careful attention to each thermocouple junction.

This informative piece was first released in Heat Treat Today’s March 2026 Annual Aerospace Heat Treating print edition.


Introduction

In the world of heat treatment, temperature measurement accuracy is critical, whether performing process monitoring or temperature uniformity surveys (TUS) as part of AMS2750. Measurement accuracy is defined as the degree to which the result of a measurement, calculation, or specification conforms to the correct value or standard. Without confidence in the accuracy of your measurement, you are working in the dark and could be deceiving yourself and possibly others.

The requirement of ±0.1°F readability, first referenced in the F revision of the AMS2750 pyrometry standard, has garnered much debate and critical discussion throughout the years. Although helpful in resolving confusion over which value to record and removing discrepancies between recording temperatures in metric versus Imperial units, this requirement does not necessarily guarantee instrumentation accuracy in a real-world heat treat operation.

The following article introduces important considerations to be made when performing temperature monitoring with reference to measurement accuracy in a real-world test environment — on a shop floor, not just a stable, controlled calibration laboratory.

Monitoring & TUS Methodology

Traditionally, a TUS is performed using a field test instrument, which in most situations will be a temperature data logger. For static batch ovens, a static data logger is positioned externally to the furnace, and long thermocouples connected directly to the TUS frame are trailed into the furnace heating chamber.

Figure 1. Typical TUS setup for a static batch furnace. Twenty-channel external data loggers connected directly to a nine-point TUS frame used to measure the temperature uniformity over the working volume of the furnace. | Image Credit: PhoenixTM
Figure 2. Thru-process TUS monitoring system. The data logger shown is located inside the thermal barrier, which travels with the TUS frame through the furnace. | Image Credit: PhoenixTM

For continuous or semi-continuous modular processes, this trailing thermocouple method is difficult if not impossible. For these furnaces, the preferred method of temperature monitoring is “thru-process”: the data logger is connected to a TUS frame, traveling with the load through the furnace. To protect the data logger from the hostile process conditions (including heat, pressure, steam, water, salt, or oil) the data logger is encased in a thermal barrier designed for the process in hand (Figure 2).

Calibration Accuracy Requirements

In either static or continuous processing conditions, the accuracy of the temperature monitoring system is dependent upon the combined accuracy of the field test instrument (data logger) and the temperature sensor thermocouples.

Both aspects of the monitoring system must be strictly controlled for AMS2750H compliance. The field test instrument data logger needs to have a calibration accuracy of ±1.0°F or ±0.1% of temperature reading, whichever is greater (Table 7 in AMS2750H), and a readability of ±0.1°F. The most common base metal thermocouples (K and N) used will themselves need to have a calibration accuracy of ±2.0°F or ±0.4% (percent of reading or correction factor °F, whichever is greater) as defined in Table 1 of the AMS2750H specification.
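The two “whichever is greater” tolerance rules quoted above each reduce to a one-line calculation. A minimal sketch (the function names are mine, not from the standard, and the figures are the Table 7 and Table 1 values as stated in the text):

```python
def logger_tolerance_f(reading_f: float) -> float:
    """Field test instrument tolerance per the text above (AMS2750H Table 7):
    +/-1.0 degF or +/-0.1% of reading, whichever is greater."""
    return max(1.0, abs(reading_f) * 0.001)

def thermocouple_tolerance_f(reading_f: float) -> float:
    """Base metal thermocouple (types K and N) tolerance per the text above
    (AMS2750H Table 1): +/-2.0 degF or +/-0.4% of reading, whichever is
    greater."""
    return max(2.0, abs(reading_f) * 0.004)

# At 1470 degF the percentage term dominates for both instrument and sensor;
# at lower readings the fixed floor applies instead.
print(round(logger_tolerance_f(1470.0), 2))        # 1.47
print(round(thermocouple_tolerance_f(1470.0), 2))  # 5.88
print(logger_tolerance_f(500.0))                   # 1.0 (floor applies)
```

Note that the crossover happens at 1000°F for the logger (where 0.1% equals 1.0°F) and at 500°F for the thermocouple (where 0.4% equals 2.0°F).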

Field Measurement Accuracy

For process monitoring, thermocouples are generally the preferred temperature sensor when considering accuracy, robust operation, cost, and availability. It is important to fully understand the working limitations of the sensor technology, and the measurement accuracy attainable on the heat treat shop floor, to ensure these limitations are compensated for.

The theory of the thermocouple traces back to the German physicist Thomas Seebeck in 1821. The “Seebeck effect” describes the generation of an electrical voltage when a temperature difference exists between two dissimilar electrical conductors (metals). The resulting millivoltage (mV) is defined by the actual temperature experienced at the measurement junction. For any given thermocouple, the measured mV can be converted to a temperature using standardized Seebeck voltage curves, commonly documented in thermocouple reference tables.

Figure 3. Basic thermocouple measurement circuit showing critical hot and cold junctions. | Image Credit: PhoenixTM

Using the Seebeck principle, the thermocouple consists of two wires of dissimilar metals joined at the measurement point, known as the hot junction. The output voltage from the sensor is proportional to the temperature difference between the hot junction and the point of voltage measurement, known as the cold junction. It is important to recognize that a thermocouple measures temperature difference, not an absolute temperature. The basic principle of how a thermocouple measurement circuit operates is shown in Figure 3.
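The reference-table conversion mentioned above can be sketched as a simple bracketing lookup. The table rows here are a coarse, rounded excerpt of approximate published type K values, included only for illustration; real instruments use the much denser ITS-90 tables or the published polynomials, and this is not any particular vendor’s implementation.

```python
# Minimal sketch of converting a measured millivoltage to temperature via a
# thermocouple reference table. Rows are (mV, degC), an approximate and
# deliberately coarse excerpt for type K.
TYPE_K_TABLE = [
    (0.000, 0.0), (4.096, 100.0), (8.138, 200.0),
    (12.209, 300.0), (16.397, 400.0), (20.644, 500.0),
]

def mv_to_temp_c(mv: float) -> float:
    """Find the bracketing table rows and interpolate linearly between them."""
    if not TYPE_K_TABLE[0][0] <= mv <= TYPE_K_TABLE[-1][0]:
        raise ValueError("millivoltage outside table range")
    for (mv0, t0), (mv1, t1) in zip(TYPE_K_TABLE, TYPE_K_TABLE[1:]):
        if mv0 <= mv <= mv1:
            return t0 + (mv - mv0) / (mv1 - mv0) * (t1 - t0)

print(round(mv_to_temp_c(4.096), 1))  # 100.0
```

Because the table maps voltage to a temperature *difference* from the reference junction, this lookup only yields the hot junction temperature directly when the cold junction sits at the table’s 0°C reference, which is exactly why the cold junction discussion in the next section matters.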

Critical Cold Junction Measurement

A common misconception is that thermocouple accuracy only needs to be accounted for at the hot junction. As previously mentioned, the thermocouple measurement is reliant on the temperature reading at the hot junction offset against the temperature of the cold junction. From an electronics level, the cold junction is where the thermocouple wires connect to the copper/copper connection on the electronic circuit. The cold junction therefore may be inside the data logger or on the outside of the data logger, if universal thermocouple connectors are used (Cu sockets).

Therefore, to get a consistent accurate reading from the hot junction, it is important to consistently and accurately monitor the cold junction temperature so the measurement can be corrected using a method known as “cold junction compensation.” It is critical that the cold junction temperature sensor is located correctly to ensure that the true cold junction temperature is measured and applied.

Essential Accuracy in Real World

While the accuracy of many data loggers may appear acceptable on paper, this may not reflect the real-world situation. Data logger temperature may not be stable, which can compromise temperature accuracy when proper cold junction compensation is not implemented. The calibration accuracy in a stable, temperature-controlled laboratory, or while performing an in-situ calibration, is one thing, but is the field test instrument able to work accurately on the production floor with significant swings in temperature over the survey period? Do you know what temperature changes the data logger may be experiencing on your process floor (e.g., climatic variation during the day, furnace heat up, loading and unloading actions)?

Remember, a change of only a few degrees in the cold junction temperature may compromise measurement accuracy enough to fail the TUS level being tested, if no compensation is undertaken or if the compensation temperature used does not accurately reflect the live cold junction temperature.

Cold Junction Compensation Logger Data

Data loggers designed with accurate cold junction compensation technology, like those created by PhoenixTM, maintain measurement accuracy in ever-changing industrial environments. This design allows the data logger accuracy to be quoted at ±0.5°F (K and N) over the full operating temperature range of the PhoenixTM data logger family. For standard data loggers used in conventional thermal barriers (phase change heat sink), the accuracy is maintained over the operating range of 32°F to 176°F. For high temperature data loggers used in phased evaporation thermal barriers (water tank protection), the accuracy is provided over the operating range of 32°F to 230°F. As designed, the data logger will operate at 212°F (boiling water), so cold junction compensation is critical with the data logger ambient temperature changing from 70°F to 212°F during normal operation.

Take, for example, an external data logger with cold junction compensation technology and an operating temperature range of 32°F to 131°F (see PTM4220 in Figure 1). On a production floor, users can operate safely, relying on the cold junction compensation to address temperature fluctuations in the processing environment.

Figure 4. Effect of changing physical data logger temperature on the thermocouple measurement, with and without cold junction compensation, while measuring a stable process temperature of 1470°F. Image Credit: PhoenixTM

Additionally, the thermocouple socket in the data logger case is connected directly to the measurement board of the data logger using thermocouple wire of the designated type (e.g., type K). A thermistor temperature sensor accurately monitors the connector temperature (±0.18°F) providing an accurate record of the cold junction. The connector is located inside the data logger cavity, protected from rapid environmental temperature changes, and is compact and isothermal. As such, the thermistor temperature accurately reflects the cold junction of each unique thermocouple connection. This temperature provides an accurate cold junction temperature compensation to maintain measurement accuracy with any internal data logger temperature variation (Figure 4).

Thermocouple Accuracy

Figure 5. Nonexpendable mineral insulated thermocouple, type K (0.06 inch) or N (0.08 inch). UHT alloy sheathed, insulated hot junction, terminating in a miniature plug. Maximum temperature 2192°F. ANSI MC96.1 Special Limits (±2.0°F or ±0.4%, whichever is greater). Image Credit: PhoenixTM

To maximize measurement accuracy, it is important that thermocouples are selected with the highest accuracy and manufactured to resist damage from thermal cycling at elevated temperatures.

For thru-process monitoring, short thermocouple lengths are required to connect the data logger within the thermal barrier and the TUS frame. As such, nonexpendable (see AMS2750H 2.2.36, Table 3) thermocouples can be employed with ease. Robust mineral insulated thermocouples (MIMS) (Figure 5), typically type K or N, can be permanently fixed to the TUS frame. This both reduces setup time and guarantees that thermocouple positions are consistent for periodic TUS work as defined (see AMS2750H 3.1.7, Table 5).

Barring physical damage, the mineral insulated thermocouples can be used unrestricted for up to three months (type K) and three months or longer (type N) if recalibration is successful at the three-month check.

Data Logger and Thermocouple Correction Factors

The PhoenixTM system allows both data logger and thermocouple correction factors to be applied automatically to the raw survey temperature data, maximizing measurement accuracy. The data logger correction factors can be read directly from the onboard digital data logger calibration file. Thermocouples are available with comprehensive calibration certificates providing correction factors at multiple set temperatures across the required measurement range.

Figure 6. Schematic of the linear interpolation method (accepted by AMS2750) of calculating thermocouple correction factors over the entire calibration range of the thermocouple. Every TUS measurement is therefore corrected accurately against matching calibration offset data. Image Credit: PhoenixTM

For both data logger and thermocouples, correction factors are interpolated across the complete calibration range using the linear method as permitted by AMS2750H 3.1.4.8 (Figure 6). This approach means that the accuracy of the entire TUS dataset is guaranteed compared to applying a single correction factor calculated at a single nominated temperature, which may not truly reflect the complete temperature range.
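The interpolation step above can be sketched in a few lines of code. The calibration points, function name, and reading are invented for illustration; they are not PhoenixTM's implementation or real certificate data.

```python
from bisect import bisect_right

# Hypothetical calibration certificate data: (set temperature F, correction F)
# pairs. Values are invented purely for illustration.
CAL_POINTS = [(300.0, 0.4), (700.0, -0.2), (1100.0, 0.6), (1500.0, 1.1)]

def correction_at(temp_f: float) -> float:
    """Linearly interpolate a correction factor between calibration points,
    the method permitted by AMS2750H 3.1.4.8. Readings outside the
    calibrated range are clamped to the nearest end point here."""
    temps = [t for t, _ in CAL_POINTS]
    corrs = [c for _, c in CAL_POINTS]
    if temp_f <= temps[0]:
        return corrs[0]
    if temp_f >= temps[-1]:
        return corrs[-1]
    i = bisect_right(temps, temp_f)           # first point above the reading
    t0, t1 = temps[i - 1], temps[i]
    c0, c1 = corrs[i - 1], corrs[i]
    frac = (temp_f - t0) / (t1 - t0)          # position between the two points
    return c0 + frac * (c1 - c0)

raw = 905.0
corrected = raw + correction_at(raw)  # apply the interpolated offset
```

Because the correction tracks the calibration curve between every pair of set points, a reading at any survey temperature picks up a matching offset, rather than the single fixed offset a one-point correction would apply everywhere.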

Summary

To guarantee the accuracy of both temperature profile and TUS data, it is important that the field test instrument data logger not only provides the desired calibration accuracy but is able to work accurately in a production environment. For thermocouple systems, accurate cold junction compensation offers critical peace of mind to correct for changes in the operating temperature characteristics of the data logger during use.

Data logger and thermocouple correction factors should be implemented to maximize measurement accuracy. As discussed, the use of linear interpolation method ensures that correction factors calculated over the entire measurement range are implemented providing full data accuracy.

About The Author:

Dr. Steve Offley
Product Marketing Manager
PhoenixTM Ltd.

Dr. Steve Offley, aka "Dr. O," is a product marketing manager with PhoenixTM Ltd. with 30 years of experience in temperature monitoring in the industrial thermal processing market.

For more information: Contact Steve Offley at steve.offley@phoenixtm.com.


Digitalization Propels Heat Treating to Industry of the Future

If you work in a standards-driven industry, you may already feel the imperative of digitalization. In today's Technical Tuesday, Mike Loepke, head of Nitrex Software & Digitalization, posits how, even if you aren't required to track compliance digitally, you are probably looking to synthesize and leverage the strengths of multiple advanced operations — furnace and process record-keeping, knowledge of past furnace operations, juggling different new equipment capabilities — across just one platform. In other words, you are looking to bring digitalization system management to your operations.

This informative piece was first released in Heat Treat Today’s December 2024 Medical & Energy Heat Treat print edition.


The Future of Heat Treatment Relies on Digitalization

The ultimate goal for heat treaters, whether commercial or captive, is to uphold the quality of their product and meet client expectations while remaining profitable. Digitalization supports these efforts as it synthesizes and presents detailed, transparent, and accessible data that allows heat treaters to better manage their equipment, processes, and product quality. In addition, the collection of detailed information can serve as a database of knowledge to be used by the next generation of heat treaters, supporting future viability and advancement in the field.

There are necessary steps to take to establish a digital solution and essential components to look for when choosing a software platform that assists heat treaters in optimizing equipment and processes, effectively creating the digitalization of the heat treat operations. Let’s explore these now.

How Digitalization Optimizes Heat Treatment Processes

Digitalization in the heat treatment industry relies on the integration of industrial internet of things (IIoT) technologies with traditional and modern heat treatment processes. Using enabling devices such as sensors, modern connectivity methods, analytics, machine learning, and IIoT software platforms, it is possible for heat treaters to collect and process data that, after analysis, drives informed decisions to optimize equipment, processes, and product quality. To put a finer point on it, digitalization occurs when a manufacturing system is digitally integrated to capture and preserve human experience and knowledge, forming a holistic virtual representation of heat treat operations.

Figure 1. QMULUS Shop Layout enables visual inspection of the current production status, the location of goods and parts, as well as the real-time status of assets and their ongoing processes.
Source: Nitrex

While digitalization varies from industry to industry and plant to plant, there are some common ways in which heat treaters can employ digital technologies to build such a system. Firstly, digitally integrated solutions can optimize process management and control. For example, when a sensor detects a temperature anomaly during a heat treatment process, the integrated software platform picks up that reading, analyzes it in real time, recognizes it as an error based on historical data or programmed parameters, and alerts the operator.

This integration also facilitates predictive, condition-based maintenance. For example, if collected data and analysis suggests that a furnace is behaving abnormally, the system can automatically generate a work order along with a list of potential failure causes, so that a technician can troubleshoot, identify, and correct small issues — such as a failing thermocouple — before they impact quality or result in equipment failure. By addressing these proactively, heat treaters can avoid extended periods of costly unplanned downtime and ensure continuous operation.
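The alert-and-work-order flow described above can be sketched as follows. The thresholds, failure causes, asset names, and data shapes are all invented for illustration; a real IIoT platform would derive them from historical data and programmed parameters.

```python
from dataclasses import dataclass, field

@dataclass
class WorkOrder:
    """Hypothetical work order record generated from an anomalous reading."""
    asset: str
    reading_f: float
    possible_causes: list = field(default_factory=list)

def check_reading(asset: str, reading_f: float, setpoint_f: float,
                  tolerance_f: float = 15.0):
    """Return a WorkOrder if the reading deviates beyond tolerance,
    else None (reading is within the programmed parameters)."""
    if abs(reading_f - setpoint_f) <= tolerance_f:
        return None
    return WorkOrder(
        asset=asset,
        reading_f=reading_f,
        possible_causes=["failing thermocouple", "drifting controller",
                         "heating element degradation"],
    )

wo = check_reading("Furnace 3", 1721.0, setpoint_f=1700.0)
print(wo)  # deviation of 21 F exceeds tolerance, so a work order is generated
```

The value of the integrated platform is that this check runs continuously against live sensor data, so the technician receives the work order with candidate causes attached before the small issue becomes a failure.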

Secondly, artificial intelligence through machine learning plays a crucial role in optimizing quality control in a digitalized system. By analyzing data collected during heat treating processes, it learns to detect patterns and identify anomalies. As in the examples above, this capability enables the system to identify deviations from the desired outcomes, allowing heat treaters to quickly rectify any issues before they impact quality.

Figure 2. The heart of the IIoT data platform needs to be thoughtfully planned and designed. Illustrated are 5 steps to follow to ensure the cloud data system properly engages with the data generated from your specific heat treat operations, ultimately delivering actionable insights. Step 1 depicts the various data sources; Step 2 shows the data transformation, integration, and processing stages; Step 3 highlights the central QMULUS database where data is indexed and organized; and Steps 4 and 5 demonstrate how data is further processed, distributed, and accessed by different end-users.
Source: Nitrex

Thirdly, algorithms can be programmed into a comprehensive management system to identify the most energy-efficient operating conditions for the heat treating process, helping heat treaters reduce their carbon footprint, minimize energy costs, and comply with sustainability goals.

In addition to these types of operational advantages, digitalization technologies can also be used to create a database of knowledge before experienced operators and experts leave the workforce. Traditionally, a handful of experts in the plant oversee the furnaces and equipment and understand how to best control and maintain them based on experience. However, passing down this knowledge to the next generation of heat treaters can take years, which may not be possible due to a company’s workflow demands and cost pressures. Digitalization addresses this challenge by creating a streamlined and accessible database of knowledge, offering less experienced operators and technicians immediate access to detailed information about what may be happening in the equipment or process for an issue at hand. This ensures that essential insights are not lost and enables quicker problem-solving and decision-making on the shop floor.

Making the Digitalization Transformation

While digitalization presents obvious advantages, the heat treatment industry, often conservative in its approach to technology, has some initial work and investment required before realizing the full benefits.

Going “paperless” in order to unlock the full potential of the available data is an important first step. All reports, histories, drawings, and other paperwork associated with equipment, processes, maintenance activities, product quality, and other relevant information should be digitized to provide a comprehensive view of both historical and current data.

Connectivity and integration between machine and higher-level systems are essential for effective data acquisition, monitoring, and remote control. SCADA systems, Manufacturing Execution Systems (MES), and other higher-level systems are rich sources of machine and process data. Gathering and analyzing this data can provide actionable insights that operators can use to make smarter decisions about the control and maintenance of equipment and processes.

Figure 3. A comprehensive overview displays all detected control loop anomalies, indicating possible root causes as well as recommended actions. Incorporating feedback from the responsible maintenance personnel further improves accuracy and delivers more effective recommendations for future occurrences.
Source: Nitrex

Finally, just having data is not enough. The data must be accessible, transparent, and relevant to be valuable. Achieving a complete picture of all the collected data, known as data consolidation, is necessary.

To build an IIoT platform with a well-architectured data engine, heat treaters should begin by identifying and understanding the different sources of data provided by sensors and high-level systems. This involves integrating the data through interfaces adapted to the data type and source, as well as documenting the integrated data sources, data fields, and data streams. Next, a “data lake” should be created to store the collected raw data. From this foundation, a data warehouse can be established to store enriched or analyzed data, derived values, data models, and forecasts in an organized way. For heat treaters, this type of contextualized data might be grouped by parts, loads, or orders.
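The contextualization step — moving raw readings from the data lake into load-grouped warehouse records — can be sketched minimally as below. The record fields, load IDs, and derived value are invented for illustration.

```python
from collections import defaultdict

# Hypothetical raw readings as they might sit in the data lake.
raw_lake = [
    {"load": "L-101", "sensor": "TC1", "temp_f": 1698},
    {"load": "L-101", "sensor": "TC2", "temp_f": 1702},
    {"load": "L-102", "sensor": "TC1", "temp_f": 1550},
]

# Group readings by load ID: the warehouse's contextualized view.
by_load = defaultdict(list)
for rec in raw_lake:
    by_load[rec["load"]].append(rec)

# A derived value stored alongside the grouped data, e.g. mean temperature
# per load, ready to be queried per part, load, or order.
summary = {load: sum(r["temp_f"] for r in recs) / len(recs)
           for load, recs in by_load.items()}
print(summary)  # {'L-101': 1700.0, 'L-102': 1550.0}
```

Real platforms do this at scale with streaming pipelines rather than in-memory dictionaries, but the principle is the same: raw data stays in the lake, while enriched, grouped, and derived values live in the warehouse.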

Once the data engine is in place, the information stored in the data warehouse must be presented in a way that makes sense to operators and technicians for them to make informed decisions for heat treatment processes. To facilitate this, a universal data interface should be considered.

Building from this well-architectured data engine, the IIoT platform can then be expanded with statistical analytics, remote monitoring, KPI tracking, machine learning, artificial intelligence, and other applications to optimize processes and increase profitability.

What Heat Treaters Need in a Digitalization Solution

Harnessing modern technologies to make digitalization a reality presents heat treaters with the opportunity to implement a solution based on a complete and well-documented data system. It also means a holistic approach to data analysis, interpretation, reporting, and action that supports the real-world work of heat treaters on the plant floor and in the office.

For this reason, a digitalization solution with both cloud and on-premises capabilities allows real-time access to analysis and alert messages for operators on the floor as well as managers away from the plant, ensuring quick problem-solving and maximum uptime in the event of process or machine issues.

Additionally, heat treaters should look for a solution that offers the freedom to integrate all the various platforms and equipment from which data are gathered. These may include relevant machinery and production data from the shop floor as well as third-party and custom controllers. This flexibility to synthesize information from multiple sources will ensure the digitalization efforts lead to a comprehensive solution with actionable process overviews, recipe control, batch tracking, and other customization options.

To further this intent of a holistic solution, heat treaters should consider various data capabilities with different portal views, such as a manufacturer portal, a plant portal, and a client portal. Moreover, given the long-term value of a comprehensive software solution, it may be worthwhile to consider how each user could feed direct feedback and new rules into the system, creating a repository of knowledge that bridges outgoing generations of heat treaters to future ones.

Finally, any platform that directs the digitalization of a plant must prioritize robust security measures. Several features to look for are:

  • enhanced encryption standards to keep data confidential and tamper-proof during transmission and storage;
  • secure protocols based on industry best practices to safeguard data integrity;
  • a granular access control system (ACS) to allow IT administrators to define and manage user permissions of authorized personnel, thereby minimizing the risk of data breaches and unauthorized data manipulation; and
  • intrusion detection and prevention systems to continuously monitor network and system activities, enabling instant identification and mitigation of suspicious behavior. This serves as an additional layer of defense against potential cyber threats.

Beyond the software setup, be sure to use best practices by conducting regular security audits to assess the platform’s vulnerabilities and ensure compliance with evolving cybersecurity standards. While digitalization of heat treat operations may seem like a task for the next generation to complete, secure software options that integrate the hard work of digitizing plant activities can make this endeavor just a step away.

About the Author:

Mike Loepke
Head of Nitrex Software & Digitalization
Nitrex

Drawing from a background in Mathematics and Physics, coupled with extensive R&D experience and metallurgical modeling, Mike Loepke specializes in AI and process prediction. He has led Nitrex’s initiative in developing QMULUS, a pioneering IIoT cloud-based platform. Mike’s relentless pursuit of knowledge keeps him at the forefront of evolving technology.


For more information: Contact Mike at mike.loepke@nitrex.com




Ask the Heat Treat Doctor®: What Does “Bright and Shiny” Really Mean?

The Heat Treat Doctor® has returned to offer sage advice to Heat Treat Today readers and to answer your questions about heat treating, brazing, sintering, and other types of thermal treatments as well as questions on metallurgy, equipment, and process-related issues.



Clients often want to know or specify that their component part surfaces are “bright” or “shiny” or “clean.” Other times they desire to have a surface condition that is “scale free” or “oxide free” after heat treatment. But how, if at all, can we quantify what these terms mean? Let’s learn more.

“Shiny” and “bright” are words that are highly subjective. This is often a source of confusion not only for the heat treater, but the manufacturer and, in some cases, even the end user of the products. Heretofore, the answer depended on one human being’s interpretation as opposed to another, and evaluations depend not only on the type of material but also the mill practices used, manufacturing methods employed, heat treatment processes, and the level and type of contamination introduced before and after processing.

Traditional Approach

Figure 1. Temper color chart for tempering in air or an "inert" gas atmosphere such as nitrogen. Source: Abbott Furnace Company

Traditionally, we have relied on color charts (Figure 1) to tell the approximate temperature at which discoloration took place, that is, an oxide formed on the (steel, stainless steel, or tool steel) surface of a component part. This method is still in use today when cooling parts in a furnace.

As mentioned, the perception and interpretation of color is different for different people. Lighting (natural light or plant illumination), the environment in which one views color, eye fatigue, the age of the observer, and a host of other factors influence color perception. But even without such physical considerations, each of us interprets color based on personal perception. Each person also verbally describes an object's color differently. As a result, objectively communicating a particular color to another person without using some type of standard is difficult.

There also must be a way to compare one color to the next with accuracy.

New Approach

Today, portable spectrophotometers (Figure 2) are available to measure color and help quantify brightness measurements. These types of devices are designed to meet various industry standards including:

  • Whiteness (e.g., ASTM E313, CIE)
  • Gray scale (e.g., ISO 105 staining, color change)
  • Opacity (e.g., contrast ratio, Tappi strength — SWL, Summed, Weighted Sum)
  • Yellowness (e.g., ASTM E313, D1925)

Figure 2. X-Rite MA-5 QC multi-angle spectrophotometer. Source: X-Rite

In simplest terms, a spectrophotometer is a color measurement device used to capture and evaluate color. Every object has its own reflectance, or the amount of light it reflects, and transmittance, or the amount of light that passes through it. A reflectance spectrophotometer shines a beam of light and measures the amount of light reflected at different wavelengths of the visible spectrum, while a transmission spectrophotometer measures how much light passes through the sample. Spectrophotometers can measure and provide quantitative analysis for just about anything, including solids, liquids, plastics, paper, metal, fabric, and even painted samples to verify color and consistency.

Spectrophotometers provide the solution to the subjective problem of interpreting the color of the surface of a component part that has been heat treated, brazed, or sintered because they explicitly identify the colors being measured; that is, the instrument differentiates one color from another and assigns each a numeric value.

As an example, the brightness of steel tubes annealed in a rich exothermic gas atmosphere was measured against tubes that had not been processed (Figure 3). Having this quantitative measurement of the surface changes allowed the heat treater to provide their client with a definitive statement on the change after processing.

CIE Color Systems

The Commission Internationale de l’Eclairage (CIE) is an organization responsible for international recommendations for photometry and colorimetry. The CIE standardized color order systems include specifying the light source (illumination), the observer, and the methodology used to derive values for describing color, regardless of industry or use case.
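One standard way to turn two measured colors into a single comparable number is the CIE76 color difference, ΔE*ab, the Euclidean distance between two points in L*a*b* space. The sketch below uses invented sample readings; real instruments and newer formulas (e.g., ΔE2000) refine this, but the principle is the same.

```python
import math

def delta_e76(lab1, lab2):
    """CIE76 color difference: Euclidean distance in L*a*b* space.
    A delta E of roughly 1-2 or less is generally imperceptible to the eye."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Hypothetical L*a*b* readings: an unprocessed tube vs. an annealed tube.
unprocessed = (75.0, 1.2, 8.5)
annealed = (71.0, 2.2, 11.5)
print(round(delta_e76(unprocessed, annealed), 2))
```

This is how a spectrophotometer lets a heat treater replace "it looks a bit darker" with a number that can be toleranced in a specification.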

Though spectrophotometers are the most common, for some applications colorimeters can also be used, but these are in general less accurate and less suitable for a heat treat environment.

There are three primary types of spectrophotometers on the market today used for print, packaging, and industrial applications: traditional 0°/45° (or 45°/0°) spectrophotometers, primarily used for the print industry; sphere (or diffuse/8°) spectrophotometers, primarily used in the packaging industry; and multi-angle (MA) spectrophotometers, for use in industrial environments. These instruments capture color information, and in some cases can capture appearance data (e.g., gloss).

Multi-angle (MA) spectrophotometers are best suited for measurements involving special surface effects, such as those found on metal surfaces and coatings, including surfaces with contaminants, and can even quantify cosmetic appearance. These are typically used on the shop floor, in the lab, and in quality control, and can even be found in shipping areas.

MA spectrophotometers require users to verify five or more sets of L*a*b* (or delta E) values. They typically have an aperture size of 12 mm, which is too large for measuring the fine detail found in many small-scale industrial applications. Primary illumination is provided at a 45° angle; some models add secondary illumination at a 15° angle.

Figure 3. Example of a product test — color and oxidation level test. Source: X-Rite

An application example for an MA spectrophotometer is the collection of reliable colorimetric data on special effect coatings in the automotive industry, where conventional instruments struggle to capture color consistently.

Final Thoughts

In this writer's opinion, a spectrophotometer should be in every heat treat shop! You will be doing both yourself and your customers a valuable service and take the guesswork out of one of the most commonly asked questions: Is it bright?

References

  • Herring, Dan H. Atmosphere Heat Treatment Volume 1. BNP Media, 2014.
  • X-Rite Pantone. “A Guide to Understanding Color.” Accessed October 10, 2024. https://www.xrite.com/learning-color-education/whitepapers/a-guide-to-understanding-color.
  • X-Rite Pantone. “Tolerancing Part 3: Color Space vs. Color Tolerance.” Accessed October 10, 2024. https://www.xrite.com/blog/tolerancingpart-3.
  • X-Rite Pantone. “X-Rite Portable Multi Angle Spectrophotometers.” Accessed October 10, 2024. https://www.xrite.com/categories/portable-pectrophotometers/ma-family.

About the Author

Dan Herring
“The Heat Treat Doctor”
The HERRING GROUP, Inc.

Dan Herring has been in the industry for over 50 years and has gained vast experience in fields that include materials science, engineering, metallurgy, new product research, and many other areas. He is the author of six books and over 700 technical articles.

For more information: Contact Dan at dherring@heat-treat-doctor.com.

For more information about Dan’s books: see his page at the Heat Treat Store.





A New Era: Tracking Quality Digitally

What are advanced management systems and how does deep integrative system management software help automotive heat treaters improve processes while saving on time and unnecessary expenses? Explore the future of software technology for the management of heat treating operations in this Technical Tuesday by Sefi Grossman, founder and CEO of CombustionOS.


The heat treating industry is on the brink of a technological transformation. Just as the momentous adoption of websites and email transformed the nature of work for manufacturers, advanced software systems are thrusting us into a new era of simplicity, automation, and deep integration.

This article explores how advanced systems — an application of ERP (enterprise resource planning) and MES (manufacturing execution systems) combined with the power of AI — are revolutionizing facility operations, enhancing quality, efficiency, and profitability.

What Are Advanced Systems?

Advanced systems simplify, streamline, and automate operations by lifting the data burden off of plant personnel. While most existing systems focus on the part inventory workflow, more advanced systems go beyond by directly integrating into the heat treat process to track at bin/tray/tree level.

This requires real-time scheduling control, barcode scanning, digitized recipes and processes (no more paper), and direct sensor/PLC integration. Because of its critical nature, an advanced system is most likely an on-premises and cloud "hybrid solution" that is not crippled by internet connectivity issues. This still allows it to utilize rapidly evolving cloud systems that provide external services like messaging, big data storage, and AI, to name a few.

Precise Processing

Figure 1. CombustionOS developers spend extensive time with operators and plant managers to create interfaces that are intuitive and easy to use. Pictured is access to job data stats from a mobile device being used outside of the manufacturing plant.

Repeatable, accurate methods to ensure optimal time, temperature, and atmosphere of the decided heat treatment processes are possible with advanced systems.

Utilizing existing sensors and hardware interfaces, data is collected at short intervals, transformed into meaningful formats, and stored in a database. Network technologies such as HTTP, Modbus, and other analog-to-digital interface technologies make this possible with minimal additional hardware. The data is managed locally on the facility network and synchronized with cloud services for further processing, analysis, and long-term history storage.

With a close monitoring of all these variables, facilities can tighten acceptable specification ranges. Deep integration with equipment ensures that data flows seamlessly from sensors and devices to the central system.

This real-time data collection and processing enables facilities to monitor operations continuously and make informed decisions quickly. For example, integrating data from temperature sensors, pressure gauges, and other monitoring devices ensures that all critical parameters are tracked and managed effectively. Additionally, if a temperature reading deviates from the acceptable range, the system can immediately alert the relevant personnel, allowing them to take corrective action before it becomes a critical issue.
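The continuous-monitoring idea above can be sketched with a rolling window: keep the last several readings and alert when a new one drifts from the recent mean. The window size, threshold, and class name are illustrative assumptions, not recommended production values.

```python
from collections import deque

class RollingMonitor:
    """Sketch of continuous sensor monitoring: flag readings that drift
    from the mean of a short rolling window of prior readings."""

    def __init__(self, window: int = 10, max_drift_f: float = 10.0):
        self.readings = deque(maxlen=window)
        self.max_drift_f = max_drift_f

    def add(self, reading_f: float) -> bool:
        """Record a reading; return True if it should trigger an alert."""
        alert = False
        if self.readings:
            mean = sum(self.readings) / len(self.readings)
            alert = abs(reading_f - mean) > self.max_drift_f
        self.readings.append(reading_f)
        return alert

mon = RollingMonitor()
stream = [1698, 1700, 1701, 1699, 1700, 1725]  # last value is anomalous
alerts = [mon.add(r) for r in stream]
print(alerts)  # only the final reading trips the alert
```

A drift check against recent history, rather than a fixed setpoint alone, is one simple way a platform can recognize an anomaly "based on historical data" as described above.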

In addition to quality assurance, integrated artificial intelligence tools optimize job scheduling. Unlike traditional date/time calendar methods, AI systems predict job completion times based on real-time process data. This is particularly useful for roller furnace setups, where continuous processing occurs, but it is also beneficial for batch furnaces. Optimized scheduling improves resource allocation and operational efficiency, ensuring that jobs are completed on time and to the required specifications. The difference between a “calculation algorithm” and AI is that, with AI, you do not have to pre-program it. It automatically learns and adjusts for known variability in your hardware and even the personnel operating the equipment.

Finally, the automation of these systems captures and records all necessary information accurately. This reduces the risk of non-compliance, improving the overall quality of the final product. For example, a Detroit-based heat treating facility reported that accessing real time data to ensure compliance with industry standards has allowed them to spend 40% less time on documentation tasks.

Figure 2. Having increased control over the process gives more peace of mind to operators that components perform as needed.

Alleviating Burden on Maintenance and Inventory

Predictive maintenance is one of the most significant applications of AI in the heat treating industry. Traditional maintenance schedules are often based on fixed intervals, which can lead to unnecessary downtime or unexpected failures. AI-driven predictive maintenance, on the other hand, uses real-time data to determine the optimal times for maintenance activities. This approach not only reduces downtime but also extends the lifespan of equipment.

A Detroit-based heat treating facility implemented an AI-driven predictive maintenance system and saw a 25% reduction in equipment downtime. By analyzing data from critical parts, inventory, process tracking history, and various sensors, the AI system could predict when components were likely to fail, allowing the maintenance team to inspect and address issues proactively beyond their standard preventive maintenance (PM) schedule. This not only improved operational efficiency but also saved significant costs associated with emergency repairs and unplanned downtime.

Additionally, the integration of QR codes for inventory and process tracking enables quick and accurate data entry compared to manual logging. For instance, when racking parts out of bins, operators can simply scan QR codes, which automatically update the system with the relevant information. This not only speeds up the process but also minimizes the chances of human error.

Reducing Operational Costs

The adoption of advanced ERP and MES systems has led to substantial cost savings for many facilities. These systems reduce operational costs through the implicit automated integrations that technologies like CombustionOS bring. Here are just a few ways that operational costs have been cut:

  • Decreasing shipping and receiving management from three to just one employee
  • Minimizing rework costs by timely process alerts
  • Reducing personnel by replacing constant manual oversight with accurate, digital tracking systems
  • Lowering administrative costs by utilizing a more efficient and accurate invoice automation platform

Case Study: A client reported comprehensive cost savings, including a 20% reduction in shipping and receiving time, fewer logistics and furnace operators needed, a 33% decrease in rework costs, a 15% savings in maintenance costs, and a 25% reduction in accounting overhead. These efficiencies translate into substantial payroll savings and improved profitability.

How To Implement

Figure 3. When racking parts out of bins, operators can simply scan QR codes, which automatically update the system with the relevant information.

One of the most significant advancements in heat treating technology is the deep integration with various equipment types. Unlike traditional ERP systems, which often lack true integration, advanced systems work backwards from equipment data, building ERP functionalities around this integration to ensure seamless and accurate data flow.

First, there are advanced systems that can handle data from both digital and analog sensors. For heat treaters juggling a variety of sensors and systems, choosing an adaptable, integrative system ensures compatibility with existing equipment while keeping an eye on cost. Facilities can continue using their current equipment while benefiting from advanced monitoring and control capabilities.

Second, advanced ERP/MES systems can accommodate collaboration with multiple vendors. Rather than uprooting current systems and relationships, work with an advanced systems provider who is able to collaborate with other software and systems. Advanced ERP/MES systems provide comprehensive solutions that include deep equipment integration and full ERP functionalities. This approach reduces the complexity and cost of integration, ensuring that all components work together seamlessly.

Key Applications

Most operations in a heat treat department will benefit from advanced systems due to the time-saving automations the system integrates. But many heat treaters are looking to adapt and integrate older systems, often with more complex designs, like roller hearth furnaces. Here are some steps experts will take to guide you through a smooth and effective digital integration:

  1. First, it is important to understand you don’t need to boil the ocean. Starting with a more advanced inventory tracking system that employs barcodes can set the underpinnings for a more integrated system while providing immediate benefits to your logistics.
  2. Next, get a deep understanding of your current process and map out your operational workflow. Using a flowchart program helps visualize the process and makes sure all stakeholders are on the same page.
  3. Some aspects of your current process are probably outdated (perhaps created by someone who is no longer at the company), while others are key to the core of how you operate. Understanding the difference is crucial to unlock potential automation without disturbing your core process and flow.
  4. You’ll then need to prepare every required form, document, chart, etc. that you use in the operation. For process control, recipes, and lab testing, provide many parts/iterations to capture the complexity.
  5. Finally, take inventory of any existing digital systems you have adopted, like inventory tracking, spreadsheets, or custom software. The existing system network, including servers, Wi-Fi setup, and hardware (PCs, printers, scanners, etc.), should be utilized as much as possible in the transition to reduce the need to purchase and set up different equipment.

Conclusion

The future will require constant innovations and thoughtful leveraging of increasingly advanced systems. Unlike static, homegrown, or “pieced together” solutions, the most advanced systems are constantly updated with new features, ensuring they remain at the cutting edge of technology. Engaging directly with plant personnel to understand their needs and challenges allows systems like CombustionOS to evolve and improve continuously.

The heat treating industry is on the cusp of a technological transformation, driven by advancements in ERP, MES, and AI. These technologies offer the potential to enhance quality, efficiency, and profitability, making them essential for the future of manufacturing. By embracing automation, integrating advanced AI capabilities, and committing to continuous innovation, the industry can achieve new levels of operational excellence.

About the Author:

Sefi Grossman
Founder & CEO
CombustionOS
Source: Author

Sefi Grossman has been at the forefront of technology revolutions for the past two decades and has been leading the technology company CombustionOS for nearly seven years.


For more information: Contact Sefi at sefi@combustionos.com.


Find Heat Treating Products And Services When You Search On Heat Treat Buyers Guide.Com

A New Era: Tracking Quality Digitally Read More »

Overcoming Quality Challenges for Automotive T6 Heat Treating

Three elements in the T6 aluminum heat treatment process — high temperature solution heat treatment, drastic temperature change in the water quench, and a long age hardening process — challenge accurate temperature monitoring. Thru-process technology gives in-house heat treaters the power to control these variables to overcome the unknowns. In the following Technical Tuesday article, Dr. Steve Offley, “Dr. O”, product marketing manager at PhoenixTM, examines the path forward through the challenges of aluminum heat treating.


Aluminum Processing Growth

In today’s automotive and general manufacturing markets, aluminum is increasingly becoming the material of choice, being lighter, safer, and more sustainable. Manufacturers looking to replace existing materials with aluminum need new methodology to prove that thermal processing of aluminum parts and products is done to specification, efficiently and economically.

To add strength to pure aluminum, alloys are developed by the addition of elements dissolved into solid solution through the T6 heat treatment process (Figure 1). The alloy atoms create obstacles to dislocation movement through the aluminum matrix, giving the material more structural integrity and strength.

Figure 1. Critical temperature phase transitions of the T6 aluminum heat treatment process
Source: PhoenixTM

Process temperature control and uniformity are critical to the success of T6 heat treatment, maximizing the solubility of hardening solutes such as copper, magnesium, silicon, and zinc without exceeding the eutectic melting temperature. With a typical margin of only 9–15°F, knowing the accurate temperature of the product is essential. Control of the later quench process (Figure 1, Phase 3) is also critical, not only to facilitate the alloy element precipitation phase but also to prevent unwanted part distortion/warping and the risk of quench cracking.

T6 Process Monitoring Challenges

The T6 solution reheat process comes with many technical challenges where temperature profiling is concerned. The need to monitor all three of the equally important phases — solution treatment, quench, and the age hardening process — makes the trailing thermocouple methodology impossible.

Figure 2. Thru-process temperature monitoring of the three T6 heat treatment phases
Source: PhoenixTM

Even when applying thru-process temperature profiling technology, sending the data logger through the process protected in a thermal barrier (Figure 2), the T6 heat treat process presents significant challenges. A system will not only need to protect against heat (up to 1022°F) over a long process duration but also withstand the rigors of being plunged into a water quench. Rapid temperature transitions create an elevated risk of distortion and warping, which needs to be addressed to give a reliable and robust monitoring solution.

Certain monitoring systems can provide protection to the data logger at 1022°F for up to 20 hours (Figure 3).

Figure 3. Thru-process temperature profiling system installed in the product cage monitoring the T6 heat treatment (solution treatment, quench, and age hardening) of aluminum engine blocks

Thermal Protection Technology

To meet the challenges of the T6 heat treat process, the conventional thermal barrier design employing microporous insulation is replaced with a water tank design, with thermal protection using an evaporative phase change temperature control principle. Evaporative technology uses boiling water to keep the high temperature data logger (maximum operating temperature of 230°F) at a stable operating temperature of 212°F as the water changes phase from liquid to steam. The advantage of evaporative technology is that a physically smaller barrier is often possible. It is estimated that at a like-for-like size (volume) and weight, an evaporative barrier will provide in the region of twice the thermal protection of a standard thermal barrier with microporous insulation and heat sink. The level of thermal protection can be adjusted by changing the capacity of the water tank and the volume of water. Increasing the volume of water increases the duration for which the T6 thermal barrier will maintain the data logger temperature of 212°F before it is depleted by evaporation losses.

The TS06 thermal barrier design (Figure 4) incorporates a further level of protection with an outer layer of insulation blanket contained within a structural outer metal cage. The key role of this material is to act as an insulative layer around the water tank to reduce the risk of structural distortion from rapid temperature changes both positive and negative in the T6 process.

Figure 4. TS06 thermal barrier design showing water tank, housing the data logger at its core, installed within structural frame containing the insulation blanket surface layer; water tank shown with traditional compression fitting face plate seal
Source: PhoenixTM

Obviously, the evaporative loss rate of water is governed by the water tank geometry. A cube-shaped tank will provide the best performance, but this may need to be adapted to meet process height restrictions. A TS06 thermal barrier with dimensions 8.5 x 18.6 x 25.2 inches (H x W x L), offering a water capacity of 3.5 US gallons, provides 11 hours of protection at 1022°F. A larger TS06 with approximately twice the capacity (12.2 x 18.6 x 25.2 inches, H x W x L; 7.7 US gallons) gives approximately twice the protection (20 hours at 1022°F).
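These protection figures can be sanity-checked with a back-of-envelope energy balance. Assuming the barrier works purely by boiling off its water (latent heat of vaporization ~2257 kJ/kg and ~3.785 kg per US gallon are textbook values, not PhoenixTM figures), the implied average heat ingress for the two tank sizes comes out broadly similar, which is what roughly linear scaling of protection with water volume would predict:

```python
# Textbook physical constants (assumptions, not from the article).
LATENT_KJ_PER_KG = 2257.0   # latent heat of vaporization of water
KG_PER_GALLON = 3.785       # mass of one US gallon of water

def ingress_rate_kw(gallons, hours):
    # Average heat ingress (kW) that would boil off the tank in 'hours'.
    energy_kj = gallons * KG_PER_GALLON * LATENT_KJ_PER_KG
    return energy_kj / (hours * 3600)

small = ingress_rate_kw(3.5, 11)   # smaller TS06: 3.5 gal, 11 h at 1022°F
large = ingress_rate_kw(7.7, 20)   # larger TS06: 7.7 gal, 20 h at 1022°F
```

Both work out to under 1 kW of average ingress, with the larger barrier slightly higher, consistent with its bigger surface area.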

Innovative IP67 Sealing Design

Passing through the water quench, the data logger needs to be protected from water damage. This is achieved in the system design by combining a fully IP67 sealed data logger case with a water tank front face plate through which the thermocouples exit. Traditionally in heat treatment applications, mineral insulated (MI) thermocouples are sealed using robust metal compression fittings. Although reliable, the compression seals are difficult to use, requiring long set-up times. The whole uncoiled straight cable length must be passed through the tight fitting which, for the 10 x 13 ft thermocouples, takes some patience. Thermocouples can otherwise be used and installed for multiple runs, if undamaged. Unfortunately, as the ferrule in the compression fitting bites into the MI cable, removal of the cable requires the thermocouple to be cut, preventing reuse.

To overcome the frustrations of compression fitting, an alternative innovative thermocouple sealing mechanism has been designed for use on the T6 thermal barrier (Figure 5).

Figure 5. TS06 thermal barrier IP67 bi-directional rubber gasket seal; installation of mineral-insulated (MI) thermocouples and RF antenna aerial

Thermocouples can be slotted easily and quickly, tool free, into a precision cut rubber gasket without any need to uncoil the thermocouple completely. The rubber gasket has a unique bi-directional seal, allowing not only sealing of each thermocouple but also sealing of the clamp face plate to the data logger tray, which is then secured to the water tank with a further silicone gasket seal. The new seal design allows thermocouples to be uninstalled and reused, reducing operating costs significantly.

Accurate Process Data Considerations

T6 applications come with a series of monitoring challenges which need to be considered carefully to guarantee the quality of the data obtained. Although the complete process time of the three phases can reach up to 10 hours, it is necessary to use a rapid sample interval (seconds) to provide sufficient resolution. The data logger is designed to facilitate this with a minimum sample interval of 0.2 seconds over 20 channels and a memory size of 3.8 million data points, allowing complete monitoring of the entire process. A sample interval of 0.2 seconds provides sufficient data points on the rapid quench cooling curve. The high resolution allows full analysis and optimization of the quench rate to achieve the required metallurgical transitions yet avoid distortion or quench cracking risks.
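The memory and sample-interval figures can be checked with simple arithmetic; this sketch shows the maximum logging duration they imply:

```python
def max_logging_hours(memory_points, channels, interval_s):
    # How long the logger can record before its memory is full.
    samples_per_channel = memory_points / channels
    return samples_per_channel * interval_s / 3600

# Figures from the article: 3.8 million data points, 20 channels.
fastest = max_logging_hours(3_800_000, 20, 0.2)  # at the 0.2 s minimum interval
slower = max_logging_hours(3_800_000, 20, 1.0)   # at a relaxed 1 s interval
```

At the fastest 0.2 s interval across all 20 channels, the memory covers roughly 10.5 hours, which matches the stated 10-hour process duration with little margin; a slower interval extends coverage proportionally.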

Employing the phased evaporation thermal barrier design, the high temperature data logger with maximum operating temperature of 230°F will operate safely at 212°F. During the profile run, the data logger internal temperature will increase from ambient temperature to 212°F. To allow the thermocouple to accurately record temperature, the data logger offers a sophisticated cold junction compensation method, correcting the thermocouple read out (hot junction) for anticipated internal data logger temperature changes.
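The principle of cold junction compensation can be illustrated with a deliberately simplified sketch. It assumes a linearized type K response (~41 uV/°C); a real logger applies the full thermocouple reference tables rather than a single coefficient. The point is only that the same measured EMF maps to different hot-junction temperatures as the logger's internal (cold junction) temperature changes:

```python
# Approximate type K Seebeck coefficient (an assumption for illustration).
SEEBECK_UV_PER_C = 41.0

def hot_junction_c(measured_uv, cold_junction_c):
    # The thermocouple EMF reflects the hot/cold junction *difference*,
    # so the logger adds back the EMF equivalent of its own temperature
    # before converting to a hot-junction reading.
    total_uv = measured_uv + cold_junction_c * SEEBECK_UV_PER_C
    return total_uv / SEEBECK_UV_PER_C

# Same measured EMF, two internal logger temperatures: the readings differ
# unless the logger tracks its internal temperature as it warms toward 212°F.
t_ambient = hot_junction_c(20500.0, 25.0)   # logger at 25°C ambient
t_heated = hot_junction_c(20500.0, 100.0)   # logger near the boiling point
```

The 75°C gap between the two results is exactly the error an uncompensated logger would accumulate as its internals heat up during the run.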

Data logger and thermocouple calibration data covering the complete measurement range (not just a single designated temperature) can be used to create detailed correction factor files. Correction factors are calculated by interpolation between two known calibration points using the linear method as approved by CQI-9 and AMS2750G. This method ensures that all profile data is corrected to the highest possible accuracy. 
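The linear interpolation of correction factors can be sketched as follows. The calibration points below are hypothetical certificate values, used only to show the mechanics:

```python
def correction(reading, cal_points):
    # cal_points: sorted (nominal_temp, measured_error) pairs from the
    # calibration certificate. Interpolate linearly between the two
    # bracketing points, the method the article attributes to
    # CQI-9 / AMS2750G.
    for (t0, e0), (t1, e1) in zip(cal_points, cal_points[1:]):
        if t0 <= reading <= t1:
            frac = (reading - t0) / (t1 - t0)
            return e0 + frac * (e1 - e0)
    raise ValueError("reading outside calibrated range")

# Hypothetical errors (°F) at three calibration temperatures.
cal = [(800.0, 1.2), (1000.0, 1.8), (1200.0, 2.6)]
corrected = 1100.0 - correction(1100.0, cal)
```

Because the whole measurement range is covered, every logged point gets a correction appropriate to its temperature rather than a single fixed offset.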

Addressing Real-Time, Thru-Process Temperature Monitoring Challenges

For a process time as long as the T6, real-time monitoring capability is a significant benefit. The unique two-way RF telemetry system used on the PhoenixTM system helps address the technical challenges of the three separate stages of the process. The RF signal can be transmitted from the data logger through a series of routers linked back to the main coordinator connected to the monitoring PC. The wirelessly connected routers are located at convenient points in the process (solution treatment furnace, quench tank, aging furnace) to capture all live data without any inconvenience of routing communication cables.

A major challenge in the T6 process, from an RF telemetry perspective, is the quench step. An RF signal cannot escape from the water in the quench tank. To overcome this limitation, a “catch up” feature is implemented: once the system exits the quench and the RF signal is re-established, any previously missing data is retransmitted, guaranteeing full process coverage.

Process Quality Assurance and Validation

In the automotive industry, many operations will be working to the CQI-9 special process heat treat system assessment accreditation. As defined by the pyrometry standard, operators need to validate the accuracy and uniformity of the furnace work zone by employing a temperature uniformity survey (TUS).

The thru-process monitoring principle allows for an efficient method by which the TUS can be performed, employing a TUS frame to position a defined number of thermocouples over the specific working zone of the furnace (product basket). As defined in the standard, with particular reference to application assessment Process Table C (aluminum heat treating), the uniformity of both the solution heat treatment and aging furnaces needs to be proven to satisfy ±10°F of the threshold temperature during the soak time.
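The pass/fail logic of that ±10°F soak requirement can be sketched in a few lines. The readings below are hypothetical, and a real TUS also covers ramp behavior, overshoot, and detailed reporting:

```python
def tus_passes(tc_readings_f, setpoint_f, tolerance_f=10.0):
    # Every survey thermocouple must stay within +/- tolerance of the
    # setpoint during the soak (the +/-10°F figure cited above).
    deviations = [r - setpoint_f for r in tc_readings_f]
    return all(abs(d) <= tolerance_f for d in deviations), deviations

# Hypothetical soak readings from a 9-thermocouple survey frame at 1000°F.
ok, devs = tus_passes([995, 1003, 1008, 992, 1001, 999, 1006, 994, 1002], 1000)
```

In this made-up survey the worst deviations are +8°F and -8°F, so the soak passes with 2°F of margin.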

Complementing the TUS system, the Thermal View Survey software provides a means by which the full survey can be set up automatically allowing routine full analysis and reporting to the CQI-9 specification as shown in Figure 6.

Figure 6. View of TUS for T6 aluminum processing in Phase 1 Solution Re-heat
Source: PhoenixTM

Interestingly, a significant further benefit of the thru-process principle is that by collecting process data for the whole process, many of the additional requirements of the process Table C can be achieved with reference to the quench. From the profile trace, key criteria such as quench media temperature, quench delay time, and quench cooling curve can be measured and reported with full traceability during the production run.

Summary

To fully understand, control, and optimize the T6 heat treat process, it is essential the entire process is monitored. Thru-process monitoring solutions, designed specifically, allow not only product temperature profiling of all the solution heat treatment, water quench, and age hardening phases, but also comprehensive temperature uniformity surveying to comply with CQI-9.

About the Author:

Dr. Steve Offley (“Dr. O”), Product Marketing Manager, PhoenixTM

Dr. Steve Offley, “Dr. O,” has been the product marketing manager at PhoenixTM for the last five years after a career of over 25 years in temperature monitoring focusing on the heat treatment, paint, and general manufacturing industries. A key aspect of his role is the product management of the innovative PhoenixTM range of thru-process temperature and optical profiling and TUS monitoring system solutions.

For more information: Contact Steve at Steve.Offley@phoenixtm.com.





Traveling through Heat Treat: Best Practices for Aero and Auto

Thinking about travel plans for the upcoming holiday season? You may know what means of transportation you will be using, but perhaps you haven't considered the heat treating processes which have gone into creating that transportation. 

Today’s Technical Tuesday original content round-up features several articles from Heat Treat Today on the processes, requirements, and tools to keep planes in the air and vehicles on the road, and to get you from one place to the next. 


Standards for Aerospace Heat Treating Furnaces 

Without standards for how furnaces should operate in the aerospace industry, there could be no guarantee of quality aerospace components. And without quality aerospace components, there is no guarantee that the plane you're in will be able to get you off the ground, stay in the air, and then land you safely at your destination.

In this article, Douglas Shuler, the owner and lead auditor at Pyro Consulting LLC, explores AMS2750, the specification that covers pyrometric requirements for equipment used for the thermal processing of metallic materials, as well as AMEC (the Aerospace Metals Engineering Committee).

This article reviews the furnace classes and instrument accuracy requirements behind the furnaces, as well as information necessary for the aerospace heat treater.

See the full article here: Furnace Classifications and How They Relate to AMS2750

Dissecting an Aircraft: Easy To Take Apart, Harder To Put Back Together 

Curious to know how the components of an aircraft are assessed and reproduced? Such knowledge will give you assurance that you can keep flying safely and know that you're in good hands. The process of dissecting an aircraft, known as reverse engineering, can provide insights into the reproduction of an aerospace component, as well as a detailed look into just what goes into each specific aircraft part.

This article, written by Jonathan McKay, heat treat manager at Thomas Instrument, examines the process, essential steps, and considerations when conducting the reverse engineering process.

See the full article here: Reverse Engineering Aerospace Components: The Thought Process and Challenges

Laser Heat Treating: The Future for EVs?

If you are one of the growing group of North Americans driving an electric vehicle, you may be wondering how, and how well, the components of your vehicle are produced. Electric vehicles (EVs) are on the rise, and the automotive heat treating world is on the lookout for ways to meet the demand efficiently and cost effectively. One potential solution is laser heat treating.

Explore this innovative technology in this article composed by Aravind Jonnalagadda (AJ), CTO and co-founder of Synergy Additive Manufacturing LLC. This article offers helpful information on the acceleration of EV dies, possible heat treatable materials, and the process of laser heat treating itself. Read more to assess the current state of laser heat treating, as well as the future potential of this innovative technology.

See the full article here: Laser Heat Treating of Dies for Electric Vehicles

When the Rubber Meets the Road, How Confident Are You?

Reliable and repeatable heat treatment of automotive parts. Without these two principles, it’s hard to guarantee that a minivan’s heat treated engine components will carry the family to grandma’s house this Thanksgiving as usual. Steve Offley rightly asserts that regardless of heat treat method, "the product material [must achieve] the required temperature, time, and processing atmosphere to achieve the desired metallurgical transitions (internal microstructure) to give the product the material properties to perform its intended function."

TUS surveys and CQI-9 regulations guide this process, though this is particularly tricky in cases like continuous furnace operations or in carburizing operations. But perhaps, by leveraging automation and thru-process product temperature profiling, data collection and processing can become more seamless, allowing you better control of your auto parts. Explore case studies that apply these two new methods for heat treaters in this article.

See the full article here: Discover the DNA of Automotive Heat Treat: Thru-Process Temperature Monitoring





Heat Treat Radio #91: Understanding the ±0.1°F Requirement in AMS2750, with Andrew Bassett

Where did the ±0.1°F AMS2750 requirement come from and how should heat treaters approach this specification, an important change that entails major buy-in? Andrew Bassett, president and owner of Aerospace Testing and Pyrometry, was at the AMS2750F meeting. He shares the inside scoop on this topic with Heat Treat Today and what he expects for the future of this standard.

Heat Treat Radio podcast host and Heat Treat Today publisher, Doug Glenn, has written a column on the topic, which you can find here; read it to understand some of the background, questions, and concerns that cloud this issue.

Below, you can watch the video, listen to the podcast by clicking on the audio play button, or read an edited transcript.


 



The following transcript has been edited for your reading enjoyment.

Doug Glenn: Andrew Bassett, president and owner of Aerospace Testing and Pyrometry, Inc., somewhere in eastern Pennsylvania. We don’t know because you’re on the move! What is your new address, now, by the way?


Andrew Bassett: We are in Easton, Pennsylvania at 2020 Dayton Drive.

Doug Glenn: Andrew, we want to talk a bit about this ±0.1°F debate that is going on. It was actually precipitated by the column that I wrote that is in the February issue.

I just wanted to talk about that debate, and I know that you’ve been somewhat involved with it. So, if you don’t mind, could you give our listeners a quick background on what we are talking about, this ±0.1°F debate.

Andrew Bassett: To be honest with you, being part of the AMS2750 sub team, one of the questions came up for us during the Rev F rewrite was this 0.1°F readability — wanting to kind of fix this flaw that’s been in the standard ever since the day that AMS2750 came out. With instrumentation, for instance, you have ±2°F (the equivalent would be 1.1°C). At 1.1°C, the question became, If your instrumentation does not show this 0.1 of a degree readability, how can you show compliance to the standards?

Andrew Bassett
President
Aerospace Testing and Pyrometry
Source: DELTA H

Then, it morphed into other issues that we’ve had in the previous revisions where we talk about precise temperature requirements, like for system accuracy testing: You’re allowed a hard number ±3° per Class 2 furnace or 0.3% of reading, whichever is greater. Now, we have this percentage. With anything over 1000°F, you're going to be able to use the percentage of reading to help bring your test into tolerance. In that example, 1100°F, you’re about 3.3 degrees. If your instrumentation doesn’t show this readability, how are you going to prove compliance?
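The "whichever is greater" arithmetic Bassett describes can be sketched directly (a reader's illustration, not text from the standard):

```python
def sat_tolerance_f(reading_f, hard_limit_f=3.0, pct=0.003):
    # Permitted SAT difference: a hard +/-3°F or 0.3% of reading,
    # whichever is greater (the Class 2 example from the discussion).
    return max(hard_limit_f, reading_f * pct)

low = sat_tolerance_f(900.0)    # below 1000°F the hard +/-3°F governs
high = sat_tolerance_f(1100.0)  # 0.3% of 1100°F = 3.3°F, as in the example
```

The crossover sits at 1000°F: above it, the percentage term produces fractional-degree tolerances like 3.3°F, which is exactly why instrument readability to 0.1° becomes a compliance question.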

That’s what it all morphed into. Originally, the first draft that we proposed in AMS2750F was that all instrumentation had to have 0.1°F readability. We got some feedback (I don’t know if I want to say “feedback” or "pitchforks and hammers") that this would be cost-prohibitive; most instrumentation doesn't have that readability, and it would be really costly to go out and try to do this. We understood that. But, at the end of the day, we said: The recording device is your permanent record, and so that’s what we’re going to lean on. But we still had a lot of pushback.

We ended up putting a poll out to AMEC and the heat treating industry to see what their opinions were. We said that with the 0.1 readability (when it came to a percentage reading), recording devices would read hard tolerances. So, for instance, an SAT read at 3° would be just that, not "or .3% of reading."

There was a third option that we had put out to the community at large, and it came back as the 0.1° readability for digital recorders, so that’s where we ran with the 0.1° readability.

When it was that big of an issue, we didn’t make the decisions ourselves; we wanted to put it out to the rest of the community. My guess is not everyone really thought the whole thing through yet. Now people are like, ok, well now I need to get this 0.1° readability.

Again, during the meetings, we heard the issues. Is 0.1° going to really make a difference to metal? If you have a load thermocouple that goes in your furnace and it reads 0.1° over the tolerance, does it fail the load? Well, no, metallurgically, we all know that’s not going to happen, but there’s got to be a line in the sand somewhere, so it was drawn at that.

"...that hard line in the sand had to be drawn somewhere..."
Source: Unsplash.com/Willian Justen de Vasconcellos

That’s a little bit of the background of the 0.1° readability.

Doug Glenn: So, basically, we’re in a situation, now, where people are, in fact (and correct me if I’m wrong here), potentially going to fail SATs or tests on their system because of a 0.1° reading, correct? I mean, it is possible, correct?

Andrew Bassett: Yes. So, when the 0.1° readability came out in Rev F, we gave it a two-year moratorium that with that requirement, you still had two more years. Then, when Rev G came out, exactly two years to the date, we still had a lot of customers coming to us, or a lot of suppliers coming back to us, and saying, “Hey, look, there’s a supply shortage on these types of recorders. We need to buy some time on this.” It ranged from another year to 10 years, and we’re like — whoa, whoa, whoa! You told us, coming down the pike before, maybe you pushed it down the road, whatever, probably Covid put a damper on a lot of people, so we added another year.

So, as of June 30th of 2023, that requirement is going to come into full play now. Like it or not, that’s where the standard sits.

Doug Glenn: So, you’re saying June 30th, 2023?

Andrew Bassett: Yes.

Doug Glenn: Alright, that’s good background.

I guess there were several issues that I raised. First off, you’ve already hit on one. I understand the ability to be precise, but in most heat treatment applications, one degree is not going to make a difference, right? So, why do we push for a 0.1° when 1° isn’t even going to make a difference?

Andrew Bassett: We know that, and it’s been discussed that way. But, again, that hard line in the sand had to be drawn somewhere, and that was the direction the community wanted to go with, so we went with that. Yes, we understand that in some metals, 10 degrees is not going to make a difference, but we need to have some sort of line in the sand and that's what was drawn.

Doug Glenn: So, a Class 1. I was thinking the lower number was a tighter furnace. So, a Class 1 (±5), and you’re saying, that’s all the furnace is classified for, right, ±5? So, if you get a reading of 1000°, it could be 1005° or it could be 995°. Then, you’re putting on top of that the whole idea that your temperature reading has got to be down to 0.1°. There just seems to be some disconnect there.

So, that was the first one. You also mentioned the instrumentation. It’s been pointed out to me, by some of the instrumentation people, that their instruments are actually only reading four digits. So up to 999.9 you actually have a point, but if it goes to 1000°, you’re out of digits; you can’t even read that. I mean, they can’t even read that down to a point.

"So, if you get a reading of 1000°, it could be 1005° or it could be 995°."
Source: Unsplash.com/Getty Images

Andrew Bassett: Correct. On the recording side of things, we went away from analog instrumentation. The old chart papers, that’s all gone, and we required the digital recorders with that 0.1° readability, as of June 30th of this year.

Again, the first draft was all instrumentation. That would be your controllers, your overtemps, and we know that limitation. But everyone does have to be aware of it. We still allow for this calibration of ±2 or 0.2%. If you’re doing a calibration, let’s say, on a temperature control on a calibration point at 1600° and the instrument only reads whole numbers, you can use the percentage, but you would have to round it inward. Let’s use 1800°, that would be an easier way to do it. So, I’m allowed ±2 or 3.6° if I’m using the percentage of reading, but if the instrument does not read in decimal points for a controller or overtemp, you would have to round that down to ±3°.

Doug Glenn: ±3, right; the 0.6° is out the window.
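The inward-rounding rule from this exchange can be sketched as follows (an illustration of the logic, not the standard's wording):

```python
import math

def calibration_tolerance_f(cal_temp_f, readability_decimals,
                            hard_limit_f=2.0, pct=0.002):
    # +/-2°F or 0.2% of reading, whichever is greater -- but if the
    # instrument displays fewer decimal places, round the tolerance
    # *inward* (down) so you never claim more than it can show.
    tol = max(hard_limit_f, cal_temp_f * pct)
    factor = 10 ** readability_decimals
    return math.floor(tol * factor) / factor

with_decimal = calibration_tolerance_f(1800.0, 1)   # 0.2% of 1800 = 3.6°F
whole_degrees = calibration_tolerance_f(1800.0, 0)  # rounds inward to 3°F
```

The 0.6° that disappears for a whole-degree instrument is the footnote Bassett goes on to mention.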

Andrew Bassett: Correct. I shouldn’t say we like to bury things in footnotes, but this was an afterthought. In one of the footnotes, in one of the tables, it talks about instrumentation calibration that people need to be aware of.

Doug Glenn: Let’s just do this because I think we’ve got a good sense of what the situation is, currently. Would you care to prognosticate about the future? Do you think this is going to stand? Do you think it will be changed? What do you think? I realize you’re speaking for yourself, here.

Andrew Bassett: I’m conflicted on both sides. I want to help the supply base with this issue but I’m also on the standards committee that writes the standard. I think because we’re so far down the road, right now — this requirement has been out there since June 2022 — I don’t see anything being rolled back on it, at this point. I think if we did roll it back, we have to look at it both ways.

If we did roll this back and say, alright, let’s just do away with this 0.1° readability issue, we still have to worry about the people processing in Celsius. Remember, we’re pretty much the only country in the world that processes in Fahrenheit; the rest of the world has probably been following these lines all along. And if we rolled it back, just think about all the people that made the investment and moved forward on the 0.1° readability. They would come back and say, “Wait a minute. We just spent $100,000 on upgrading our systems and now you’re rolling it back. That’s not fair to us.”

At this point, with the ball already rolling, it will be very interesting to see, once Nadcap starts publishing the audit findings on pyrometry and this 0.1° readability, how many suppliers are being hit on this requirement; that would give us a good indication. If there are a lot of findings, then obviously a lot of suppliers haven’t gone down this road. My guess is that, for the most part, anybody that’s Nadcap accredited in heat treating (and this goes across chemical processing, coatings, and a few other commodities) has caught up to this.

Personally, I don’t think this is going to go away; it’s not going to disappear. It’s going to keep going down this road. Maybe, if people are still struggling to get the types of devices that have that 0.1° readability, there could be another year’s extension, but I don’t know where that stands right now. I haven’t gotten enough feedback from aerospace customers saying, "Hey, I can’t get the recorder."

Doug Glenn: I just don’t understand, Andrew, how it’s even physically possible for companies to record something as accurately as 0.1° when the assembly or thermocouple wire is rated at ±2°. How can you ask somebody to be accurate down to 0.1° when the thing itself is only accurate to ±2°?

Andrew Bassett: Right, I get that. We can go a lot further with that and start talking about uncertainty budgets. If you look at any reputable thermocouple manufacturer’s or ISO 17025 instrument calibration report, they have to list out their measurement uncertainties, and that gives you only 98% confidence that you’re going to be within that accuracy statement.
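Uncertainty contributors like these are typically combined in quadrature, that is, as a root sum of squares. A hypothetical sketch follows; the contributor names and values are made up for illustration and are not from any real calibration report:

```python
import math

# Hypothetical root-sum-square combination of uncertainty contributors,
# in degF; the values below are illustrative only.
contributors = {
    "thermocouple wire": 2.0,
    "reference standard": 0.5,
    "readout instrument": 1.0,
}

combined = math.sqrt(sum(u ** 2 for u in contributors.values()))
print(round(combined, 2))  # 2.29
```

Even before any coverage factor is applied, the combined uncertainty here already exceeds the ±2° wire rating on its own, which is Andrew’s point: the budget, not any single component, sets what the system can honestly claim.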

Yes, I get the whole issue of this 0.1° readability. The intentions were good, to fix a flaw, and it spiraled. We’ve seen where PLCs and some of these high-end logic controllers can now show the 0.1° readability, but they automatically round up at 0.5°. Are you then violating the other requirement, rounding per ASTM E29? I think we’ve since closed that hole in the standard, but you’re right. We were trying to do the right thing. Personally, I don’t think we gave it much further thought beyond: hey, let’s just make recorders this way and this should be okay.
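The rounding conflict Andrew mentions can be seen directly with Python’s decimal module (an illustrative sketch: ASTM E29’s rounding method rounds a trailing half to the even digit, while many controllers simply round halves up):

```python
from decimal import Decimal, ROUND_HALF_EVEN, ROUND_HALF_UP

def e29_round(x: str) -> Decimal:
    # Round to whole degrees, half-to-even, per ASTM E29's rounding method
    return Decimal(x).quantize(Decimal("1"), rounding=ROUND_HALF_EVEN)

def half_up_round(x: str) -> Decimal:
    # Round to whole degrees, half always up (typical controller behavior)
    return Decimal(x).quantize(Decimal("1"), rounding=ROUND_HALF_UP)

print(e29_round("1600.5"), half_up_round("1600.5"))  # 1600 1601
print(e29_round("1601.5"), half_up_round("1601.5"))  # 1602 1602
```

The two methods only disagree when the displayed value lands exactly on a half, but that is precisely the case the 0.1° readability requirement was meant to pin down.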

Doug Glenn: Right. No, that’s good. Let me be clear: I think everybody involved with the standards is excellent and trying to do the right thing; there is no dissing on anybody here. I’m not a furnace guy, I’m a publisher, but when I look at it, I’m going: okay, you’re asking somebody to be as accurate as 0.1° on equipment that can only do ±2°. That’s a 4° swing, and you’re asking them to be within 0.1°, basically.

Andrew, this has been helpful. It’s been good hearing from you because you’re on the frontline here. You’ve got one foot firmly planted in both camps.

Andrew Bassett: I’m doing my best to stay neutral with it all.

Doug Glenn: Anyhow, I appreciate it, Andrew. You’re a gentleman. Thanks for taking some time with us.

Andrew Bassett: Thanks, Doug. Appreciate it.


About the expert: Andrew Bassett has more than 25 years of experience in the field of calibrations, temperature uniformity surveys, and system accuracy testing, as well as expertise in pressure, humidity, and vacuum measurement calibration. Prior to founding Aerospace Testing & Pyrometry, Andrew held positions as Vice President of Pyrometry Services and Director of Pyrometry Services for a large commercial heat treater and as Vice President and Quality Control Manager for a small family-owned business.

For more information: Andrew Bassett at abassett@atp-cal.com or visit http://www.atp-cal.com/

Doug Glenn at Doug@heattreattoday.com


 

Doug Glenn
Publisher
Heat Treat Today


To find other Heat Treat Radio episodes, go to www.heattreattoday.com/radio.


Search heat treat equipment and service providers on Heat Treat Buyers Guide.com


 

Heat Treat Radio #91: Understanding the ±0.1°F Requirement in AMS2750, with Andrew Bassett

Guide To Conducting SATs According to CQI-9 4th Edition

The AIAG CQI-9 (Heat Treat System Assessment) is the most accepted standard in the automotive industry for the validation of heat treatment operations. This article summarizes the evaluation requirements and illustrates the benefits of conducting this test to identify variations in control systems using probe method A.

Read the English translation of this Technical Tuesday article by Erika Zarazúa, regional purchasing manager at Global Thermal Solutions, in the version below, or read both the Spanish and the English translation of the article where it was originally published: Heat Treat Today's August 2022 Automotive print edition.

"The AIAG CQI-9 (Heat Treat System Assessment) evaluation is the most accepted standard in the automotive industry. . . ."


Erika Zarazúa
Regional Purchasing Manager 
Global Thermal Solutions México
Source: Global Thermal Solutions México

1. Application

System Accuracy Tests (SATs) must be performed on all control, monitoring, and recording systems of thermal processing equipment. This does not apply to “high limit” systems, whose sole function is to protect the furnace from overheating.


The test thermocouple used for the SAT must meet the accuracy requirements defined by CQI-9 in table P3.1.3 of the pyrometry section (±1.1°C or ±2°F maximum error). Similarly, table P3.2.1 of the same section defines the requirements for the field test instrument (±0.6°C or ±1°F maximum error).

SATs conducted by “probe method” should be performed quarterly or after any maintenance that could affect the accuracy of the measurement system such as:

  • Replacement of lead wire
  • Replacement of the control thermocouple
  • Replacement of the control/recording instrument

2. Procedure (Probe Method A)

Probe method A is a comparison between the furnace temperature reading and a corrected test temperature reading.

Table 1. Probe method A

When inserting the test thermocouple, ensure that the tip of the probe is placed as close as possible to the tip of the thermocouple being tested, and no further than 50 mm away. Once placed in the test position, it is recommended to allow some time for both systems to reach equilibrium before conducting the test.

If the difference between the furnace temperature reading and corrected reading of the test system exceeds ±10°F (±5°C), then corrective actions must be conducted before processing a product. The most common corrective actions are to replace the control thermocouple, calibrate and adjust the control/recording instrument, or to combine both methods. According to CQI-9, these actions must be documented.
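The probe-method-A pass/fail arithmetic described above can be sketched as follows (a hypothetical worked example; the readings and correction factors are illustrative, and the real CQI-9 worksheet contains more fields than shown here):

```python
# Hypothetical probe-method-A check: corrected test reading vs. furnace
# reading, against the +/-10 degF (+/-5 degC) limit described above.
def sat_difference(furnace_reading: float,
                   test_reading: float,
                   tc_correction: float,
                   instrument_correction: float) -> float:
    """Furnace reading minus the corrected test reading, in degF."""
    corrected = test_reading + tc_correction + instrument_correction
    return furnace_reading - corrected

diff = sat_difference(furnace_reading=1652.0,
                      test_reading=1649.0,
                      tc_correction=1.5,
                      instrument_correction=0.5)
print(diff)               # 1.0
print(abs(diff) <= 10.0)  # within the +/-10 degF limit -> True
```

If the absolute difference exceeded 10°F, the equipment would be pulled from production until the corrective actions listed above were completed and documented.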

3. Records

CQI-9 revision 4 specifies that the SAT must be documented, and the records must include, at a minimum, the following information:

a. Furnace thermocouple identification
b. Test thermocouple identification
c. Identification of the test instrument
d. Date and time of the test
e. Setpoint value
f. Reading observed in the control system
g. Reading observed in the test system
h. Thermocouple and test instrument correction factors
i. Test system corrected reading
j. Difference calculated from the SAT
k. Name and signature of the technician performing the test
l. Company performing the test (if external)
m. ISO/IEC 17025 accreditation of the company (if external)
n. Approval of the person responsible for heat treatment

4. Conclusion

The pyrometry section of CQI-9 lists the requirements and procedures for conducting system accuracy tests (Section P3.3). Within CQI-9, there are two important requirements heat treaters must be aware of. First, the furnace temperature measurement system must not deviate more than ±10°F (±5°C) from the test system; if it does, the equipment must not be used for thermal processing and corrective actions must be taken. Second, CQI-9 specifies the information the SAT report must contain each time this test is conducted. With probe method A, variations in control systems are easily identifiable.

 

References

[1] CQI-9 Special Process: Heat Treat System Assessment, 4th Edition. Automotive Industry Action Group, 2020.

[2] International Organization for Standardization; ISO/IEC 17025, General requirements for the competence of testing and calibration laboratories, 3rd Edition. International Organization for Standardization, 2017.

(Photo source: Global Thermal Solutions)

 

About the Author: Erika Zarazúa, a 40 Under 40 Class of 2021 member, is a metallurgical engineer with over 18 years of experience in heat treatment operations and temperature measurement and has worked in multiple engineering, quality, and project roles in the automotive and aerospace industries. Erika currently holds the position of regional purchasing manager at Global Thermal Solutions.

Contact Erika: erika@globalthermalsolutions.com


Find heat treating products and services when you search on Heat Treat Buyers Guide.com


 


Guía para conducir pruebas System Accuracy Tests conforme a CQI-9 4ta. Edición

The AIAG CQI-9 (Heat Treat System Assessment) is the most accepted standard in the automotive industry for the validation of heat treatment operations. This article summarizes the evaluation requirements and illustrates the benefits of conducting this test to identify variations in control systems using probe method A.

Below is a version of this article by Erika Zarazúa, regional purchasing manager at Global Thermal Solutions México; read both the Spanish and the English translation of the article where it was originally published: Heat Treat Today's August 2022 Automotive print edition.

The AIAG CQI-9 (Heat Treat System Assessment) evaluation is the most accepted standard in the automotive industry for the validation of heat treatment operations and, among many things, describes the general requirements and the procedure for conducting SATs (System Accuracy Tests) on the temperature measurement systems of thermal processing equipment. This article summarizes the assessment requirements and illustrates the benefits of conducting this test to identify variations in control systems using probe method “A.”


Erika Zarazúa
Regional Purchasing Manager
Global Thermal Solutions México
Source: Global Thermal Solutions México

1. Application

SATs must be performed on all control, monitoring, and recording systems of thermal processing equipment. This does not apply to “high limit” systems, whose sole function is to protect the furnace from overheating.


The test thermocouple used for the SAT must meet the accuracy requirements defined by CQI-9 in table P3.1.3 of the pyrometry section (±1.1°C or ±2°F maximum error). Likewise, table P3.2.1 of the same section defines the requirements for the field test instrument (±0.6°C or ±1°F maximum error).

SATs by the probe method must be performed quarterly or after any maintenance that could affect the accuracy of the measurement system, such as:

  • Replacement of the extension (lead) wire
  • Replacement of the control thermocouple
  • Replacement of the control/recording instrument

2. Procedure (Probe Method A)

Probe method A is a comparison between the reading of the furnace measurement system and a corrected test measurement system:

Table 1. Probe method A

When inserting the test thermocouple, ensure that the tip is placed as close as possible to the tip of the thermocouple being tested, and no further than 50 mm away. Once placed in the test position, it is recommended to allow some time for both systems to reach equilibrium before conducting the test.

If the difference between the furnace measurement system and the corrected test system exceeds ±5°C (±10°F), corrective actions must be taken before processing product. The most common corrective actions are to replace the control thermocouple, to calibrate and adjust the control/recording instrument, or a combination of both. According to CQI-9, these actions must be documented.

3. Records

CQI-9 revision 4 specifies that the SAT must be documented and that the records must include, at a minimum, the following information:

a. Furnace thermocouple identification
b. Test thermocouple identification
c. Identification of the test instrument
d. Date and time of the test
e. Setpoint value
f. Reading observed in the control system
g. Reading observed in the test system
h. Thermocouple and test instrument correction factors
i. Corrected reading of the test system
j. Difference calculated from the SAT
k. Name and signature of the technician performing the test
l. Company performing the test (if external)
m. ISO/IEC 17025 accreditation of the company (if external)
n. Approval of the person responsible for heat treatment

4. Summary

The pyrometry section of CQI-9 revision 4 states the requirements and the procedure for performing the SAT (Section P3.3).

The furnace temperature measurement system must not deviate more than ±5°C (±10°F) from the test system. If it does, the equipment must not be used for thermal processing and corrective actions must be applied.

CQI-9 specifies the information that the SAT report must contain each time this test is conducted.

 

References

[1] Automotive Industry Action Group; CQI-9 Special Process: Heat Treat System Assessment, 4th Edition, June 2020.

[2] International Organization for Standardization; ISO/IEC 17025, General requirements for the competence of testing and calibration laboratories. 3rd Edition, 2017.

(Photo source: Global Thermal Solutions)

About the author: Erika Zarazúa is a metallurgical chemical engineer from the Universidad Autónoma de Querétaro. With more than 18 years of experience in heat treatment operations and temperature measurement, she has worked in multiple engineering, quality, and project roles in the automotive and aerospace industries. She currently holds the position of regional purchasing manager at Global Thermal Solutions.

Contact Erika: erika@globalthermalsolutions.com




 

