Dr. Steve Offley

Heat Treat Radio #131: Beyond Calibration — Real-World Accuracy in Heat Treat Measurement


What does it really take to achieve accurate temperature measurement in real heat treat production? In this episode of Heat Treat Radio, host Heather Falcone sits down with Dr. Steve Offley, product marketing manager at PhoenixTM, to explore the science behind thru-process monitoring, thermal barriers, and data logger performance. From cold junction compensation to real-world shop floor challenges, they unpack why lab accuracy doesn’t always translate to production — and what heat treaters can do about it. Tune in to learn how to ensure your temperature data is as reliable as the parts you produce.

Below, you can watch the video, listen to the podcast by clicking on the audio play button, or read an edited transcript.




The following transcript has been edited for your reading enjoyment.

Introduction (00:04)

Heather Falcone: Today we are talking about a feature article coming up in this month’s magazine, Achieving Accurate Measurements in Real Heat Treat Production. Joining me today is author of this piece, Dr. Steve Offley from PhoenixTM, who is also our sponsor for today’s episode.

Steve is a product marketing manager at PhoenixTM with responsibility for both strategic product management and global marketing of the company’s thru-process temperature, optical profiling, and TUS system product range. Steve joined PhoenixTM in April of 2018 after 22 years of experience in the industrial temperature profiling market with another well-known company.

The Role of PhoenixTM (2:35)

Heather Falcone: Tell us about your role at PhoenixTM and the role of PhoenixTM, specifically how they provide solutions for the thermal processors out there.

Steve Offley: I am the product marketing manager or product manager for the range of temperature monitoring systems that we offer to the wider industrial space. We provide support for clients in a range of industries who are faced with the daily challenges of using thermal processing as part of their key manufacturing step. We offer unique solutions for those specific applications, because not every application is the same. Our goal is to allow the customer to monitor the temperature of their specific product in some form of heat treatment process.

For instance, we could be offering a solution for the coating market, where a client wants to monitor the thermal cure of a car body. They want to ensure that that car body, as it travels through the curing oven, is achieving the correct temperature, not just in the oven itself, but at the product level. So is each part of that car body achieving the right temperature for the right duration to cure the paint?

Another day we might be dealing with a food processor who, as you can imagine, when they’re dealing with food safety and HACCP requirements, they want to prove that the core of their product, which may be a chicken fillet in a deep fat fryer, is achieving the right temperature, to make sure that it’s safe and it’s an attractive product to eat. And of course, they want to be confident that the consumer is going to be healthy after consuming the product too.

In the buildings and ceramics industry, for instance, we can offer the same sort of solution for the manufacturers of bricks, building materials, tiles, etc., where the process may actually be up to three or four days long where they’re drying the products. But it’s still critical to know what the temperature is at the product level.

Much of our focus is on heat treatment of metals. We are trying to provide different solutions across the whole gamut of the heat treating industry — from primary production, such as slab heat treatment for steel and for aluminum, proving that the raw material has been processed correctly in the furnace, to the finished product.

We are talking about the formed metal product, making sure that it is achieving the right primary metallurgical properties so it can perform the function it’s going to be used for. That might be from a temperature profiling perspective, or possibly a temperature uniformity survey (TUS), which is obviously critical in many of the automotive and aerospace sectors of the market where they’re trying to prove or validate the furnace performance.

What is Thru-Process Monitoring? (6:08)

Heather Falcone: Can you explain what thru-process is and how it influences the monitoring technology that you’re talking about?

Click the image above for an introduction to the thru-process concept and data logger basics.

Steve Offley: “Thru-process” is the term that is key to the type of solution we are trying to provide. When we’re talking about heat treatment, there are still many applications where the product may be heat treated in a static box furnace, in which case the traditional technology of using trailing thermocouples is probably as easy as any other, whereby you have your field test instrument external to the furnace chamber. The thermocouples are then trailed into the furnace, attached either to the product or, if you’re doing a temperature uniformity survey, to a test frame that locates the thermocouples at the desired coordinates within the working zone of the furnace. Then you are collecting the data externally.

I believe it was two episodes ago that you had Dennis from ECM talking about modular heat treatment. He was talking about the challenges, and the increased level of technology, associated with moving batches of products around the heat treating cycle in a modular approach.

When you have that type of setup, and even in situations where you may be heat treating in a continuous furnace, the use of a trailing thermocouple becomes difficult at best, impractical and problematic in terms of safety at worst. For the modular approach, you have thermocouples going into different chambers and moving around. There are seals and automated doors in which the thermocouples will be trapped. As such, it’s very difficult to actually monitor the whole sequence of events that may be occurring in the heat treatment sequence.

Traditional vs. Thru-Process Monitoring

This brings us back to the thru-process methodology. At Phoenix, we offer a system that is designed to travel as if it was part of the product basket through the process. The field test instrument, the data logger, has to travel with the product through the furnace.

A data logger in its own right is not capable of going through a furnace if you are measuring at 800°F to 1000°F. One of the key aspects of our system solution is what we refer to as the thermal barrier. It is an enclosure that is used to protect the data logger to allow it to travel through the process. Essentially you encase the data logger inside the barrier and then place it on the conveyor or in the product basket, with short thermocouples that are then routed to the product or to the test frame that is being moved with the whole monitoring system through the process.

The Importance of Thermal Barrier Design (9:38)

Heather Falcone: The thermal barrier design is really important then, because you’re going to see a variety of environments. How do you protect the data logger?

Steve Offley: That’s the crux of the technology that we’re trying to provide, in so much that there are many different forms of heat treatment or many different forms of thermal processing where we’re trying to provide the protection we need.

You may have, for instance, a low pressure carburizing process where you’re putting the system into a vacuum furnace, and then you may have a high pressure quench at the end. You have to protect the logger not just from the temperature inside the furnace but also from things like pressure changes, which can distort the equipment. That calls for one type of barrier design, which gives additional protection to prevent any distortion or compressional damage to the barrier.

Click on the image above to find a real-world companion to Dr. Offley’s barrier design examples, covering oil and water quench protection in practice.

There may be some circumstances, like with a T6 aluminum process, where you send the system through the furnace and then into a water quench. Under the thru-process principle, the equipment has to go through all aspects of the process. You therefore have to have a design in which the system can tolerate both the heating process and the rapid cooling going into the water.

You may also have a situation where you have an endothermic carburizing furnace with an integrated oil quench. The same approach applies. You are going from a hot environment and then rapid cooling. You are not only protecting the logger from the damage of the heat, but also from the quench media themselves, the oil or the water, so as not to damage the fairly sophisticated electronics of the data logger.

There is a lot of science and technology involved in designing unique solutions to meet the specific requirements of the applications. In most cases, we are working with the client from their working spec to develop unique solutions that will meet their unique requirements.

Protection and Accuracy (13:36)

Heather Falcone: When we’re talking about actually monitoring the surveys, what special measures are required in the design of the data loggers to provide accuracy?

Steve Offley: By the very nature that we are sending the data logger through the furnace, we have to be careful that we are not only protecting the data logger from physical damage, which is possible if we do not get the thermal barrier design correct, but also guaranteeing that we are achieving the accurate data we need to make sense of the profile information we are getting. Because at the end of the process, you either have the thermal fingerprint of your process or, if you’re doing a temperature uniformity survey, the readings at the respective test levels. According to the CQI-9 and AMS2750 standards, the accuracy of the reading from the field test instrument has to be within ±1.0°F.

The purpose of the barrier is to not only protect the data logger from damage, but keep the data logger at a working temperature that allows the accuracy of the reading that conforms to the standards that you are working to.

There are different designs of thermal barriers that we can offer. The basic design is what we refer to as microporous insulation technology. This is basically a dry barrier whereby the insulation slows down the penetration of heat to the core of the barrier where the data logger is. At the center of that barrier, there will be a device that we refer to as a heat sink. There is a eutectic salt inside the heat sink, which will change its physical state from a solid to a liquid at a nominal temperature. The transition occurs at 58°C (about 136°F), and that will maintain the core at that working temperature.

For longer processes, you may want to use a thermal barrier that uses what we call a phased evaporation protection methodology. In simple terms, it involves the use of water, which is able to absorb very large amounts of energy and heat, and which will boil at 212°F (100°C). While it maintains that boiling state, it will maintain the temperature of the thermal barrier and the data logger inside it. So we can actually offer a high temperature data logger that is capable of operating safely at 212°F for long periods of time and still be protected.
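As a rough back-of-the-envelope illustration of why water makes such an effective heat sink, the sketch below estimates how long an evaporating water reservoir could hold that 212°F plateau. The water mass and heat ingress rate used here are illustrative assumptions only, not PhoenixTM barrier specifications:

```python
# Rough estimate of how long a phased-evaporation barrier holds its data
# logger at the boiling plateau. The water mass and heat ingress rate
# below are HYPOTHETICAL illustration values, not product specifications.

LATENT_HEAT_VAPORIZATION = 2_257_000  # J/kg, water at 100 degC (212 degF)
SPECIFIC_HEAT_WATER = 4186            # J/(kg*K)

def protection_time_hours(water_kg, start_temp_c, heat_ingress_w):
    """Hours until the reservoir boils dry: energy to reach boiling plus
    energy to vaporize all the water, divided by the heat leak rate."""
    heat_to_boiling = water_kg * SPECIFIC_HEAT_WATER * (100.0 - start_temp_c)
    heat_to_dry = water_kg * LATENT_HEAT_VAPORIZATION
    return (heat_to_boiling + heat_to_dry) / heat_ingress_w / 3600.0

# Example: 2 kg of water starting at 20 degC, with an assumed 500 W
# leaking through the insulation, sustains the plateau for ~2.9 hours.
print(round(protection_time_hours(2.0, 20.0, 500.0), 1))  # 2.9
```

The dominant term is the latent heat of vaporization, which is why an evaporating water tank outlasts a dry insulation barrier of similar size.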

Thermocouple Use (17:00)

Steve Offley: As long as the barrier provides us with that thermal protection and the logger is working within its operating range, we are fairly safe. That being said, we have to be a little bit careful when we consider the technology of the thermocouple, because there’s some fairly serious restrictions on thermocouple use, which many people may or may not be aware of.

Many people know that thermocouple technology goes back to Thomas Seebeck in about 1821. He was a German physicist who discovered that if you had two dissimilar metals connected at a junction, at a particular temperature those two dissimilar metals would create a millivolt reading, and that millivolt reading would be proportional to the actual temperature the junction was experiencing. Hence the theory of the thermocouple.

Dr. Steve Offley showing the Alumel and Chromel legs of a type K thermocouple.

Most people are fully aware of what a thermocouple looks like, but it’s important to note that this is a type of thermocouple we’d use for a coating application. It has a PFA-insulated sleeving on it. You would not use this in many heat treatment applications, but what I want to do is show you that in the core of the thermocouple, there are two wires, two dissimilar metals. This is a type K thermocouple. We have the Alumel and the Chromel legs of the thermocouple. These are the unique materials that are used to generate the millivolt reading.

The way the thermocouple works, then, is that the millivolt reading can be cross-referenced to a calibration table or a voltage table to determine the temperature reading of the sensor. The very tip of the thermocouple is what we refer to as the hot junction. It’s critical that that point is where you want the measurement to be made.

What is often missed is the fact that with a thermocouple, although the hot junction is critical, there is another junction that is possibly even more critical and sometimes overlooked — the cold junction. The thermocouple does not actually record an absolute reading; it measures the difference between the hot junction and the cold junction. The cold junction of a thermocouple is where the actual thermocouple materials, the two dissimilar metals, join what we refer to as the copper connection. This tends to be where, in the data logger or the field test instrument, the electronics make the physical measurement or process the actual reading from the thermocouple.
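The point that a thermocouple reads a temperature difference, not an absolute value, can be sketched with a deliberately simplified linear model. The single average Seebeck coefficient below is an approximation for type K; real instruments use the full reference tables:

```python
# Simplified illustration that a thermocouple's output reflects the
# DIFFERENCE between hot and cold junctions, not an absolute reading.
# The single average type K coefficient is an approximation; real
# instruments use the standardized thermocouple reference tables.

SEEBECK_K_UV_PER_C = 41.0  # approx. average type K sensitivity, uV/degC

def emf_microvolts(hot_c, cold_c):
    """Linearized thermocouple EMF for the two junction temperatures."""
    return SEEBECK_K_UV_PER_C * (hot_c - cold_c)

# Same 800 degC hot junction, two different cold junction temperatures:
print(emf_microvolts(800.0, 25.0))  # 31775.0 uV
print(emf_microvolts(800.0, 60.0))  # 30340.0 uV: a different reading,
                                    # even though the process is unchanged
```

A logger that ignores its own (cold junction) temperature would therefore report two different process temperatures for the same furnace condition.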

If you have a fixed data logger like we have at Phoenix, whereby you would designate the type of thermocouple you were plugging into the data logger — this is a data logger with 20 channels and it’s a type K — I would plug my thermocouple simply into that connector. The cold junction is not in this case at the point where I’m making the connection on the data logger. It is actually inside the logger because there is another wire that goes from the socket to the PCB board where the measurement is actually taken.

Inside the data logger, there is a connector block where the thermocouple wires from the thermocouple sockets will all join the PCB board where the measurement is taken. That’s the location where the cold junction measurement is taken. So, we have our hot junction at the end of the thermocouple, and we have our cold junction inside the data logger.

A side-by-side comparison of the two critical measurement points in any thermocouple circuit: hot junction vs. cold junction.

For some data loggers, that connector, that cold junction, may actually be on the outside of the data logger, if it’s a universal connector. So it’s important that you understand where that cold junction is in-situ within your technology. The importance of the reading is the fact that you have a difference between the hot junction, where you are measuring the product, and the cold junction, where that physical measurement is being referenced inside the data logger.

You can imagine, therefore, if the data logger temperature changes, that change in data logger temperature can actually affect the reading you are taking inside your process. That is why it is important to either understand that and make changes so it’s prevented or do what we refer to as cold junction compensation.

Cold Junction Compensation (22:35)

Steve Offley: The data logger is protected only up to a certain physical temperature, and that temperature is going to change. The cold junction is therefore going to change as the system travels through the process, in the way that we do our measurements for thru-process monitoring. The logger will rise in temperature, so we have to compensate for that.

In the center of the data logger, where the connection is made from the thermocouple wire to the copper-copper connection, we have a temperature sensor, a thermistor, which is accurate to ±0.18°F. It measures the actual cold junction temperature of the logger and then compensates automatically for it. Therefore, you can guarantee that even when your data logger changes temperature, it is compensating for that, and there will be no drift in the temperature you are measuring at the hot junction at the product level.
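As a minimal sketch of the compensation arithmetic, using a deliberately simplified linear type K model (a real logger works from the full reference tables and the thermistor reading), adding the measured cold junction temperature back removes the drift:

```python
# Minimal sketch of cold junction compensation with a linearized type K
# model: the cold junction temperature (from the logger's thermistor) is
# added back, so the inferred hot junction temperature no longer drifts
# as the logger itself warms up. Approximation for illustration only.

SEEBECK_K_UV_PER_C = 41.0  # approx. average type K sensitivity, uV/degC

def hot_junction_temp(emf_uv, cold_junction_c):
    """Infer the hot junction temperature from the measured EMF plus the
    thermistor-measured cold junction temperature."""
    return emf_uv / SEEBECK_K_UV_PER_C + cold_junction_c

# Logger warms from 25 to 60 degC mid-survey; the raw EMF changes, but
# the compensated result stays at the true 800 degC process temperature:
print(hot_junction_temp(31775.0, 25.0))  # 800.0
print(hot_junction_temp(30340.0, 60.0))  # 800.0
```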

Data logger temperature change over process time, with and without cold junction compensation, measuring a stable process temperature of 1470°F | Image Credit: PhoenixTM, taken from Achieving Accurate Measurements in Real Heat Treat Production in the March print edition of Heat Treat Today.

Heather Falcone: That was the first time that I had read about the cold junction compensation and why it’s so critical, especially when we’re doing TUS activities.

Steve Offley: With TUS, the accuracy of both the field test instrument (the data logger) and the thermocouple is critical to the quality of the test data you are collecting, and obviously to complying with the very stringent requirements of the AMS2750 and CQI-9 standards.

In our case, where we are going through the furnace, we have a worst-case scenario because the data logger is naturally going to change in temperature. But even if we take the scenario to the shop floor, where we are doing an external temperature uniformity survey with the data logger sitting outside the furnace, cold junction compensation is still critical, because within a working day the shop floor temperature is generally going to be changing.

Events on the shop floor, like opening the furnace, are going to change that temperature. I’ve been in many plants where seasonal changes can make a significant difference to the ambient temperature in which you are taking the measurement. And it won’t be the first time, I’m sure, that people have taken equipment out of a car after traveling for many miles in the early hours of the morning, only to realize that the data logger is not at room temperature. You have to be very careful that you have a stable piece of equipment and that the cold junction is working correctly.

It’s important to read the user manual because there is often a very critical step to make sure that you are either calibrating the equipment in a real-life environment where the temperature change may occur, or ensuring that your system has cold junction compensation. Otherwise, what you believe is a true, accurate measurement may only reflect calibration laboratory accuracy, where the temperature is controlled very strictly. In a real-life situation, you may not see exactly the same results.

Heather Falcone: It is really important to consider because there are specific accuracy considerations for AMS2750 and CQI-9.

Steve Offley: I often make an analogy to racing. Formula One cars are finely tuned to perform highly in a racetrack environment. They can do 200 miles an hour, and they work well there. Take that same racing car off the track and into the countryside, and it is not going to work quite as effectively.

The same can be said for data logger technology. A data logger that works well in a calibration laboratory and under fairly safe conditions may or may not be working as effectively on the shop floor, particularly when you consider the variation and the challenges that that environment will bring to a measuring system like this.

Linear Interpolation Correction Factor Method (27:22)

Heather Falcone: Can you explain how you use the linear interpolation correction factor method, because that’s one of the only methods allowed by AMS2750, and why it is beneficial to your data quality?

Steve Offley: We discussed the nominal requirements for data logger measurement accuracy, but under AMS2750, logger correction factors and also thermocouple correction factors can be applied to the test data that you are collecting with your monitoring system.

Firstly, for the data logger, we can create a data logger correction factor file, which shows the correction factors that need to be applied to each of the separate channels of the data logger for the data you are collecting. Inside the data logger, we store the calibration information that was gathered in the calibration laboratory. That can then generate a calibration template, which is applied automatically to each one of the channels on the data logger as part of the test routine.

The last thing we want to do is to make some error by transferring raw data manually from a spreadsheet into a piece of software. So, the nice thing about that is that it’s automatically applying the pre-programmed offsets from the calibration routine in the laboratory itself.

Secondly, with the thermocouple, the traditional approach is to take a calibrated thermocouple with a nominal reading at the two ends of its working range and use the average correction factor. In some circumstances, people will apply a thermocouple correction factor from one nominal temperature below the test level they are applying. At Phoenix, we calibrate the thermocouples across the complete temperature range of the data logger. Then, we apply what we call the linear interpolation method. What that means is that between each pair of calibration points, we can calculate, by interpolating along a straight line, the true correction factor at any temperature over the measurement range of the device itself.

The linear interpolation schematic showing thermocouple correction factors across the full calibration range | Image credit: PhoenixTM, taken from Achieving Accurate Measurements in Real Heat Treat Production in the March print edition of Heat Treat Today.

It cannot go beyond the bounds of the upper and lower limits, as extrapolation is not allowed by the standard. But within those bounds, we can interpolate linearly between each pair of data points. With a tight 140°F between calibration points, we can ensure we are applying the correct correction factor at each temperature from start to finish, not a single nominal value over the whole range. In our view, that gives a far more accurate interpretation of the corrected data over the complete working range of the system.
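The interpolation logic described above can be sketched in a few lines of Python. The 140°F-spaced calibration points and the correction values below are hypothetical, purely for illustration; a real calibration certificate supplies the actual numbers:

```python
# Sketch of the linear-interpolation correction-factor method: correction
# factors measured at calibration points 140 degF apart are interpolated
# (never extrapolated) to the temperature of interest. All values below
# are HYPOTHETICAL illustration data, not real calibration results.

import bisect

CAL_POINTS_F = [300.0, 440.0, 580.0, 720.0, 860.0, 1000.0]  # 140 degF apart
CORRECTIONS_F = [0.4, 0.6, 0.5, 0.3, 0.2, 0.4]              # hypothetical

def correction_at(temp_f):
    """Linearly interpolated correction factor at temp_f; raises outside
    the calibrated range, since extrapolation is not permitted."""
    if not CAL_POINTS_F[0] <= temp_f <= CAL_POINTS_F[-1]:
        raise ValueError("outside calibrated range; extrapolation not allowed")
    i = bisect.bisect_right(CAL_POINTS_F, temp_f)
    if i == len(CAL_POINTS_F):          # exactly at the top calibration point
        return CORRECTIONS_F[-1]
    lo, hi = CAL_POINTS_F[i - 1], CAL_POINTS_F[i]
    frac = (temp_f - lo) / (hi - lo)
    return CORRECTIONS_F[i - 1] + frac * (CORRECTIONS_F[i] - CORRECTIONS_F[i - 1])

print(correction_at(650.0))  # midway between the 580 and 720 degF corrections
```

Compared with a single nominal offset, the interpolated value tracks how the correction genuinely varies across the working range.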

Final Thoughts (31:09)

Heather Falcone: We have talked about a variety of topics: thru-processing monitoring, thermal protection at the data logger, the benefits of making sure that you apply cold junction correction, and the specific accuracy considerations that we have to make sure we bundle in all together. What is the big takeaway you want to leave us with?

Steve Offley: Be careful not to assume that the conditions of operation in the calibration laboratory are going to be reproduced on the shop floor, because the conditions are very different. This comes back to the argument for the importance of cold junction compensation. If you are using a data logger, check the manual for what cold junction compensation should be applied and whether there are any steps you need to take to ensure it is applied correctly on the shop floor. If you do not, there is a high risk that what you think is accurate data is not, particularly if your data logger temperature is varying with time, either in process or on the shop floor with changing environmental conditions.

Heather Falcone: In the end, we want to make sure that we’re making good parts, and this sounds like a great system for making sure the data you’re getting is as accurate as possible.

Steve Offley: Quality data at the end of the day is essential for you to understand what your process is doing. It’s no good relying on data you cannot trust. Take that extra time to investigate and put steps in place to make sure that you are measuring what you think you are measuring at the hot junction and that the cold junction is being considered as part of that measurement process.


About the Guest

Dr. Steve Offley
Product Marketing Manager
PhoenixTM Ltd.

Dr. Steve Offley, aka “Dr. O,” is a product marketing manager with PhoenixTM Ltd. with 30 years of experience in temperature monitoring in the industrial thermal processing market.

For more information: Contact Steve Offley at steve.offley@phoenixtm.com.


Achieving Accurate Measurements in Real Heat Treat Production

Following tight standards might not mean your heat treat process is truly accurate…if your instrumentation does not see the full picture. In this Technical Tuesday installment, Dr. Steve Offley, product marketing manager with PhoenixTM Ltd., discusses how combining accurate data loggers, high-quality thermocouples, and linear interpolation of correction factors ensures consistent compliance with AMS2750H and delivers trustworthy survey results. The article further explores how thermocouple behavior and real-world processing conditions necessitate careful attention to each thermocouple junction.

This informative piece was first released in Heat Treat Today’s March 2026 Annual Aerospace Heat Treating print edition.


Introduction

In the world of heat treatment, temperature measurement accuracy is critical, whether performing process monitoring or temperature uniformity surveys (TUS) as part of AMS2750. Measurement accuracy is defined as the degree to which the result of a measurement, calculation, or specification conforms to the correct value or standard. Without confidence in the accuracy of your measurement, you are working in the dark and could be deceiving yourself and possibly others.

The requirement of ±0.1°F readability, first referenced in revision F of the AMS2750 pyrometry standard, has garnered much debate and critical discussion over the years. Although helpful in resolving confusion about what value to record and in removing discrepancies between temperatures recorded in metric versus Imperial units, readability alone does not guarantee instrumentation accuracy in a real-world heat treat operating environment.

The following article introduces important considerations to be made when performing the temperature monitoring operation with reference to measurement accuracy in a real-world test environment — on a shop floor, not just a stable controlled calibration laboratory.

Monitoring & TUS Methodology

Traditionally, a TUS is performed using a field test instrument, which in most situations is a temperature data logger. For static batch ovens, the data logger is positioned externally to the furnace, and long thermocouples are trailed into the furnace heating chamber and connected directly to the TUS frame.

Figure 1. Typical TUS setup for a static batch furnace. Twenty-channel external data logger connected directly to a nine-point TUS frame used to measure the temperature uniformity over the working volume of the furnace. | Image Credit: PhoenixTM
Figure 2. Thru-process TUS monitoring system. The data logger shown is located inside the thermal barrier, which travels with the TUS frame through the furnace. | Image Credit: PhoenixTM

For continuous or semi-continuous modular processes, this trailing thermocouple method is difficult if not impossible. For these furnaces, the preferred method of temperature monitoring is “thru-process”: the data logger is connected to a TUS frame, traveling with the load through the furnace. To protect the data logger from the hostile process conditions (including heat, pressure, steam, water, salt, or oil), the data logger is encased in a thermal barrier designed for the process at hand (Figure 2).

Calibration Accuracy Requirements

In either static or continuous processing conditions, the accuracy of the temperature monitoring system is dependent upon the combined accuracy of the field test instrument (data logger) and the temperature sensor thermocouples.

Both aspects of the monitoring system must be strictly controlled for AMS2750H compliance. The field test instrument data logger needs to have a calibration accuracy of ±1.0°F or ±0.1% of temperature reading, whichever is greater (Table 7 in AMS2750H), and a readability of ±0.1°F. The most common base metal thermocouples (K and N) used will themselves need to have a calibration accuracy of ±2.0°F or ±0.4% (percent of reading or correction factor °F, whichever is greater) as defined in Table 1 of the AMS2750H specification.
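The "whichever is greater" rules quoted above reduce to a simple maximum of a fixed term and a percentage-of-reading term. A sketch, using the tolerances stated in the text:

```python
# "Whichever is greater" tolerance rules as quoted from AMS2750H above:
# field test instrument +/-1.0 degF or +/-0.1% of reading; type K/N
# thermocouples +/-2.0 degF or +/-0.4% of reading. Sketch only; consult
# the specification tables for the authoritative values.

def logger_tolerance_f(reading_f):
    """Allowed field test instrument calibration error at a reading."""
    return max(1.0, 0.001 * abs(reading_f))

def tc_tolerance_f(reading_f):
    """Allowed base metal (K/N) thermocouple calibration error."""
    return max(2.0, 0.004 * abs(reading_f))

# Below 1000 degF the fixed 1.0 degF term governs the logger tolerance;
# above 1000 degF the 0.1%-of-reading term takes over:
print(logger_tolerance_f(800.0))             # 1.0
print(round(logger_tolerance_f(1470.0), 2))  # 1.47
print(round(tc_tolerance_f(1470.0), 2))      # 5.88
```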

Field Measurement Accuracy

For process monitoring, thermocouples are generally the preferred temperature sensor when considering accuracy, robust operation, cost, and availability. It is important to fully understand the working limitations of the sensor technology, and the measurement accuracy attainable on the heat treat shop floor, so they can be compensated for.

The theory of the thermocouple traces back to the German physicist Thomas Seebeck in 1821. The “Seebeck effect” describes the generation of an electrical voltage when a temperature difference exists between two dissimilar electrical conductors (metals). The resulting millivoltage (mV) is defined by the actual temperature experienced at the measurement junction. For any given thermocouple, the measured mV can be converted to a temperature using standardized Seebeck voltage curves, commonly documented in thermocouple reference tables.

Figure 3. Basic thermocouple measurement circuit showing critical hot and cold junctions. Image Credit: PhoenixTM

Using the Seebeck principle, the thermocouple consists of two wires of dissimilar metals joined at the measurement point, known as the hot junction. The output voltage from the sensor is proportional to the temperature difference between the hot junction and the point of voltage measurement, known as the cold junction. It is important to recognize that a thermocouple measures temperature difference, not an absolute temperature. The basic principle of how a thermocouple measurement circuit operates is shown in Figure 3.
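Converting a measured EMF back to temperature amounts to inverting the reference table, typically by piecewise-linear lookup between tabulated points. A sketch, using approximate type K values (a real instrument uses the full reference tables at much finer resolution):

```python
# Sketch of converting a measured EMF to temperature with a piecewise-
# linear lookup: the software analogue of "cross-referencing a reference
# table." The type K points below are approximate round-number values
# for illustration; real instruments use the full reference tables.

TABLE_C  = [0, 100, 200, 300, 400, 500, 600, 700, 800]          # degC
TABLE_MV = [0.000, 4.096, 8.138, 12.209, 16.397,
            20.644, 24.905, 29.129, 33.275]                     # mV

def emf_to_temp_c(mv):
    """Invert the table: find the bracketing points and interpolate."""
    for i in range(1, len(TABLE_MV)):
        if mv <= TABLE_MV[i]:
            frac = (mv - TABLE_MV[i - 1]) / (TABLE_MV[i] - TABLE_MV[i - 1])
            return TABLE_C[i - 1] + frac * (TABLE_C[i] - TABLE_C[i - 1])
    raise ValueError("EMF above table range")

print(round(emf_to_temp_c(33.275)))  # 800
```

Note that the EMF fed into such a lookup must already be cold junction compensated, which is exactly why the cold junction measurement discussed next is so critical.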

Critical Cold Junction Measurement

A common misconception is that thermocouple accuracy only needs to be accounted for at the hot junction. As previously mentioned, the thermocouple measurement is reliant on the temperature reading at the hot junction offset against the temperature of the cold junction. From an electronics level, the cold junction is where the thermocouple wires connect to the copper/copper connection on the electronic circuit. The cold junction therefore may be inside the data logger or on the outside of the data logger, if universal thermocouple connectors are used (Cu sockets).

Therefore, to get a consistent accurate reading from the hot junction, it is important to consistently and accurately monitor the cold junction temperature so the measurement can be corrected using a method known as “cold junction compensation.” It is critical that the cold junction temperature sensor is located correctly to ensure that the true cold junction temperature is measured and applied.

Essential Accuracy in Real World

While the accuracy of many data loggers may appear acceptable on paper, this may not reflect the real-world situation. The data logger temperature may not be stable, which can compromise temperature accuracy when proper cold junction compensation is not implemented. Calibration accuracy in a stable, temperature-controlled laboratory, or while performing an in-situ calibration, is one thing, but is the field test instrument able to work accurately on the production floor with significant swings in temperature over the survey period? Do you know what temperature changes the data logger may be experiencing on your process floor (e.g., climatic variation during the day, furnace heat-up, loading and unloading actions)?

Remember, a change of only a few degrees in the cold junction temperature may compromise the measurement accuracy enough to fail the TUS level being tested, if no compensation is undertaken or if the compensation temperature used does not accurately reflect the live cold junction temperature.

Cold Junction Compensation Logger Data

Data loggers designed with accurate cold junction compensation technology, like those created by PhoenixTM, maintain measurement accuracy in ever-changing industrial environments. This design allows the data logger accuracy to be quoted at ±0.5°F (types K and N) over the full operating temperature range of the PhoenixTM data logger family. For standard data loggers used in conventional thermal barriers (phase change heat sink), the accuracy is maintained over the operating range of 32°F to 176°F. For high temperature data loggers used in phased evaporation thermal barriers (water tank protection), the accuracy is provided over the operating range of 32°F to 230°F. As designed, the data logger will operate at 212°F (boiling water), so cold junction compensation is critical, with the data logger ambient temperature changing from 70°F to 212°F during normal operation.

Take, for example, an external data logger with cold junction compensation technology and an operating temperature range of 32°F to 131°F (see PTM4220 in Figure 1). On a production floor, users can operate safely, relying on the cold junction compensation to address temperature fluctuations in the processing environment.

Figure 4. Effect of changing the physical data logger temperature on the thermocouple measurement, with and without cold junction compensation, while measuring a stable process temperature of 1470°F. Image Credit: PhoenixTM

Additionally, the thermocouple socket in the data logger case is connected directly to the measurement board of the data logger using thermocouple wire of the designated type (e.g., type K). A thermistor temperature sensor accurately monitors the connector temperature (±0.18°F) providing an accurate record of the cold junction. The connector is located inside the data logger cavity, protected from rapid environmental temperature changes, and is compact and isothermal. As such, the thermistor temperature accurately reflects the cold junction of each unique thermocouple connection. This temperature provides an accurate cold junction temperature compensation to maintain measurement accuracy with any internal data logger temperature variation (Figure 4).

Thermocouple Accuracy

Figure 5. Nonexpendable mineral insulated thermocouple, type K (0.06 inch) or N (0.08 inch). UHT alloy sheathed, insulated hot junction, terminating in a miniature plug. Maximum temperature 2192°F. ANSI MC96.1 Special Limits (±2.0°F or ±0.4%, whichever is greater). Image Credit: PhoenixTM

To maximize measurement accuracy, it is important that thermocouples are selected with the highest accuracy and manufactured to resist damage from thermal cycling at elevated temperatures.

For thru-process monitoring, short thermocouple lengths are required to connect the data logger within the thermal barrier and the TUS frame. As such, nonexpendable (see AMS2750H 2.2.36, Table 3) thermocouples can be employed with ease. Robust mineral insulated thermocouples (MIMS) (Figure 5), typically type K or N, can be permanently fixed to the TUS frame. This both reduces setup time and guarantees that thermocouple positions are consistent for periodic TUS work as defined (see AMS2750H 3.1.7, Table 5).

Barring physical damage, the mineral insulated thermocouples can be used without restriction for up to three months (type K), and for longer periods (type N) provided recalibration at the three-month check is successful.

Data Logger and Thermocouple Correction Factors

The PhoenixTM system allows both data logger and thermocouple correction factors to be applied automatically to the raw survey temperature data, maximizing measurement accuracy. The data logger correction factors can be read directly from the onboard digital data logger calibration file. Thermocouples are available with comprehensive calibration certificates providing correction factors at multiple set temperatures across the required measurement range.

Figure 6. Schematic of the linear interpolation method (AMS2750 accepted) of calculating thermocouple correction factors over the entire calibration range of the thermocouple. Every TUS measurement is therefore corrected accurately against matching calibration offset data. Image Credit: PhoenixTM

For both data logger and thermocouples, correction factors are interpolated across the complete calibration range using the linear method as permitted by AMS2750H 3.1.4.8 (Figure 6). This approach means that the accuracy of the entire TUS dataset is guaranteed compared to applying a single correction factor calculated at a single nominated temperature, which may not truly reflect the complete temperature range.
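A minimal sketch of this interpolation, with invented calibration offsets standing in for real certificate data (the function name and values are illustrative, not the PhoenixTM implementation):

```python
# Sketch of the linear-interpolation approach to correction factors
# (the method AMS2750H 3.1.4.8 permits). The calibration points and
# offsets below are made-up illustrative values, not certificate data.

def interpolate_correction(temp_f: float, cal_points) -> float:
    """Linearly interpolate a correction factor (degF) between the two
    calibration set points that bracket the measured temperature.
    Outside the calibrated range, the nearest end point is used."""
    pts = sorted(cal_points)
    if temp_f <= pts[0][0]:
        return pts[0][1]
    if temp_f >= pts[-1][0]:
        return pts[-1][1]
    for (t0, c0), (t1, c1) in zip(pts, pts[1:]):
        if t0 <= temp_f <= t1:
            return c0 + (c1 - c0) * (temp_f - t0) / (t1 - t0)

# Hypothetical certificate: corrections at three set temperatures.
cal = [(500.0, 1.2), (1000.0, 0.4), (1500.0, -0.8)]
reading = 1250.0
corrected = reading + interpolate_correction(reading, cal)
print(corrected)  # 1249.8: the correction halfway between 0.4 and -0.8
```

Applying a single fixed correction taken at, say, 1000°F would instead add +0.4°F at every temperature, which is exactly the weakness the interpolated approach avoids.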

Summary

To guarantee the accuracy of both temperature profile and TUS data, it is important that the field test instrument data logger not only provides the desired calibration accuracy but is able to work accurately in a production environment. For thermocouple systems, accurate cold junction compensation offers critical peace of mind to correct for changes in the operating temperature characteristics of the data logger during use.

Data logger and thermocouple correction factors should be implemented to maximize measurement accuracy. As discussed, the use of the linear interpolation method ensures that correction factors calculated over the entire measurement range are applied, providing full data accuracy.

About The Author:

Dr. Steve Offley
Product Marketing Manager
PhoenixTM Ltd.

Dr. Steve Offley, “aka Dr. O,” is a product marketing manager with PhoenixTM Ltd. with 30 years of experience in temperature monitoring in the industrial thermal processing market.

For more information: Contact Steve Offley at steve.offley@phoenixtm.com.


Overcoming Quality Challenges for Automotive T6 Heat Treating

Three elements in the T6 aluminum heat treatment process — high temperature solution heat treatment, drastic temperature change in the water quench, and a long age hardening process — challenge accurate temperature monitoring. Thru-process technology gives in-house heat treaters the power to control these variables to overcome the unknowns. In the following Technical Tuesday article, Dr. Steve Offley, “Dr. O”, product marketing manager at PhoenixTM, examines the path forward through the challenges of aluminum heat treating.


Aluminum Processing Growth

In today’s automotive and general manufacturing markets, aluminum is increasingly becoming the material of choice, being lighter, safer, and more sustainable. Manufacturers looking to replace existing materials with aluminum need new methodologies to prove that thermal processing of aluminum parts and products is done to specification, efficiently and economically.

To add strength to pure aluminum, alloys are developed by the addition of elements dissolved into solid solution, employing the T6 heat treatment process (Figure 1). The alloy atoms create obstacles to dislocation movement through the aluminum matrix, giving greater structural integrity and strength.

Figure 1. Critical temperature phase transitions of the T6 aluminum heat treatment process
Source: PhoenixTM

Process temperature control and uniformity are critical to the success of T6 heat treatment to maximize the solubility of hardening solutes such as copper, magnesium, silicon, and zinc without exceeding the eutectic melting temperature. With a margin of typically only 9–15°F, knowing the accurate temperature of the product is essential. Control of the later quench process (Figure 1, Phase 3) is also critical, not only to facilitate the alloy element precipitation phase but also to prevent unwanted part distortion/warping and the risk of quench cracking.

T6 Process Monitoring Challenges

The T6 solution reheat process comes with many technical challenges where temperature profiling is concerned. The need to monitor all three of the equally important phases — solution treatment, quench, and the age hardening process — makes the trailing thermocouple methodology impossible.

Figure 2. Thru-process temperature monitoring of the three T6 heat treatment phases
Source: PhoenixTM

Even when considering applying thru-process temperature profiling technology, sending the data logger through the process, protected in a thermal barrier (Figure 2), the T6 heat treat process comes with significant challenges. A system will not only need to protect against heat (up to 1020°F) over a long process duration but also withstand the rigors of being plunged into a water quench. Rapid temperature transitions create elevated risk of distortion and warping which need to be addressed to give a reliable and robust monitoring solution.

Certain monitoring systems can provide protection to the data logger at 1022°F for up to 20 hours (Figure 3).

Figure 3. Thru-process temperature profiling system installed in the product cage monitoring the T6 heat treatment (solution treatment, quench, and age hardening) of aluminum engine blocks

Thermal Protection Technology

To meet the challenges of the T6 heat treat process, the conventional thermal barrier design employing microporous insulation is replaced with a water tank design, with thermal protection using an evaporative phase change temperature control principle. Evaporative technology uses boiling water to keep the high temperature data logger (maximum operating temperature of 230°F) at a stable operating temperature of 212°F as the water changes phase from liquid to steam. The advantage of evaporative technology is that a physically smaller barrier is often possible. It is estimated that, at a like-for-like size (volume) and weight, an evaporative barrier will provide in the region of twice the thermal protection of a standard thermal barrier with microporous insulation and heat sink. The level of thermal protection can be adjusted by changing the capacity of the water tank and the volume of water. Increasing the volume of water increases the duration for which the T6 thermal barrier will maintain the data logger temperature of 212°F before the water is depleted by evaporation losses.
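A back-of-envelope sketch of why protection time scales with water volume, using the latent heat of vaporization of water; the average heat ingress figure is an assumed round number chosen for illustration, not a PhoenixTM specification:

```python
# Rough model of evaporative (phase change) protection: heat entering
# the barrier boils water held at 212 F, and protection lasts until the
# water charge is gone. The 750 W ingress value is an assumption.

LATENT_HEAT_KJ_PER_KG = 2257.0   # latent heat of vaporization of water
KG_PER_US_GALLON = 3.785         # mass of a US gallon of water, approx.

def protection_hours(gallons: float, ingress_watts: float) -> float:
    """Hours until the water is boiled away at a given average rate of
    heat leaking into the barrier."""
    mass_kg = gallons * KG_PER_US_GALLON
    energy_j = mass_kg * LATENT_HEAT_KJ_PER_KG * 1000.0
    return energy_j / ingress_watts / 3600.0

# With an assumed ~750 W average ingress, 3.5 gal gives about 11 h,
# matching the smaller TS06 figure quoted below. A larger barrier has
# more surface area (higher ingress), which is why doubling the water
# roughly, but not exactly, doubles the protection time.
print(round(protection_hours(3.5, 750.0), 1))  # 11.1
```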

The TS06 thermal barrier design (Figure 4) incorporates a further level of protection with an outer layer of insulation blanket contained within a structural outer metal cage. The key role of this material is to act as an insulative layer around the water tank to reduce the risk of structural distortion from rapid temperature changes both positive and negative in the T6 process.

Figure 4. TS06 thermal barrier design showing water tank, housing the data logger at its core, installed within structural frame containing the insulation blanket surface layer; water tank shown with traditional compression fitting face plate seal
Source: PhoenixTM

Obviously, the evaporative loss rate of water is governed by the water tank geometry. A cube-shaped tank will provide the best performance, but this may need to be adapted to meet process height restrictions. A TS06 thermal barrier with dimensions 8.5 x 18.6 x 25.2 inches (H x W x L), offering a water capacity of 3.5 US gallons, provides 11 hours of protection at 1022°F. A larger TS06 with approximately twice the capacity (12.2 x 18.6 x 25.2 inches H x W x L, 7.7 US gallons) gives approximately twice the protection (20 hours at 1022°F).

Innovative IP67 Sealing Design

Passing through the water quench, the data logger needs to be protected from water damage. This is achieved in the system design by combining a fully IP67 sealed data logger case with a water tank front face plate through which the thermocouples exit. Traditionally in heat treatment applications, mineral insulated thermocouples are sealed using robust metal compression fittings. Although reliable, the compression seals are difficult to use, requiring long set-up times. The whole uncoiled straight cable length must be passed through the tight fitting, which, for the 10 x 13 ft thermocouples, takes some patience. Thermocouples can be installed and used for multiple runs, if undamaged. Unfortunately, as the ferrule in the compression fitting bites into the MI cable, removal of the cable requires the thermocouple to be cut, preventing reuse.

To overcome the frustrations of compression fitting, an alternative innovative thermocouple sealing mechanism has been designed for use on the T6 thermal barrier (Figure 5).

Figure 5. TS06 thermal barrier IP67 bi-directional rubber gasket seal; installation of mineral-insulated (MI) thermocouples and RF antenna aerial

Thermocouples can be slotted easily and quickly, tool free, into a precision cut rubber gasket without any need to uncoil the thermocouple completely. The rubber gasket has a unique bi-directional seal, allowing not only sealing of each thermocouple but also sealing of the clamp face plate to the data logger tray, which is then secured to the water tank with a further silicone gasket seal. The new seal design allows thermocouples to be uninstalled and reused, reducing operating costs significantly.

Accurate Process Data Considerations

The T6 applications come with a series of monitoring challenges which need to be considered carefully to guarantee the quality of the data obtained. Although the complete process time of the three phases can reach up to 10 hours, it is necessary to use a rapid sample interval (seconds) to provide sufficient resolution. The data logger is designed to facilitate this with a minimum sample interval of 0.2 seconds over 20 channels and a memory size of 3.8 million data points, allowing complete monitoring of the entire process. A sample interval of 0.2 seconds provides sufficient data points on the rapid quench cooling curve. The high resolution allows full analysis and optimization of the quench rate to achieve the required metallurgical transitions yet avoid distortion or quench cracking risks.
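These figures can be sanity-checked with simple arithmetic, using only the numbers quoted above:

```python
# Check that 3.8 million data points shared across 20 channels at the
# fastest 0.2 s sample interval covers a full ~10 h T6 run.

MEMORY_POINTS = 3_800_000   # total logger memory, data points
CHANNELS = 20               # thermocouple channels in use
INTERVAL_S = 0.2            # fastest sample interval, seconds

samples_per_channel = MEMORY_POINTS // CHANNELS           # 190,000
duration_hours = samples_per_channel * INTERVAL_S / 3600.0
print(round(duration_hours, 1))  # 10.6 -- just over the ~10 h process
```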

Employing the phased evaporation thermal barrier design, the high temperature data logger, with a maximum operating temperature of 230°F, will operate safely at 212°F. During the profile run, the data logger internal temperature will increase from ambient temperature to 212°F. To allow the thermocouple to record temperature accurately, the data logger offers a sophisticated cold junction compensation method, correcting the thermocouple readout (hot junction) for the anticipated internal data logger temperature changes.

Data logger and thermocouple calibration data covering the complete measurement range (not just a single designated temperature) can be used to create detailed correction factor files. Correction factors are calculated by interpolation between two known calibration points using the linear method as approved by CQI-9 and AMS2750G. This method ensures that all profile data is corrected to the highest possible accuracy. 

Addressing Real-Time, Thru-Process Temperature Monitoring Challenges

For a process time as long as the T6, real-time monitoring capability is a significant benefit. The unique two-way RF telemetry system used on the PhoenixTM system helps address the technical challenges of the three separate stages of the process. The RF signal can be transmitted from the data logger through a series of routers linked back to the main coordinator connected to the monitoring PC. The wirelessly connected routers are located at convenient points in the process (solution treatment furnace, quench tank, aging furnace) to capture all live data without any inconvenience of routing communication cables.

A major challenge in the T6 process, from an RF telemetry perspective, is the quench step. An RF signal cannot escape from the water in the quench tank. To overcome this limitation, a “catch up” feature is implemented: once the system exits the quench and the RF signal is re-established, any previously missing data is retransmitted, guaranteeing full process coverage.
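A conceptual sketch of such a catch-up buffer (the class, names, and acknowledgment scheme are all illustrative, not the PhoenixTM protocol):

```python
# Illustrative catch-up buffer: readings are logged with sequence
# numbers, and anything the receiver has not acknowledged is resent
# once the RF link returns (e.g., after exiting the quench tank).

class CatchUpLogger:
    def __init__(self):
        self.buffer = {}    # seq -> reading, kept until acknowledged
        self.next_seq = 0

    def log(self, reading: float) -> None:
        self.buffer[self.next_seq] = reading
        self.next_seq += 1

    def unsent(self) -> dict:
        """Everything still awaiting acknowledgment, e.g. data logged
        while the system was submerged and out of RF contact."""
        return dict(self.buffer)

    def acknowledge(self, seqs) -> None:
        for s in seqs:
            self.buffer.pop(s, None)

logger = CatchUpLogger()
for r in [1005.2, 1006.1, 212.0]:  # last reading taken inside the quench
    logger.log(r)
logger.acknowledge([0, 1])          # receiver got the first two live
print(logger.unsent())              # {2: 212.0} is resent on re-link
```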

Process Quality Assurance and Validation

In the automotive industry, many operations will be working to the CQI-9 special process heat treat system assessment accreditation. As defined by the pyrometry standard, operators need to validate the accuracy and uniformity of the furnace work zone by employing a temperature uniformity survey (TUS).

The thru-process monitoring principle allows for an efficient method by which the TUS can be performed, employing a TUS frame to position a defined number of thermocouples over the specific working zone of the furnace (product basket). As defined in the standard, with particular reference to application assessment Process Table C (aluminum heat treating), the uniformity for both the solution heat treatment and aging furnaces needs to be proven to be within ±10°F of the survey setpoint temperature during the soak time.
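A minimal sketch of that soak-time check, with invented readings (the function and sample data are illustrative only):

```python
# Simple uniformity check in the spirit of CQI-9 Table C: every
# surveyed thermocouple must stay within +/-10 F of the setpoint for
# the whole soak. The readings below are invented sample data.

def tus_passes(setpoint_f: float, soak_readings: dict,
               tolerance_f: float = 10.0) -> bool:
    return all(abs(r - setpoint_f) <= tolerance_f
               for readings in soak_readings.values()
               for r in readings)

soak = {
    "TC1": [996.0, 998.5, 1001.0],
    "TC2": [1004.0, 1006.5, 1008.0],
}
print(tus_passes(1000.0, soak))  # True: the worst deviation is 8 F
```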

Complementing the TUS system, the Thermal View Survey software provides a means by which the full survey can be set up automatically allowing routine full analysis and reporting to the CQI-9 specification as shown in Figure 6.

Figure 6. View of TUS for T6 aluminum processing in Phase 1 Solution Re-heat
Source: PhoenixTM

Interestingly, a significant further benefit of the thru-process principle is that by collecting process data for the whole process, many of the additional requirements of the process Table C can be achieved with reference to the quench. From the profile trace, key criteria such as quench media temperature, quench delay time, and quench cooling curve can be measured and reported with full traceability during the production run.

Summary

To fully understand, control, and optimize the T6 heat treat process, it is essential that the entire process is monitored. Purpose-designed thru-process monitoring solutions allow not only product temperature profiling of the solution heat treatment, water quench, and age hardening phases, but also comprehensive temperature uniformity surveying to comply with CQI-9.

About the Author:

Dr Steve Offley (“Dr O”), Product Marketing Manager, PhoenixTM

Dr. Steve Offley, “Dr. O,” has been the product marketing manager at PhoenixTM for the last five years after a career of over 25 years in temperature monitoring focusing on the heat treatment, paint, and general manufacturing industries. A key aspect of his role is the product management of the innovative PhoenixTM range of thru-process temperature and optical profiling and TUS monitoring system solutions.

For more information: Contact Steve at Steve.Offley@phoenixtm.com.





Applying “Thru-Process” Temperature Surveying To Meet the TUS Challenges of CQI-9

Dr. Steve Offley, a.k.a. “Dr. O”

Sponsored content

In the modern automotive manufacturing industry, CQI-9 HTSA (AIAG) has become a key part of driving process and product quality in heat treatment applications. The standard has a broad scope and covers many different aspects of common heat treatment processes (see Process Tables A-H in the standard) and the monitoring requirements used. A critical part of the standard is the requirement to perform temperature uniformity surveys (TUS) in order to validate the temperature uniformity of the qualified work zones and operating temperature ranges of furnaces or ovens used. In this Heat Treat Product Spotlight, Dr. Steve Offley, a.k.a. “Dr. O”, Product Marketing Manager with PhoenixTM, discusses the challenges of performing a TUS on continuous furnace types and one possible solution his company offers.


CQI-9 Heat Treat System Assessment

A critical part of the CQI-9 HTSA (AIAG) standard is the requirement to perform temperature uniformity surveys (TUSs). The TUS is performed to validate the temperature uniformity characteristics of the qualified work zones and operating temperature ranges of furnaces or ovens used. (See Figure 1.)

Fig 1: Schematic showing the TUS principle: thermocouple measurement, by the field test instrument, of the furnace’s actual operational temperature against a setpoint to check that it is within tolerance. Setpoints and tolerances are defined in CQI-9 Process Tables A-H to match each heat treat process.

The “Thru-Process” TUS Principle

Traditionally, TUSs are performed by using a field test instrument (chart recorder or static data logger) external to the furnace with thermocouples trailing into the furnace heating chamber. This technique has many limitations, especially when the product transfer is continuous such as in a pusher or conveyor-type furnace. The trailing thermocouple method is often labor-intensive, potentially unsafe, and can create compromises to the TUS data being collected (e.g., number of measurement points possible, thermocouple damage, and physical snagging of the thermocouple in the furnace).

Fig 2: PhoenixTM thermal barrier being loaded into a batch furnace with a survey frame as part of the TUS process.

The “Thru-Process” TUS principle overcomes the problems of trailing thermocouples as the multi-channel data logger (field test instrument) travels into and through the heat treat process protected by a thermal barrier (Figure 2). The short thermocouples are fixed to the TUS frame. Temperature data is then transmitted live to a monitoring PC running TUS analysis software, via a 2-way RF telemetry link.

Data Logger Options

To comply with CQI-9, field test equipment needs to be calibrated at least every 12 months against a primary or secondary standard. The data logger accuracy needs to be a minimum of ±0.6°C (±1.0°F) or ±0.1% (Table 3.2.1).

Fig 3: PhoenixTM PTM1220 20-channel IP67 data logger, optionally calibrated to UKAS ISO/IEC 17025, with an onboard calibration data file allowing data logger correction factors to be applied automatically to TUS data.

The data logger shown in Figure 3 has been designed specifically to meet the CQI-9 TUS requirements, offering ±0.5°F (±0.3°C) accuracy (types K and N). Models ranging from 6 to 20 channels can be provided with a variety of noble and base metal thermocouple options (types K, N, R, S, B) to suit measurement temperature and accuracy demands (AMS2750E and CQI-9).

Mixed thermocouple inputs can be provided to support the process specific requirements and also allow the use of the data logger to perform system accuracy testing (SAT) to complement the TUS.

Innovative Thermal Barrier Design

Fig 4: “Octagonal” thermal barrier fitted to product/survey tray.

CQI-9 covers a wide range of thermal heat treatment processes and as such the thermal protection for the data logger will vary significantly. A comprehensive range of thermal barrier solutions can be provided to meet specific process temperature requirements and space limitations. Figure 4 shows a unique octagonal thermal barrier designed to fit within the boundaries of the product tray/survey frame used to perform a TUS using the “plane method” (See “Thermocouple Measurement Positions (TUS)” below in this article.). The design ensures maximum thermal performance within the confines of a restricted product tray/basket.

Live Radio TUS Communication

Fig 5: Schematic of LwMesh 2-way RF Telemetry communication link from data logger TUS measurement back to an external computer.

The data logger is available with a unique 2-way wireless RF system option allowing live monitoring of temperatures as the system travels through the furnace. Analysis of process data at each TUS level can be done live allowing full efficient control of the TUS process. Furthermore, if necessary, by using the RF system, it is possible to communicate with the logger installed in the barrier to reset/download at any point pre-, during, and post-TUS. In many processes, there will be locations where it is physically impossible to transmit a strong RF signal. With conventional systems, this results in process data gaps. For the system shown in Figure 2, this is prevented using a unique fully automatic “catch up” feature.

Any data that is missed will be sent when the RF signal is re-established, guaranteeing 100% data transfer.

Thermocouple Options (TUS)

In accordance with the CQI-9 standard (Tables 3.1.3 / 3.1.5), thermocouples supplied with the data logger, whether expendable or nonexpendable, meet the specification requirements of accuracy ±2.0°F (±1.1°C) or 0.4%. Calibration certificates can be offered, allowing thermocouple correction factor files to be generated and applied automatically to the TUS data within the PhoenixTM Thermal View Survey Software. Care must be taken by the operator to ensure that thermocouple usage complies with the recommended TUS life expectancies and repeat calibration frequencies:

  • Before first use, thermocouples must be calibrated with a working temperature range interval not greater than 250°F (150°C).
  • Noble metal (B, R, or S) thermocouples must be replaced or recalibrated every 2 years.
  • Nonexpendable base metal (K, N, J, E) thermocouples should be replaced after 180 uses below 1796°F (980°C) or 90 uses above 1796°F (980°C).
  • Expendable base metal (K, N, J, E) thermocouples should be replaced after 15 uses below 1796°F (980°C) or 1 use above 1796°F (980°C).
  • Note that base metal thermocouples should not be recalibrated.
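The use-life limits above can be encoded as a simple lookup; the function structure and names are illustrative, while the limits themselves are the ones quoted in the text:

```python
# Illustrative lookup of the CQI-9 thermocouple use-life limits
# (Tables 3.1.3 / 3.1.5) as described in the text above.

def max_uses(construction: str, max_temp_f: float):
    """Return the permitted number of uses before replacement, or None
    where the limit is time-based instead (noble metal types B, R, S:
    replace or recalibrate every 2 years)."""
    if construction == "noble":
        return None
    if construction == "nonexpendable":
        return 180 if max_temp_f < 1796 else 90
    if construction == "expendable":
        return 15 if max_temp_f < 1796 else 1
    raise ValueError(construction)

print(max_uses("nonexpendable", 1700))  # 180
print(max_uses("expendable", 1850))     # 1
```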

Thermocouple Measurement Positions (TUS)

To perform the TUS, a TUS frame needs to be constructed to locate the thermocouples over the standard work zone to match the form of the furnace. The TUS may be performed either in an empty furnace, in which case thermocouples should be securely fixed as shown in Figure 6 (a heat sink, i.e., a thermal mass fixed to the thermocouple tip, can be used to create a thermal load matching normal product heating characteristics), or with the thermocouples buried in the load/filled product basket. See Figure 6 for schematics of TUS frames for box and cylindrical batch furnaces with the CQI-9-quoted number of thermocouples required to match the void volume (Volumetric Method, Table 3.4.1).

Fig 6: TUS thermocouple test rigs. Required number of thermocouples: 1) work volume < 0.1 m³ (3 ft³): 5; 2) work volume 0.1 to 8.5 m³ (3 to 300 ft³): 9; 3) work volume > 8.5 m³: one thermocouple for every 3 m³ (105 ft³).
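The volumetric-method count in the caption can be sketched as a function; how the standard rounds the per-3-m³ figure for large furnaces, and the assumption that such a furnace never needs fewer than the 9 thermocouples used up to 8.5 m³, are mine:

```python
import math

# Sketch of the volumetric-method thermocouple count from the figure
# caption above (CQI-9 Table 3.4.1). The rounding, and the minimum-of-9
# floor for large furnaces, are assumptions for illustration.

def required_thermocouples(work_volume_m3: float) -> int:
    if work_volume_m3 < 0.1:
        return 5
    if work_volume_m3 <= 8.5:
        return 9
    # one thermocouple for every 3 m3, never fewer than the 9 above
    return max(9, math.ceil(work_volume_m3 / 3.0))

print(required_thermocouples(0.05))  # 5
print(required_thermocouples(4.0))   # 9
print(required_thermocouples(30.0))  # 10
```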

Fig. 7.1, 7.2. PhoenixTM system showing a 9-point TUS survey rig and the Thermal View Software TUS frame library file showing, as part of the TUS report, exactly where thermocouples are positioned.

 

For continuous conveyorized furnaces, it is recommended that an alternative thermocouple test rig, known as the “plane method,” is employed. Since the system travels through the furnace, it is only necessary to monitor the temperature uniformity over a 2-dimensional plane/slice of the furnace (Figure 8). The required number and location of thermocouples are shown in Table 1 (CQI-9 Table 3.4.2).


Table 1: Required thermocouples and locations for differing work zones (Plane Method)

(1) 2 thermocouples within 50 mm of the work zone corners, 1 thermocouple at the center. (2) 4 thermocouples within 50 mm of the work zone corners; the rest symmetrically distributed.

“Thru-Process” Temperature Uniformity Survey (TUS) Data Analysis and Reporting

Operating the PhoenixTM System with RF Telemetry, TUS data is transferred from the furnace directly back to the monitoring PC where, at each survey level, temperature stabilization and temperature overshoot can be monitored live, with thermocouple and logger correction factors applied. The Thermal View Survey software generates TUS reports which comply with the requirements of AMS2750E/CQI-9 standards.

As defined in CQI-9 (Section 3.4), for a furnace with an operating temperature range ≤ 305°F (170°C), one setpoint temperature (TUS level) within the operating temperature range is required. If the operating temperature range of the qualified work zone is greater than 305°F (170°C), then the minimum and maximum temperatures of the operating temperature range shall be tested.
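The setpoint rule can be sketched as follows (choosing the midpoint for the single-level case is an arbitrary illustrative choice; the standard allows any setpoint within the range):

```python
# Sketch of the CQI-9 Section 3.4 TUS-level rule described above: one
# survey temperature for a narrow operating range, both extremes for a
# wide one. Midpoint selection is illustrative only.

def tus_setpoints(range_min_f: float, range_max_f: float) -> list:
    if range_max_f - range_min_f <= 305.0:
        # any single setpoint within the range qualifies; midpoint used
        return [(range_min_f + range_max_f) / 2.0]
    return [range_min_f, range_max_f]

print(tus_setpoints(900.0, 1100.0))  # [1000.0] -- one level suffices
print(tus_setpoints(500.0, 1100.0))  # [500.0, 1100.0] -- both extremes
```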

The TUS levels can be automatically set up in the TUS analysis software. Figure 9 shows both the TUS level file and TUS levels applied against the TUS survey trace.

Fig. 9.1

Fig. 9.2

Fig 9.1, 9.2 PhoenixTM Thermal View Survey Software showing TUS Level set-up and application to TUS trace.

Within CQI-9, there is a very prescriptive list of what should be contained in the TUS report (Section 3.4.9).

To comply with all of these requirements, the software package provides comprehensive reporting, as shown below.

Fig 10.1, 10.2, 10.3. TUS report showing a TUS profile at three set survey temperatures (graphical and numerical data). The probe map shows exactly where each thermocouple is located, enabling easy trace identification. A detailed TUS report is generated, meeting full CQI-9 reporting requirements.

Overview

The PhoenixTM Thru-Process TUS System provides a versatile solution for performing product temperature profiling and furnace surveying in industrial heat treatment, meeting all TUS requirements of CQI-9 within the automotive manufacturing industry and providing the means to understand, control, optimize, and certify the heat treat process.


Temperature Monitoring and Surveying Solutions for Carburizing Auto Components: AMS2750E and CQI-9 Temperature Uniformity Surveys

Dr. Steve Offley (“Dr. O”), Product Marketing Manager PhoenixTM

This is the final installment in a 4-part series by Dr. Steve Offley (“Dr. O”) on the technical challenges of monitoring low-pressure carburizing (LPC) furnaces. The previous articles explained the LPC process and explored general monitoring needs and challenges (part 1), the use of data loggers in thru-process temperature monitoring (part 2), and the thermal design challenge (part 3). In this segment, Dr. O discusses AMS2750E and CQI-9 temperature uniformity surveys.


A significant challenge for many heat treaters is the need to provide products certified to either AMS2750 (aerospace) or CQI-9 (automotive). To achieve this accreditation, furnace temperature uniformity surveys (TUS) must be performed at regular intervals to prove that the furnace setpoint temperatures are both accurate and stable over the working volume of the furnace. Historically, the furnace survey has been performed with great difficulty by trailing thermocouples into the heat zone. Although possible in a batch process, for a semi-batch or continuous process this is a significant technical challenge with considerable compromises, as summarized below.

Trailing Thermocouple TUS Process Steps

Figure 1. Typical TUS thermocouple positions for a 9-point survey: furnace void corners and center

  • TUS is often carried out using long or “trailing” thermocouples that exit through the furnace door.
  • The furnace often needs to be cooled, then de-gassed, so the TUS frame can be set up in the furnace.
  • Thermocouples are then led out through the furnace door and connected to a data logger or chart recorder.
  • The furnace is then heated to the surveying temperatures.
  • The survey is then carried out, after which the furnace is cooled and the thermocouples are removed.

 

Disadvantages of Traditional TUS Process

  • Lots of furnace downtime may be involved (can be up to 24 hours).
  • Thermocouples have to exit the furnace door.
    • This may involve “wedging” the door up, or “grooving” out the hearth to get thermocouples out.
    • Or thermocouples may get caught in the furnace door.
  • A significant amount of the technician’s time is taken up preparing the report.

Applying the “thru-process” approach to TUS, the measurement system is transferred into the furnace with the survey frame, allowing the setup process to be done quickly, safely, and repeatably. (See Figure 2)

Figure 2. PhoenixTM system loaded into a furnace as part of a TUS frame. Thermocouples are pre-fitted to the 8 vertices of the cube frame and the center. Furnace ambient temperature is recorded either with a virgin exposed-junction thermocouple (typically MI) or with a heat sink damper fitted.

Operating the system with RF telemetry, TUS data is transferred directly from the furnace to the monitoring PC, where temperature stabilization and overshoot at each survey level can be monitored live with TC and logger correction factors applied. The Thermal View software is designed to ensure that the final TUS report complies fully with the AMS2750E/CQI-9 standards.
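The live stabilization and overshoot checks described above can be sketched in a few lines. This is a hypothetical illustration, not the PhoenixTM software's actual logic or API; the function names, correction factors, and tolerance values are all assumptions for demonstration.

```python
# Hypothetical sketch of a live TUS stabilization check: corrected
# readings must all sit inside the survey tolerance band around the
# set-point before the survey timer starts. All names and numbers
# are illustrative, not the PhoenixTM API.

def correct(reading_f, tc_offset_f, logger_offset_f):
    """Apply thermocouple and logger correction factors to a raw reading."""
    return reading_f + tc_offset_f + logger_offset_f

def is_stabilized(readings_f, setpoint_f, tolerance_f):
    """True when every corrected channel is within the tolerance band."""
    return all(abs(r - setpoint_f) <= tolerance_f for r in readings_f)

def max_overshoot(readings_f, setpoint_f):
    """Largest excursion above set-point (0 if none)."""
    return max(0.0, max(readings_f) - setpoint_f)

raw = [1698.2, 1701.5, 1700.8, 1699.1]            # four of nine survey TCs, °F
corrected = [correct(r, -0.4, 0.2) for r in raw]  # example correction factors
print(is_stabilized(corrected, 1700.0, 10.0))     # True: all within ±10°F
print(round(max_overshoot(corrected, 1700.0), 1)) # 1.3
```

In practice the telemetry stream would feed such a check continuously, so the surveyor sees the moment all channels settle into the band.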

Figure 3. PhoenixTM Thermal View Survey Software showing a TUS profile at three set survey temperatures. The probe map shows exactly where each probe is located, allowing easy trace identification. A detailed TUS report is generated efficiently.

Features incorporated into the Thermal View Software to provide full TUS capability include the following:

TUS Level Library – Set up TUS level templates for quick, efficient survey level specification (survey temperature °F, tolerance °F, stabilization, and survey times).

TUS Frames Library – Show clearly the exact TUS frame construction and probe locations using Frame Library templates (frame center and 8 vertices).

Logger Correction File – Create a logger correction file to compensate TUS readings automatically from the logger’s internal calibration file.

Thermocouple Correction File – Create the thermocouple correction file and use it to compensate TUS readings directly.

TUS Result Table & Graph View – For each TUS temperature level, see the full survey results instantly from the graph or TUS table.

Furnace Class Reporting – Report the specified furnace class at each temperature level.
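The furnace class report boils down to finding the tightest class whose uniformity tolerance covers the worst TUS deviation. The sketch below illustrates the idea; the tolerance values are the commonly cited AMS2750E furnace-class uniformity tolerances in °F, but they should be verified against the standard revision you certify to, and the code is an illustration rather than the Thermal View implementation.

```python
# Illustrative furnace class report: find the tightest AMS2750 furnace
# class whose uniformity tolerance covers the worst TUS deviation.
# Tolerances are the commonly cited AMS2750E values in °F — verify
# against the standard revision you actually certify to.

CLASS_TOLERANCE_F = {1: 5, 2: 10, 3: 15, 4: 20, 5: 25, 6: 50}

def furnace_class(tus_readings_f, setpoint_f):
    """Return the best (lowest-numbered) class the survey supports, or None."""
    worst = max(abs(r - setpoint_f) for r in tus_readings_f)
    for cls, tol in sorted(CLASS_TOLERANCE_F.items()):
        if worst <= tol:
            return cls
    return None

survey = [1692.0, 1704.5, 1699.8, 1701.2, 1695.3]  # corrected readings, °F
print(furnace_class(survey, 1700.0))  # worst deviation 8.0°F → 2 (Class 2)
```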


Overview

The PhoenixTM Temperature Profiling System provides a versatile solution for both product temperature profiling and furnace TUS in industrial heat treatment. It is designed specifically for the technical challenges of low-pressure carburizing (LPC), whether implemented with high-pressure gas quench or oil quench methodology, providing the means to understand, control, optimize, and certify the LPC furnace and to guarantee product quality, process efficiency, and certification.



Temperature Monitoring and Surveying Solutions for Carburizing Auto Components: Thermal Barrier Design

This is the third in a 4-part series by Dr. Steve Offley (“Dr. O”) on the technical challenges of monitoring low-pressure carburizing (LPC) furnaces. The previous articles explained the LPC process and explored general monitoring needs and challenges (part 1) and the use of data loggers in thru-process temperature monitoring (part 2). In this segment, Dr. O discusses the thermal barrier, with a detailed overview of the thermal barrier design for LPC with either gas or oil quench. You can find Part 1 here and Part 2 here.


Low-Pressure Carburizing (LPC) with High-Pressure Gas Quench – the Design Challenge

A range of thermal barriers is available to cover the different carburizing process specifications. As shown in Figure 1, the performance needs to be matched to temperature, pressure, and, of course, the space limitations in the LPC chamber.


Fig 1: Thermal Barrier Designed Specifically for LPC with Gas Quench.

(i) TS02-130 low-height barrier designed for space-limited LPC furnaces with low-performance gas quenches (<1 bar). Only 130 mm/5.1 inches high, so ideal for small parts. Available with a quench deflector kit. (0.9 hours at 1740°F/950°C).

(ii) Open barrier showing PTM1220 logger installed within phase change heatsink.

(iii) TS02-350 High-Performance LPC barrier fitted with quench deflector capable of withstanding 20 bar N2 quench. (350 mm/13.8-inch WOQD 4.5 hours at 1740°F /950°C).

(iv) Quench Deflector Kit, showing the lid supported on its own legs so that quench pressure is not applied to the barrier lid.

The barrier is designed for robust operation run after run, under conditions that are demanding in terms of material warpage.

Some of the key design features are listed below.

I. Barrier – Reinforced 310 SS strengthened and reinforced at critical points to minimize distortion (>1000°C / 1832°F HT or ultra HT microporous insulation to reduce shrinkage issues)

II. Close-pitched Cu-plated rivets (less carbon pick-up), reducing barrier wall warpage

III. High-temperature, heavy-duty, robust, distortion-resistant catches with no thread-seizure issues

IV. Barrier lid expansion plate reduces distortion from rapid temperature changes.

V. Phase change heat sink providing additional thermal protection in barrier cavity.

VI. Dual probe exits for 20 probes with replaceable wear strips (low-cost maintenance)
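The value of the phase-change heat sink (feature V) can be shown with a simple energy balance: heat leaking through the insulation is absorbed as latent heat at a near-constant temperature until the charge is spent. The sketch below is a back-of-envelope estimate with entirely illustrative numbers — the heat-leak rate, charge mass, and latent heat are assumptions, not PhoenixTM specifications.

```python
# Back-of-envelope estimate of heat sink protection time: the barrier's
# heat leak is absorbed as latent heat until the phase-change charge is
# fully melted. All numbers are illustrative assumptions.

def protection_hours(heat_leak_w, sink_mass_kg, latent_heat_j_per_kg):
    """Hours until the phase-change charge is fully melted."""
    return (sink_mass_kg * latent_heat_j_per_kg) / heat_leak_w / 3600.0

# e.g. 60 W leak through the barrier wall, 1.5 kg charge, 2.26 MJ/kg
print(round(protection_hours(60.0, 1.5, 2.26e6), 1))  # 15.7 (hours)
```

The same arithmetic shows why quoted barrier durations fall sharply at higher furnace temperatures: the leak rate rises, while the latent-heat budget stays fixed.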


LPC or Continuous Carburizing with Oil Quench – the Design Challenge

Although commonly used in carburizing, oil quenches have historically been impossible to monitor. In most situations, monitoring equipment has had to be removed from the process between the carburizing and quenching steps to prevent equipment damage and potential process safety issues. As the quench is a critical part of the complete carburizing process, many companies have longed for a means to monitor and control their quench hardening process. Such information is critical to avoiding part distortion and allowing full optimization of the hardening operation.

When designing a quench system (thermal barrier), the following important considerations need to be taken into account.

  • The data logger must be kept at a safe working temperature and dry (oil-free) throughout the process.
  • The internal pressure of the sealed system needs to be minimized.
  • The complexity of the operation and any distortion needs to be minimized.
  • Cost per trial has to be realistic to make it a viable proposition.
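The internal-pressure consideration above follows directly from the ideal gas law: gas sealed into the barrier at ambient conditions expands as the cavity warms, so pressure scales with absolute temperature. A minimal sketch, with illustrative numbers:

```python
# Ideal gas law sketch of the sealed-barrier pressure consideration:
# for a fixed volume of trapped gas, P2 = P1 * T2/T1 in absolute
# temperature. Numbers are illustrative, not a barrier specification.

def sealed_pressure_bar(p1_bar, t1_c, t2_c):
    """Pressure of a fixed volume of gas after heating from t1 to t2 (°C)."""
    return p1_bar * (t2_c + 273.15) / (t1_c + 273.15)

# Sealed at 1 bar / 20°C; cavity climbs to 80°C during the run:
print(round(sealed_pressure_bar(1.0, 20.0, 80.0), 2))  # 1.2 (bar)
```

Keeping the cavity cool, as the heat sink design does, therefore limits pressure rise as well as protecting the electronics.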

To address the challenges of the oil quench, PhoenixTM developed a radically new barrier design concept, summarized in Figure 2 below. This design has been applied successfully to many different oil quench processes, providing protection through the complete carburizing furnace, oil quench, and part wash cycles.

Fig 2: Oil Quench Barrier Design Concept Schematic

(i) Sacrificial replaceable insulation block replaced each run.

(ii) Robust outer structural frame keeping insulation and inner barrier secure.

(iii) Internal completely sealed thermal barrier.

(iv) Thermocouples exit through water/oil tight compression fittings.


In the next and final installment in this series, Dr. O will address AMS2750E and CQI-9 Temperature Uniformity Surveys, which often prove to be challenging for many heat treaters. “To achieve this accreditation, Furnace Temperature Uniformity Surveys (TUS) must be performed at regular intervals to prove that the furnace set-point temperatures are both accurate and stable over the working volume of the furnace. Historically the furnace survey has been performed with great difficulty trailing thermocouples into the heat zone. Although possible in a batch process when considering a semi-batch or continuous process this is a significant technical challenge with considerable compromises.” Stay tuned for the next article in the series of Temperature Monitoring and Surveying Solutions for Carburizing Auto Components.


Temperature Monitoring and Surveying Solutions for Carburizing Auto Components: Introduction

This is the first in a 4-part series by Dr. Steve Offley (“Dr. O”), Product Marketing Manager at PhoenixTM, on the technical challenges of monitoring low-pressure carburizing (LPC) furnaces. This introductory article explains the LPC process and general monitoring needs and challenges. 


Carburizing Process

Dr. Steve Offley (“Dr. O”), Product Marketing Manager PhoenixTM

Carburizing has rapidly become one of the most critical heat treatment processes employed in the manufacture of automotive components. Also referred to as case hardening, it provides necessary surface resistance to wear while maintaining toughness and core strength essential for hardworking automotive parts.

The carburizing heat treatment process is commonly applied to low carbon steel parts after machining, as well as high alloy steel bearings, gears, and other components. Being critical to product performance, monitoring and controlling the product temperature in the heat treatment process is essential.

The carburizing process is achieved by heat treating the product in a carbon-rich environment, typically at a temperature of 900 – 1050 °C / 1652 – 1922 °F. The temperature and process time significantly influence the depth of carbon diffusion and the associated surface characteristics. It is critical to the process that, following diffusion, the product is rapidly quenched. This generates the microstructure that gives the enhanced surface hardness while maintaining a soft, tough product core.
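The temperature/time dependence described above is often approximated by the classic Harris equation for carburized case depth, in which depth grows with the square root of time and a strongly temperature-dependent coefficient. The sketch below implements that textbook relation; treat it as a rough planning estimate, not a substitute for process trials or the furnace maker's data.

```python
# Harris equation sketch: total case depth grows as sqrt(time), with a
# coefficient that rises steeply with temperature. A planning estimate
# only — real case depth depends on carbon potential, alloy, and quench.
import math

def case_depth_in(temp_f, time_h):
    """Harris estimate of total case depth (inches) vs. temp (°F) and time (h)."""
    temp_rankine = temp_f + 459.67       # convert to absolute temperature
    return 31.6 * math.sqrt(time_h) / 10 ** (6700.0 / temp_rankine)

# e.g. 4 hours at 1700°F:
print(round(case_depth_in(1700.0, 4.0), 3))  # ≈ 0.05 in (≈ 1.3 mm)
```

The equation makes the article's point concrete: doubling the soak time increases case depth only by a factor of √2, while a modest temperature increase has a much larger effect.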

Increasing in popularity in the carburizing market is the use of batch or semi-continuous batch low-pressure carburizing furnaces. New furnace technology employs the dissociation of acetylene (or propane) to produce carbon in an oxygen-free, low-pressure vacuum environment, which diffuses to a controlled depth in the steel surface. Following diffusion, the product is transferred to a high-pressure gas quench chamber, where it is rapidly cooled, typically using N2 or helium at up to 20 bar.

An alternative to gas quenching is the use of an oil quench, used commonly in continuous carburizing furnaces where the products are plunged into an oil bath.

 

Fig 1: Schematics of the LPC Carburizing process showing the Temperature and Pressure steps

Temperature Monitoring Challenges in Low-Pressure Carburizing

As already stated, the success of the carburizing process is governed by careful control of both the process temperature and duration in the heating and quench stages. When considering temperature, we are of course interested in the product temperature, not the furnace temperature. Measuring product temperature through a carburizing process, although historically possible using trailing thermocouples, is neither easy nor safe, and it disrupts production for lengthy periods.

PhoenixTM provides a superior solution with the use of a “thru-process” temperature monitoring system. As the name suggests, the PhoenixTM temperature profiling system is designed to travel through the thermal process, measuring the product and/or furnace environment from start to finish. The system can be incorporated into a standard production run, so it does not compromise productivity. A high-accuracy, multi-channel data logger records temperature from thermocouple inputs located at points of interest on, in, or around the product being thermally treated. To protect the data logger as it travels through the hostile furnace, a thermal barrier keeps the logger at a safe working temperature, preventing damage and ensuring measurement accuracy. The barrier obviously also needs to provide protection during the quench, whether against high pressure or oil ingress, if the quench cannot be avoided.

Employing the PhoenixTM system, a complete thermal record of the product throughout the entire process can be collected. A popular enhancement to the system is the use of 2-way RF telemetry, providing real-time process monitoring directly from the furnace, useful for either profiling or performing a live Temperature Uniformity Survey (TUS). The product temperature can be viewed live and downloaded at any point in the furnace. Raw temperature data collected from the process can be converted into useful information using one of the custom-designed PhoenixTM Thermal View software packages. The thermal graph can be reviewed and analyzed to give a traceable, certified record of the process performance. Such information is critical to satisfying CQI-9, AMS2750, and other regulatory demands. Fully TUS-compliant reports can be produced in moments from the simple and intuitive software, making accurate TUS a simple and quick task. The information can be used not only to prove product quality but also to provide the means to confidently change process characteristics to improve productivity and process efficiency (optimizing diffusion, soak, and quench).


A Dozen Quick Heat Treat News Items to Keep You Current


Heat Treat Today offers News Chatter, a feature highlighting representative moves, transactions, and kudos from around the industry.

Personnel and Company Chatter

  • Lars Jonsson has been appointed new employee representative to the Bulten AB board for a term of office of three years. Lars has been employed at Bulten’s plant in Hallstahammar since 1984, where he worked as a machine operator until two years ago, when he became a tool maker. Bulten is a supplier of fasteners to the international automotive industry.
  • Paulo’s Cleveland division is expanding to add 50,000 sq ft, allow for the installation of more vacuum heat treatment furnaces, and provide optimized flow of work through the facility. The Cleveland plant primarily serves the aerospace, power generation, and agriculture industries and specializes in precise high-temperature vacuum heat treatment and brazing of nickel-based superalloys.
  • James R. Darsey, executive vice president of raw materials for Nucor Corporation, announced his upcoming retirement after more than 39 years of service with Nucor. Effective June 10, 2018, Craig A. Feldman will be promoted to the role of executive vice president of raw materials. Mr. Feldman began his career with the David J. Joseph Company (DJJ) in 1986 and stayed with the company, becoming president of DJJ in 2013. When DJJ was acquired by Nucor in 2008, Feldman became a vice president and general manager of Nucor Corporation. Upon his promotion to EVP, Mr. Feldman will retain his role as president of DJJ.
  • Dr. Steve Offley recently joined Phoenix Temperature Measurement as Product Marketing Manager, bringing over 21 years of experience in the industrial process temperature monitoring industry. Besides promoting and marketing PhoenixTM’s temperature monitoring products, Dr. Offley will focus on the development of new and innovative process temperature-monitoring solutions.
  • Solar Atmospheres of Western PA recently installed a second machining center to support its newest service for customers – tensile testing. By adding a brand new fully programmable 8100 RPM Haas VF2 milling center, Solar is now able to support the machining of flat tensile specimens. This machining ability fully complements the function of the 10,000 PSI hydraulic jaw that is an integral component of the Tinius Olsen 300SL tensile machine. These massive hydraulic jaws can grip either threaded round or flat specimens.

Equipment Chatter

  • A 750°F (399°C), gas-fired cabinet oven was recently supplied by Grieve Corporation to be used for baking radiator cores at the customer’s facility. The workspace dimensions of this oven, the No. 1046, measure 80″ W x 88″ D x 18″ H, with a 76″-wide x 76″-long, 750-lb.-capacity, pneumatically operated rollout shelf with an insulated plug to seal the doorway opening.
  • An intermediate-sized front-loading box furnace was recently delivered to the Canadian Government Forestry Division. L&L Special Furnace Co., Inc., equipped the furnace, which meets Canadian electrical standards, with an atmosphere-sealed case for use with inert gas to displace oxygen and minimize surface de-carb. It is purged with inert gas prior to loading and the parts are then heated under a controlled atmosphere. There is a 1″ NPT survey port located on the right side of the furnace employed for calibration and uniformity surveys.
  • An aerospace and defense company recently purchased a rotary hearth furnace to heat treat state-of-the-art equipment, specifically to process specialized gears for helicopters. Ipsen plans to deliver the rotary hearth furnace early next year.

Kudos Chatter

  • Advanced Heat Treat Corp. recently reported that its Iowa (MidPort-Corporate Office and Burton Ave, Waterloo) and Michigan locations have successfully transitioned from ISO/TS 16949:2009 to ISO 9001:2015 / IATF 16949:2016.
  • Stock Drive Products/Sterling Instrument (SDP/SI) has also announced that it meets all certification requirements of the new ISO 9001:2015 + AS9100D standard, maintaining processes that provide superior components and assemblies with detailed quality reporting.
  • Harbison Walker International (HWI), based in western Pennsylvania, adds its announcement to the mix, reporting that its Thomasville, Georgia, monolithic/precast facility recently became the first of the company’s North American locations to earn ISO 9001:2015 certification, followed by HWI’s South Shore, Kentucky, plant becoming the first refractory brick manufacturing plant in North America to achieve the same status. Both plants achieved this quality system recognition based on the recommendation of SRI Quality System Registrar.
  • Rio Tinto celebrates the distinction of being the first company in the world to receive certification under the Aluminium Stewardship Initiative (ASI), the highest internationally recognized standard for robust environmental, social and governance practices across the aluminum lifecycle of production, use, and recycling. The certification follows an independent third party audit and covers a range of operations, from bauxite mining to alumina refining, aluminum smelting, the creation of value-added products, transformation and recycling, and associated activities. Rio Tinto’s five aluminum smelters, the Vaudreuil refinery, casting and spent potlining treatment centers, and associated infrastructure such as power, port and railway facilities in Quebec, Canada, have been certified, along with the Gove bauxite mine in Australia.

Heat Treat Today is pleased to join in the announcements of growth and achievement throughout the industry by highlighting them here on our News Chatter page. Please send any information you feel may be of interest to manufacturers with in-house heat treat departments especially in the aerospace, automotive, medical, and energy sectors to the editor at editor@heattreattoday.com.

