CONTROLS TECHNICAL CONTENT

Temperature Control System Improves Precision, Efficiency on Heat Treat Equipment: A Case Study

A century-old producer of die forgings recently needed to improve the process controls on its heat treating furnaces.

With process controls well over 10 years old, Clifford-Jacobs turned to Conrad Kacsik to upgrade its temperature process control system. The company, which serves a number of industries, including energy, aerospace, construction, mining, forestry, and rail, was eager to make the change, particularly because the incumbent system was producing inconsistent work.

The Challenge

Bud Kinney, Vice President of Innovation and Technology at IMT Corporation

Clifford-Jacobs was not getting consistent, repeatable results from its furnaces. The company also wanted more efficient and automated processes with data acquisition and electronic operating capability.

“We looked at a number of controls companies throughout the Midwest and interviewed them to learn about their experience with system controls and data acquisition,” said Bud Kinney, Vice President of Innovation and Technology at IMT Corporation, the parent of Clifford-Jacobs. “We knew we wanted an integrated system so we started looking at companies that did that as a matter of course. Most companies are limited to traditional controls, but Conrad Kacsik has a lot of experience doing the exact type of job we needed.”

Increasing Demands

Clifford-Jacobs makes forged parts for a variety of clients. Although forging does not generally require as much precision as other types of processes, customers are increasingly demanding, said Kinney.

“We believe that sooner rather than later things like Nadcap will come into forging, and our customers are very interested in us being able to demonstrate that our processes are always in control, even forge heating,” Kinney said. “This project helps ensure that we meet those needs. We couldn’t track things like set-point input values before. That’s another element we wanted to manage.”

The System

Retrofitting Clifford-Jacobs’ heat treating system.

Conrad Kacsik built a full temperature process control system that pairs Watlow F4T controllers with SCADA software from SpecView, allowing for programming of jobs and recipes, remote operation, secure (password-protected) operation of the furnaces, and accurate automatic temperature recording. The system was retrofitted onto Clifford-Jacobs’ existing 16 furnaces, saving the company considerable expense and time. Conrad Kacsik also added alert lights that let operators quickly see the status of each furnace from the shop floor.

Benefits of Temperature Control System Integration

Clifford-Jacobs has noted several beneficial results from the new temperature control system. These include:

  • Increased accuracy. The new system runs each recipe exactly and records the results. The company can also control which employees can adjust temperature settings, preventing operators from rushing jobs with a higher temperature or inadvertently setting the furnace incorrectly.
  • Higher efficiency. With preprogramming, each furnace is always at the exact temperature it needs to be for the given task. An automatic preheat setting also safely prepares the furnace for the workday—eliminating downtime or the need to send an employee in early to start the furnaces.
  • More speed. Clifford-Jacobs can pre-program any recipe it needs, allowing for highly accurate and fast running of complex processes.
  • More convenience. Clifford-Jacobs can operate its furnaces from anywhere with an internet connection, or via an iPad used by an approved employee.
  • Precision for the future. The new system can be part of a Nadcap-approved process should the need arise. The SpecView software and advanced controllers automatically record each job and retain all data for verification.

The Results

“We used to have to use all kinds of resources to provide oversight on temperature control,” said Kinney. “This has given us a heating strategy. We write the recipes we want and just select from those. In addition to that, we know exactly what every furnace is doing at all times.”

The company is also pleased with the increased efficiency. It heats product only when it is ready to run production, and the furnace uses only the exact energy needed for each recipe. It is also saving on staffing, as it used to have to schedule people to ensure the furnace was at the right temperature.

“With this system, we can develop recipes for each part we make, which is both convenient and precise. It’s doing exactly what we expected it to do,” said Kinney.


Heat Treat Control Panel: Best Practices in Digital Data Collection, Storage, Validation

When processing critical components, heat treaters value and demand precision in every step of the process — from the recipe to data collection — for the sake of accurate performance of the furnace, life expectancy of all equipment, as well as satisfactory delivery of a reliable part for the customer.

So what’s the obstacle to achieving those goals? Gunther Braus of dibalog GmbH/dibalog USA Inc. says, “The general problem is the human.” Indeed, the need to remove the variable of human fallibility plays a significant role in the search for, and development of, equipment that can sense, read, and record data without any input from the operator. “As long as there is a manual record of values, there is the potential for failure,” adds Braus.

Now, as part of the quest for precision, particularly in the automotive and aerospace industries, many control system requirements are driven by the need to prove process compliance to specified industry standards like CQI-9 and AMS 2750. These standards allow for and frequently require digital data records and digital proof of instrumentation precision.

With this in mind, Heat Treat Today asked six heat treat industry experts a controls-related question. Heat Treat Control Panel will be a periodic feature so if you have a control-related question you’d like addressed, please email it to Editor@HeatTreatToday.com and we’ll put your question to our control panel.

Q: As a heat treat industry control expert, what do you see as some of the best practices when it comes to digital data collection and storage and/or validation of instrumentation precision?

We thank those who responded: Andrew Bassett of Aerospace Testing & Pyrometry, Inc.; Gunther Braus of dibalog GmbH/dibalog USA Inc.; Jim Oakes of Super Systems, Inc.; Jason Schulze of Conrad Kacsik Instrument Systems, Inc.; Peter Sherwin of Eurotherm by Schneider Electric; and Nathan Wright of C3Data.

Calibration and Collection

Jim Oakes (Super Systems Inc.) starts us off with an overview of the equipment review process, the crucial component of instrument calibration, and digital data collection:

“Industry best practices are driven by standards defined by the company and customers they serve. Both the automotive and aerospace industries have a set of standards which are driven through self-assessments and periodic audits. Instrument precision is defined by the equipment’s use and is required to be checked during calibrations. The frequency of these calibrations depends on the instrument and what kind of parts and processes it is responsible for.

The equipment used for these processes can be defined as field test instrumentation, controllers, and recording equipment. Calibration is required with a NIST-traceable instrument that has specific accuracy and error requirements. Pre- and post-calibration readings are required (commonly identified as “as found” and “as left” recordings). During calibration, a sensitivity check is required on equipment and is recorded as pass/fail. The periodic calibration procedure is carried out not only on test equipment but also on control and recording equipment to ensure instrument precision.

Digital data collection is a broad term with many approaches in heat treatment. As mentioned, requirements are driven by industry standards such as CQI-9 and AMS 2750. When it comes to digital data collection specifically, electronic data must be validated for precision, checked, and calibrated periodically as defined by internal procedures or customer standards. Data must be protected from alteration and have specific accuracy and precision. Best practice tends to be a plant-wide electronic datalogging system that promotes ease of access to current and historical data for quality, operational, and maintenance personnel. Best practices in many cases are defined by the standards within each company, but the hard requirements are often the AMS 2750 and CQI-9 requirements for digital data storage.”
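The as-found/as-left procedure Oakes describes lends itself to a simple digital record. The sketch below is purely illustrative; the instrument ID, reference value, and ±2.0°C tolerance are invented for the example and are not taken from any standard:

```python
# Hypothetical sketch: logging "as found" / "as left" calibration readings
# against a NIST-traceable reference. Tolerance is illustrative only.

def calibration_record(instrument_id, reference_temp_c, as_found_c, as_left_c,
                       tolerance_c=2.0):
    """Return a calibration record with signed errors and pass/fail flags."""
    as_found_error = as_found_c - reference_temp_c
    as_left_error = as_left_c - reference_temp_c
    return {
        "instrument": instrument_id,
        "reference_c": reference_temp_c,
        "as_found_error_c": round(as_found_error, 2),
        "as_left_error_c": round(as_left_error, 2),
        "as_found_pass": abs(as_found_error) <= tolerance_c,
        "as_left_pass": abs(as_left_error) <= tolerance_c,
    }

# Example: instrument read 2.6°C high as found, within 0.4°C after adjustment.
record = calibration_record("TC-07", reference_temp_c=850.0,
                            as_found_c=852.6, as_left_c=850.4)
```

Storing both readings, as the standards require, preserves evidence of how far out of tolerance the instrument drifted before adjustment, not just that it ended in tolerance.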

Industry Guidelines and Requirements

Andrew Bassett (Aerospace Testing & Pyrometry) has provided us with a reminder of the industry guidelines for aerospace manufacturing (via AMS-2750E, paragraphs 3.2.7.1 – 3.2.7.1.5):

  1. The system must create electronic records that cannot be altered without detection.
  2. The system software and playback utilities shall provide a means of examining and/or compiling the record data, but shall not provide any means for altering the source data.
  3. The system shall provide the ability to generate accurate and complete copies of records in both human readable and electronic form suitable for inspection, review, and copying.
  4. The system shall be capable of providing evidence the record was reviewed – such as by recording an electronic review, or a method of printing the record for a physical marking indicating review.
  5. The system shall support protection, retention, and retrieval of accurate records throughout the record retention period, and ensure that the hardware and/or software operate throughout the retention period as specified in paragraph 3.7.
  6. The system shall provide methods (e.g., passwords) to limit system access to only individuals whose authorization is documented.

“One of the biggest issues I see with one of these requirements will be point 5,” says Bassett. “The requirement is to be able to review these records throughout the retention period, which in some instances is indefinite. I always recommend to clients who may be upgrading or purchasing new digital systems that they should consider keeping a spare system in place to be able to satisfy this requirement. Who knows — today we are working on Windows 10, but in 50 years, will our successor be able to go back and review heat treat data when everything is run on Windows 28?”

“This is a topic that yields great discussions,” adds Jason Schulze (Conrad Kacsik). He directs us to a challenge he sees from time to time.

Within the Nadcap AC7102/8 checklist, there is this question: “Do recorder printing and chart speeds meet the requirements of AMS 2750E Table 5 or more stringent customer requirements?” This correlates with AMS2750E, page 12, paragraph 3.2.1.1.2 “Process Recorder Print and Chart Speeds shall be in accordance with Table 5”.

“To ensure the proper use of an electronic data acquisition unit used on furnaces and ovens, these requirements must be understood,” continues Schulze. “Because this system is electronic, it should be designated a digital instrument and not an analog instrument. In doing so, this helps determine what requirements apply in Table 5. The only remaining requirement in Table 5 for digital instruments is ‘Print intervals shall be a minimum of 6 times during each time at temperature cycle. Print intervals shall not exceed 15 minutes.’

With this in mind, it is important to realize that, if your time at temperature cycles are short cycles (such as vacuum braze cycles), the sample rate of data collection may need to be adjusted to ensure it is recorded 6 times during the cycle.

As an example, if the shortest cycle processed is 4 minutes at temperature, a sample rate of every 60 seconds would not conform to AMS2750E because, in theory, the maximum number of recordings would be 4 during the time at soak. If the sample rate were modified to every 30 seconds, this would allow ~8 recordings during the time at soak, which would conform to AMS2750E.

Within the realm of electronic data acquisition on furnaces/ovens, this seems to be a frequent challenge for suppliers.”
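Schulze’s arithmetic generalizes easily. The short sketch below encodes the two Table 5 limits he quotes (at least 6 recordings during the soak, and no more than 15 minutes between recordings), with times in seconds:

```python
def max_conforming_interval_s(shortest_soak_s,
                              min_readings=6,
                              max_interval_s=15 * 60):
    """Largest sample interval (seconds) meeting both quoted limits:
    at least `min_readings` recordings during the shortest soak, and
    never more than `max_interval_s` between recordings."""
    # Interval that yields exactly min_readings samples across the soak.
    interval_for_count = shortest_soak_s / min_readings
    return min(interval_for_count, max_interval_s)

def readings_during_soak(soak_s, interval_s):
    """Number of recordings captured during the soak at a fixed interval."""
    return int(soak_s // interval_s)
```

For the 4-minute (240-second) vacuum braze example above, the largest conforming interval works out to 40 seconds; a 60-second rate yields only 4 recordings, while 30 seconds yields 8.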

A Critical Variable: Process Temperature

Nathan Wright (C3Data) agrees and zeroes in on process temperature as a critical variable to be measured:

“No matter the heat-treating process being carried out, complying with AMS-2750 and/or CQI-9 requires that the heat treater measure, record, and control several different variables. One of the more common variables that must be measured, recorded, and controlled is process temperature.

Measuring process temperatures requires the use of a precise measurement system (Figure-1 below), and the accuracy of said measurement system must be periodically validated to ensure its ongoing compliance.”

“The validation process is carried out through a series of pyrometric tests (Instrument Calibration and SAT), and historically these validation processes are highly error-prone.

In order to help ensure that process instrumentation, process temperatures, and any other variable that impacts quality are properly validated, it is good practice to begin automating compliance processes whenever and wherever possible. C3 Data helps automate all furnace compliance processes using software.”

A “Standard” Mindset

Gunther Braus (dibalog) chimes back in with some pertinent wisdom: “It is not sufficient only to record; you must live the standards, whether CQI-9, AMS, Nadcap, or even a standard you have set up yourself, so you must survey the data. However, in the old times there was a phrase: the one who measures, measures crap. In the end, it is all about surveillance of the captured data.

Where you store the data is a question of philosophy: personally, I prefer local storage in-house. Yes, we all talk about IoT, etc., and I do not want to start a discussion about security; it is more about accessing the data. No internet, no data. It is that simple. We are overly dependent upon cloud usage on the internet.

Automating checks of instrumentation precision requires so much effort in terms of automated communication between the testing device and the controller that, from my point of view, we are not there yet.”

A Look at the Standards In and Outside the Industry

“Interesting question!” writes Peter Sherwin (Eurotherm by Schneider Electric).

The aim is to record the true process temperature seen by the components being treated. However, many practical factors can alter the accuracy of the reading: the position of the thermocouple (TC), the TC’s accuracy drift over time, the suitability of the lead or extension wire, cold junction compensation (CJC) errors, instrument accuracy, and electrical noise affecting the stability of the reading.

The standards help by prescribing TC location, the accuracies required for both TC and instrument, and frequent checks over time through TUS and SAT, but note that the specification requirements are maximum “errors.” If you truly want to reach world-class levels of process control and reap the inherent benefits of better productivity and quality, you should aim to operate well inside the allowed tolerances.
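One way to see how far inside the tolerances a system sits is a simple error budget over the sources listed above. The sketch below combines independent error sources by root-sum-square; the component values are invented for illustration and are not figures from AMS 2750:

```python
import math

def combined_error_c(*component_errors_c):
    """Root-sum-square combination of error sources in °C.

    Assumes the sources are independent; a worst-case budget would
    simply sum the absolute values instead.
    """
    return math.sqrt(sum(e ** 2 for e in component_errors_c))

# Illustrative component errors only (thermocouple, extension wire,
# CJC, instrument) -- not values from any standard.
system_error = combined_error_c(1.1, 0.5, 0.3, 0.25)
```

Comparing a computed system error against the specification’s maximum allowed error shows the margin actually available for process control.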

With 30+ years of data storage required in certain cases, particularly in aerospace, there should be some thought as to how, and in what form, the data should be stored. There are many more storage options when the data is in digital format.

  • Paper is very costly to store and protect.
  • The virgin data file should be secure and tamper-resistant and identical copies made for backup purposes held offsite.
  • The use of FTP is becoming more common to move files automatically from the instrument to a local server (with its own backup procedures to ensure redundant records in case of disaster).
  • Regular checks should be made to examine the availability and integrity of these electronic records.
  • Control and Data Instrument suppliers should ideally have many years of supplying instrument digital records with systems that can access even the earliest of data record formats.
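The “regular checks” on availability and integrity in the list above can be automated with checksums. A minimal sketch, assuming records are stored as files; the comparison of a virgin file against its offsite copy is the kind of routine verification the bullet describes:

```python
import hashlib

def file_digest(path):
    """SHA-256 digest of a file, read in chunks to handle large records."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def backup_matches(primary_path, backup_path):
    """True only if the backup copy is byte-identical to the virgin file."""
    return file_digest(primary_path) == file_digest(backup_path)
```

Storing the digest of the virgin data file at write time, then periodically recomputing it for both the primary and the offsite copy, turns integrity checking into a scheduled job rather than a manual audit.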

We also look outside the heat treat standards for true best practices. The FDA regulation 21 CFR Part 11 and the associated GAMP (Good Automated Manufacturing Practice) guidance have been extended with the new document “Data Integrity and Compliance with Drug cGMP, Questions and Answers, Guidance for Industry.” These updates leverage ALCOA (Attributable, Legible, Contemporaneous, Original, Accurate) to describe the key principles around electronic records. This industry is also leading the requirement for sFTP, a more secure variant of the FTP protocol.


Heat Treat Today will run this column regularly featuring questions posed to and answered by industry experts about controls. If you have a question about controls and/or data as it pertains to heat treating, please submit it to doug@heattreattoday.com or editor@heattreattoday.com.


Heat Treat Radio #15: Jim Oakes

Welcome to another episode of Heat Treat Radio, a periodic podcast where Heat Treat Radio host, Doug Glenn, discusses cutting-edge topics with industry-leading personalities. Below, you can either listen to the podcast by clicking on the audio play button, or you can read an edited version of the transcript. To see a complete list of other Heat Treat Radio episodes, click here.


Audio: Jim Oakes

In this conversation, Heat Treat Radio host, Doug Glenn, speaks with Jim Oakes from Super Systems, Inc., based in Cincinnati, Ohio. SSI develops and manufactures products for the thermal processing industry, including probes, analyzers, flow meters, controllers, software solutions, and engineered systems. Jim Oakes of Super Systems corrals the data about data and makes sense of its use in the heat treating world, covering topics that include the evolution of data collection, sensor technology, data collection for preventative maintenance, operational benefits of data collection, Super Systems data capture explained, the Cloud and security.

Click the play button below to listen.


Transcript: Jim Oakes

The following transcript has been edited for your reading enjoyment.

On this episode of Heat Treat Radio, we’re discussing data. If there is one thing that has significantly changed in the heat treat world in the last decade, it’s the quantity and quality of data. What the heck do you do with all the data? How do you collect it? How do you decide which data sets to capture, and after you capture them, how can you learn anything from them? Data, data everywhere, and not a drop to drink!

Welcome to Heat Treat Radio. I am your host and publisher of Heat Treat Today, Doug Glenn. Today, we’re going to talk to one of the industry’s leading authorities on data, Jim Oakes from Super Systems Inc. But before we do, why don’t you take a little cyber trip over to heattreattoday.com and see all the data we have there? We’ve got aerospace heat treat data, automotive heat treat data, medical heat treat data, and energy heat treat data, as well as general manufacturing heat treat data. In fact, we’re adding at least one new piece of heat treating data every day. On Tuesdays, we publish technical content. We call it ‘Technical Tuesday.’ If you’re a manufacturer with in-house heat treat, we’re pretty sure you’re going to find heattreattoday.com really helpful.

Before we get started, here is a word about this episode’s sponsor:  Today’s Heat Treat Radio is brought to you by Dry Coolers, designers and builders of industrial cooling systems and the professional engineering services surrounding those systems.  As a leader in the heat treat industry for decades, they’re located in Oxford, Michigan and supply cooling systems for the aerospace, automotive, medical and energy industries, plus many others.  If you have any industrial cooling needs, call Dry Coolers.  You can find them on the web at www.drycoolers.com or by phone at 800-525-8173.

Doug Glenn (DG):  Let’s get started on today’s topic — data. Our guest is Jim Oakes from Super Systems Inc.  Hi, Jim. Take a minute and introduce yourself to our listeners.

Jim Oakes (JO):  Hi, Doug, this is Jim Oakes with Super Systems. We’re a technology provider for the heat treating industry. We focus on sensors, controls, and software for the thermal processing and heat treating industry, and we’ve been doing that for over 20 years now.

DG:  Jim, how many years have you been in the heat treat industry?

JO:  15 years.

DG:  Over the past 15 years, what impresses you about the way we are using data now as opposed to the way we used it back then?

JO:  Well, a couple things, actually. My introduction to the industry was actually longer ago than 15 years. I started in an internship, and oddly enough, at that internship — it was for a technology provider in the heat treating industry — I was involved in doing data capture from a PLC at a Timken plant in Gaffney, South Carolina, and that was 25 years ago. Data acquisition has been happening not just in the heat treating industry, but in manufacturing for a very long time. What’s really been changing though, if you look at the last 10 to 20 years, is that the technology is lending itself, because of cost, both from a storage standpoint and processing standpoint, to really being accessible everywhere. You have more information that is coming out of microprocessor controls or PLCs or programmable logic controllers throughout the shop floor. Whether it be a piece of thermal processing equipment or a cooler or anything that is on the shop floor, we have tons of information that is becoming available. Before you might have been worried about how you would store all that information, but that is a thing of the past. The amount of information, and actually making sense of all of it, is where the challenge lies today, certainly not collecting it.

The Evolution of Data Collection

DG:  Compared to ten years ago, are we collecting anything now that we didn’t collect then? Are we collecting more than we were back then, and if so, what are we collecting now that we weren’t before?

JO:  That’s a great question, Doug, because back then a lot of the data was very specific and focused on process-related information. Now, there is additional data that is being collected that can be used for some predictive modeling, if you will. It’s not just proof of process that meets the industry requirements. Your customers were expecting that if you used a heat treatment process, then you had to really prove you performed that. Well, that’s a thing of the past. Of course, any data acquisition system that you have today, or anything data-related is going to provide you with that. But now there is more data, so on any day, in any heat treat facility, captive or commercial, I’d say there are 750,000 to well over a million data points that are being collected. Honestly, most people don’t even know that they’re collecting all that information. Their laser focus is on that one specific requirement. All that information that you can have is coming from these microprocessors or PLCs, so the amount of information today versus what you were gathering way back when is really one of the biggest differences.

DG:  What are some of the technologies that have driven that change so that now we can collect more?

JO:  A couple things. Standardized protocols have been around for capturing data, so you have a mechanism to get the data from all of these different pieces of equipment. That’s one piece, and it’s existed for a long time. But if you think about it, if you take the shop floor today versus 10 or 20 years ago, there is a PC everywhere now. You have a networking infrastructure that maybe wasn’t there 20 years ago. Back then, maybe you had a limited number of people who would be able to absorb that information and utilize it. Today, everyone is using a computer. Everybody is using a hand-held device. Now, all of a sudden, that information is readily available to lots of people, and that’s where the difference is. Not only do you have the networking infrastructure on the manufacturing shop floor, but you also have the technology that is available to everybody. Computers are everywhere.

Sensor Technology

DG:  One of the contentions I have is that the reason we’re able to gather so much more data now is that we’ve had advances in sensor technology. Maybe you can address this a bit.  I think there are things we are capturing now that we weren’t even able to capture before because of advances in sensors, whether it be IR sensors, or whatever.

JO: Yes, you’re right, Doug. If you look at the amount of information that is readily available, it is because of the technology that is available to capture it. There is all this sensor technology, whether it’s a limit switch identifying a basket or a tray moving to a specific location, an infrared device used for measuring temperature on the outside of a furnace shell, or an infrared analyzer used for analyzing the gas inside the chamber where the parts are being heat treated. Now you have the ability to take that additional information and use it in a decision-making process.

And now you have all this data. Nobody is concerned about the amount of information you’re storing. Nobody ever says, “Well, we’re not going to have that much space.” The problem is people and time in actually evaluating all of the data. No doubt, you can use a sensor to monitor the vibration of a pump or motor, or look at current usage, or look at gas usage; the list of information you can gather goes on, and this is because the cost has gone down. Each of those specific devices is now lower in cost and reasonably achievable from a data capture standpoint.

DG:  We might describe it like this: in the past, we used to put all the sensors inside the furnace, as you mentioned, to validate the process. It seems now that, because of the cost of sensors and the fact that you can gather all this data and actually do something with it, we’re putting sensors on the outside of the equipment too, so that not just the process but also the equipment is validated, if you will, and we can see troubles coming. Do you agree?

JO:  Yes, there is no doubt, if you look at some of the benefits of what we see in the heat treating industry today. Of course, operational efficiencies are important. Now you’re taking the data that you’re gathering, and again, it’s not going to just prove that you’re running the parts properly; you’re able to make better decisions from an operational standpoint. You can look for better load planning, and you can look at reducing the time, or gap time, between loads and identify what’s causing it. The other thing is using this information for preventive maintenance. The equipment manufacturers are doing a great job of providing preventive maintenance programs, and it is because of the sensors and the data acquisition systems that you are able to gather data, whether locally at that piece of equipment or from a plant-wide standpoint. There is no doubt that some of the biggest benefits come from doing the data capture and then having the different sensor technology that allows preventive maintenance programs to be put into place.

DG:  Isn’t that, in fact, where huge benefits can be gained, in the area of preventative maintenance?

Preventative Maintenance

JO:  Unplanned downtime is a huge cost component in heat treating. Anything you can do to manage the up-time of your equipment is beneficial. Of course, planned downtime gives you an opportunity to work with customers, work with the product that is flowing through your facility as well as managing the incoming parts that you might need for that equipment. So it’s a huge benefit. You can still do preventive maintenance programs that are in place; it doesn’t have to be with new equipment. You just have to be smart about the things that are important to that equipment and then utilize that data. I always say that data acquisition is very underutilized when it comes to maintenance. The maintenance department is usually one of the busiest groups within the thermal processing industry. A lot of domain knowledge goes into the equipment, but they have a lot of this information that is readily accessible to them, so if they could look at this information and anticipate that fan is going to fail, that motor is going to fail, that there is a short on your electrical elements, or whatever that might be, you’re going to be able to plan for the downtime. That’s going to help from an operational standpoint as well as reduce the amount of time that that furnace might be out of commission.
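The kind of anticipation Oakes describes, seeing a fan or motor failure coming in the data, can start with something as simple as a rolling average checked against a warning level. This toy sketch uses invented vibration readings and an invented threshold, purely to illustrate the idea:

```python
def maintenance_alert(readings, warn_level, window=5):
    """Flag when the rolling average of the most recent sensor readings
    crosses a warning level, so downtime can be planned before failure."""
    if len(readings) < window:
        return False  # not enough history to judge a trend
    rolling_avg = sum(readings[-window:]) / window
    return rolling_avg >= warn_level

# Invented vibration readings (mm/s) trending upward on a fan motor.
vibration = [2.1, 2.2, 2.1, 2.3, 2.6, 3.0, 3.4, 3.9, 4.5, 5.2]
alert = maintenance_alert(vibration, warn_level=3.5)
```

A real system would add hysteresis and per-asset thresholds, but even this much turns raw data points into a planned-maintenance signal instead of a crisis.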

DG:  And when you’re not planning ahead, when you’re responding to fires rather than preventing fires, so to speak, it is usually the maintenance guys who catch the brunt of it.

JO:  Yes, that poor guy walks into work every day dreading work because he’s got a crisis on his hands every single time. If you can prevent that crisis, so he can plan to do something, it’s a totally different work environment.

Let’s take a quick break here and remind you that additional support for today’s Heat Treat Radio episode is being provided by Dry Coolers. If there is one thing we know about thermal processes, it’s that things get hot, and to remove that heat from critical areas, you need a system that is reliable and, if necessary, designed for your specific needs. The fact is, Dry Coolers has been custom designing and providing standardized units for decades, and they have the staff and experience to take care of any of your industrial cooling needs. If you’re a manufacturer with in-house heat treating and you need an industrial-strength cooling system, make your first, and only, call to Dry Coolers. You can look them up on the web at www.drycoolers.com.

Now let’s get back to our interview with Jim Oakes of Super Systems.

DG:  Where are you seeing data being used well?

Operational Benefits of Data Collection

JO:  The people that are taking advantage of the information are, of course, meeting the industry requirements. They are staying on top of things like CQI-9 or Nadcap requirements from a data collection standpoint and meeting customer requests. That is the foundation, and I always say that in a lot of cases it is a big driver for electronic data. But the people that are really taking advantage are using that information for operational benefits, both from a maintenance standpoint and in improving overall operations. You’re looking at, “Why do I have downtime of two hours between loads on this particular piece of equipment?” Now, instead of sending somebody to search the shop and walk out and get a paper chart, you have people who can actually evaluate the downtime between loads. You can look at gap times and identify what the issue is. Is it because I don’t have enough fixtures? Is it because I don’t have enough labor? The labor market is tight right now, so you want something that is going to help you maximize efficiency with what you have. Your challenges might be your labor or your equipment. Are you making the most of your equipment? You can look at that data. You have tons of information, and if you can evaluate it, it gives you an opportunity to make better decisions. That is one area.
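The gap-time analysis Oakes mentions is straightforward once load start and end times are captured digitally. A sketch with invented timestamps for one furnace over a shift:

```python
from datetime import datetime

def gap_times_minutes(loads):
    """Minutes of idle time between consecutive loads.

    `loads` is a list of (start, end) datetime pairs, ordered by start.
    """
    gaps = []
    for (_, prev_end), (next_start, _) in zip(loads, loads[1:]):
        gaps.append((next_start - prev_end).total_seconds() / 60)
    return gaps

# Invented load records for one furnace over a shift.
fmt = "%Y-%m-%d %H:%M"
loads = [
    (datetime.strptime("2019-06-01 06:00", fmt), datetime.strptime("2019-06-01 09:30", fmt)),
    (datetime.strptime("2019-06-01 11:30", fmt), datetime.strptime("2019-06-01 14:00", fmt)),
    (datetime.strptime("2019-06-01 14:20", fmt), datetime.strptime("2019-06-01 17:00", fmt)),
]
gaps = gap_times_minutes(loads)
```

Here the first gap is 120 minutes against a second gap of 20, exactly the kind of two-hour hole between loads worth asking about: fixtures, labor, or scheduling.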

The other area is, how can you utilize the data and push it out to all your people? Let everybody look at it, but only give them the pieces of information that are important to them. The maintenance department is going to be interested in, maybe, the percent output, the current going to the electrical elements, vibration, or water temperature. That information is relevant, and if they can isolate it, they can sit down with their cup of coffee in the morning and evaluate it. Before they have to react to all the firestorms in front of them, maybe they can actually plan some preventative maintenance activities based on what the data is telling them. The right information to the right person is really critical. The people that are doing this are the ones that are really taking full advantage of the information they have with a SCADA package.

DG:  Is there someone out there that is actually doing it?

JO:  Yes, absolutely! There is no doubt about it. People are taking resources, and instead of being reactive and trying to find stuff on the shop floor, they are using the system to identify, answer customer needs and then create those operational efficiencies. People absolutely, no doubt, are taking advantage of that. They are looking at shortening time between loads, notifying users when loads are done so they can get the parts out and then put new parts in. This is happening with mobile devices and/or emails so that the right people are notified at the right times.

DG:  Give us the lowdown on what SSI is doing in this area.

The SSI Data Capture

JO:  Our foundation provides us the ability to provide information everywhere. This starts with the sensor and taking that sensor data into the controlling equipment, whether a microprocessor-based controller or a PLC. But you need to make that readily available so that people can make decisions quickly. Proof of process is one thing, of course, but so is giving access to information, whether by mobile device or a messaging system. So we’re taking all of the work that we’ve already done in the past and bringing it into the technology that people are utilizing today. We see huge opportunities in being able to go through the existing data that’s there, and then look at better ways to capture data based on the technology that is becoming available, whether it’s how we capture usage of gas or electricity or just process-related data, to make sure that the right person is getting the right information.

DG:  Many of the folks reading this article are manufacturers with their own in-house heat treat plants, and I’m guessing that many of them are wondering what they can do to move in this direction. What should these folks do next?

JO:  The first step is to do an inventory of the equipment and be realistic about what data you can get out of it. Then highlight the drivers, meaning your business drivers for capturing that information. At that point, decide whether you just need the infrastructure from a data acquisition standpoint or, if you want a bigger bang for your buck, whether to invest in technology down at each piece of equipment, so that you can realize the gains from capturing that information.

DG:  If a company wants to move in this direction, must they go cloud-based?

The Cloud and Security

JO:  No, definitely not. The cloud is a tool that basically allows data and information to be stored externally. The reality is that a virtual server can, in many respects, function as a cloud-based system, but it doesn’t have to. A large number of the installs we have store information locally and then transfer data to the cloud for backup and recovery.

DG:  Address cloud-based security, if you would.

JO:  It is a huge topic from a security standpoint, and I would say that most of the companies that use SCADA packages are on-premise. That is not all of them, but most of them. Being on-premise means you have a private network that is not accessible from anywhere unless you create a tunnel into it using a virtual private network (VPN). That’s what you refer to as on-premise. Then you have a cloud-based system, which is really just pushing that information up to a server farm that provides access to it. Of course, there is a security aspect to accessing that information. A strategy has to be put forth that prevents unwanted external access to that information. In many cases, if you decide that you’re going to go to a cloud-based system, you’ve already thought that through and you’ve probably already transitioned some other systems. Anyone going to a cloud-based system has security requirements in place to prevent any illegal or unwanted access.

DG:  Jim, thanks for your time.

JO:  Doug, thank you for having me on Heat Treat Radio. I really appreciate the opportunity. This topic is important to us here at Super Systems. As a technology provider to the industry, we really like to get the word out about what types of things are coming. Whether it’s making data accessible at the hand-held level or helping make decisions, this is near and dear to our hearts, because a lot of our customers really find it necessary. I appreciate you spending the time with me, and I look forward to having more discussions around this in the future.

That was Jim Oakes of Super Systems Inc. talking about data and how to get the most out of that data. If you’d like to get in touch with Jim, please email me directly at doug@heattreatoday.com and I’ll put you in touch with Jim. Super Systems can be found on the web at supersystems.com.

Suffice it to say, you will be hearing more from Heat Treat Today about data and how to use it more effectively for your business. To see more heat treat technology articles, go to www.heattreattoday.com. We post a new heat treat item, either a technical article or some industry news, every weekday. If you’d like more Heat Treat Radio, simply Google “Heat Treat Radio”. We’re the first thing that pops up. You can also subscribe to Heat Treat Radio on iTunes or SoundCloud.

One last reminder that today’s episode of Heat Treat Radio was underwritten by Dry Coolers. If you have need for any industrial cooling system, give the good people at Dry Coolers a call.  They are on the web at www.drycoolers.com.

This and every other episode of Heat Treat Radio is the sole property of Heat Treat Today and may not be reproduced without express written permission and appropriate attribution from Heat Treat Today. Jonathan Lloyd of Butler, PA, produced and mixed this episode. I am your host, Doug Glenn.  Thanks for listening.

Doug Glenn, Heat Treat Today publisher and Heat Treat Radio host.

To find other Heat Treat Radio episodes, go to www.heattreattoday.com/radio and browse the list of episodes.

Heat Treat Radio #15: Jim Oakes

Applying PID to Temperature Variances in Vacuum Furnaces

 

Source: Solar Manufacturing

 

Controlling process temperature accurately and without extensive operator involvement is a crucial task in the heat treat shop. It calls for a temperature controller, which compares the actual temperature to the desired control temperature, also known as the setpoint, and provides an output to a control element. This comparison relies on an algorithm, and the most commonly used and accepted in the furnace industry is PID, or Proportional-Integral-Derivative, control.

“This popular controller is used because of its robust performance in a wide range of operating conditions and simplicity of function once understood by the processing operator,” writes Real J. Fradette, a Senior Technical Consultant with Solar Atmospheres, Inc., and the author of “Understanding PID Temperature Control as Applied to Vacuum Furnace Performance” (with William R. Jones, CEO, Solar Atmospheres, Inc., contributing).

The PID algorithm consists of three basic components, Proportional, Integral, and Derivative which are varied to get optimal response. If we were to observe the temperature of the furnace during a heating cycle it would be rare to find the temperature reading to be exactly at set point temperature. The temperature would vary above and below the set point most of the time. What we are concerned about is the rate and amount of variation. This is where PID is applied. ~ Fradette
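To make the three terms concrete, here is a minimal, illustrative PID loop in Python. The gains, the clamped heater output, and the simple first-order "furnace" model are all hypothetical, chosen only to show how Proportional (GAIN), Integral (RESET), and Derivative (RATE) combine; real furnace tuning follows the procedures Fradette describes.

```python
def pid_step(setpoint, measured, state, kp, ki, kd, dt,
             out_min=0.0, out_max=1.0):
    """One PID update; returns (clamped output, new state).

    Integration is skipped while the output is saturated,
    a simple anti-windup guard."""
    error = setpoint - measured
    integral = state["integral"] + error * dt
    derivative = (error - state["prev_error"]) / dt
    raw = kp * error + ki * integral + kd * derivative
    output = min(max(raw, out_min), out_max)
    if output != raw:                      # saturated: freeze the integral
        integral = state["integral"]
    return output, {"integral": integral, "prev_error": error}

# Toy closed loop: a first-order furnace heated by output u in [0, 1].
setpoint, temp, ambient = 1000.0, 70.0, 70.0        # degrees F (hypothetical)
state = {"integral": 0.0, "prev_error": setpoint - temp}
for _ in range(3000):                               # 3000 one-second steps
    u, state = pid_step(setpoint, temp, state,
                        kp=0.005, ki=0.0005, kd=0.002, dt=1.0)
    temp += (50.0 * u - 0.02 * (temp - ambient)) * 1.0
# With these gains the loop should settle near the setpoint; the integral
# term removes the steady-state offset a proportional-only loop leaves.
```

As Fradette notes, the measured temperature hovers above and below setpoint; adjusting kp, ki, and kd (GAIN, RESET, RATE) trades off how fast, and how far, it deviates.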

In this week’s Technical Tuesday, we direct our readers to Fradette’s article at Solar Manufacturing’s website where he and Jones cover the following on PID temperature controllers:

  • Definitions, e.g., Closed Loop System; Proportional (GAIN); Integral (RESET); and Derivative (RATE)
  • Actual operation of a PID temperature controller, including understanding PID dimensions and values; and general rules for manually adjusting PID
  • The art of tuning, a manual
  • Autotuning
  • Tweaking the furnace PID controller
  • and other factors

 

Read more: “Understanding PID Temperature Control as Applied to Vacuum Furnace Performance”



Jason Schulze on Understanding AMS 2750E — Standard SAT Description

Jason Schulze, Conrad Kacsik Instruments, Inc.


This is the second in a series of articles by AMS 2750 expert, Jason Schulze. Don't miss the Q&A section at the bottom of this article and please submit your AMS 2750 questions for Jason to Doug@HeatTreatToday.com.


Introduction

Considering the abundant number of Nadcap heat treat audits performed in a single year, pyrometry is the area receiving the most findings, and within pyrometry, system accuracy testing (SAT) is the third most common finding.

The SAT process has been refined through each revision of AMS2750 (C through E). We’ve seen SAT thermocouple requirements, for example, gradually incorporated into the tables but not within the body of the specification. Also, we’ve seen the definition of a SAT incorporated into revision D within the definitions section; however, with revision E it was added to the body of the specification.

AMS2750E presents three methods for performing SATs, one of which must be implemented: the Standard (or Regular) SAT, the Alternate SAT, and the SAT Waiver. Within this article, we will focus on the Standard SAT process.

Standard SAT Description – AMS 2750E

AMS2750E has defined the Standard SAT as:

An on-site comparison of the instrument/leadwire/sensor readings or values, with the readings or values of a calibrated test instrument/leadwire/sensor to determine if the measured temperature deviations are within applicable requirements. Performed to assure the accuracy of the furnace control and recorder system in each control zone.

Put simply, an SAT is a comparison of two systems: the furnace system (whether control, monitoring, or load) against a test system. It’s important to recognize that the comparison is made between two systems, not against an instrument or thermocouple alone. Each system is made up of three variables:

  1. the instrument
  2. the lead wire
  3. the sensor

 

image-1

SAT Procedure

There is no general SAT procedure that can be applied to every supplier. Each supplier has its own needs as well as its own mechanical arrangement of thermocouples within its furnace system. The key to conformity is to ensure that, once a method for performing an SAT on a furnace is established, it is documented in detail (including photos, if necessary) and repeated each time an SAT is performed. Some requirements to incorporate into your system are:

1) The tip-to-tip distance between the furnace system thermocouple and the test system thermocouple cannot exceed 3 inches.

2) The test thermocouple shall be in the same position/depth as the initial test.

3) The furnace is cycled and maintained at a temperature normally used during production.

4) Each system that makes up the applicable instrumentation type must be tested.

SAT Difference

Many findings arise from suppliers calculating the SAT Difference incorrectly. AMS 2750E states the following as a way to calculate the SAT Difference.

The difference calculated between the reading of the furnace sensor system being tested (sensor, lead wire, and instrument) and the corrected reading of the test sensor system (after test sensor and test instrument correction factors are applied) shall be recorded as the system accuracy test difference. Applicable correction factors shall be applied algebraically.

I’ve highlighted the word “corrected” as it applies to the test system because this seems to be a source of frequent findings. The furnace system does not get corrected; the test system does.
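The arithmetic is simple enough to sketch in Python. All readings and correction factors below are hypothetical, and the sketch assumes correction factors are stated as values to be added algebraically to the test system’s reading:

```python
def sat_difference(furnace_reading, test_reading,
                   test_sensor_cf, test_instrument_cf):
    """SAT difference = furnace system reading minus the CORRECTED
    test system reading. Only the test system is corrected."""
    corrected_test = test_reading + test_sensor_cf + test_instrument_cf
    return furnace_reading - corrected_test

# Hypothetical readings: furnace control 1007.0 F, test instrument 1003.0 F,
# test sensor CF +1.2 F, test instrument CF -0.4 F.
diff = sat_difference(1007.0, 1003.0, 1.2, -0.4)
# corrected test reading = 1003.0 + 1.2 - 0.4 = 1003.8 F
# SAT difference      = 1007.0 - 1003.8 = +3.2 F
```

A +3.2°F difference would fall within the ±15°F allowance of the Class 3 furnace discussed in the example that follows.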

 

image-2

As an example, let’s consider a vacuum furnace on which an SAT has been performed. The vacuum furnace is designated a Class 3 (±15°F), Type D furnace. Let’s assume no additional furnace thermocouples are employed and we are performing an SAT on the control and recording systems. The readings obtained are shown in the picture below.

image-3

 

*The example above is not an SAT Certification. It’s an example of how to calculate the SAT Difference in a given situation.

Conclusion

SATs can be difficult depending on the equipment and processes suppliers have. As always, it’s important to receive comprehensive training regarding the specific requirements of System Accuracy Testing as they apply to your facility. There are many particular aspects of SATs that may not have been accounted for in this article. If you have specific questions, please email them to doug@heattreattoday.com, and I will answer them in an upcoming article.

Submit Your Questions

Please feel free to submit your questions, and I will answer appropriately in future articles.

Our next topic will focus on the requirements and execution of an Alternate SAT per AMS2750E, the requirements of AC7102/8, and the Pyrometry Guide.

 


 

Q&A with Jason Schulze

Q: When calculating the SAT Difference, should I include the correction factors of the furnace sensor?

A: No, the correction factor from the furnace sensor is not to be included in the SAT Difference calculation.

Q: How do I account for an internal (pre-programmed) TUS offset within the controller when calculating the SAT Difference?

A: Internal or electronic TUS offset must be algebraically removed when calculating the SAT Difference.  Below is an example that includes an electronic TUS offset of -2°F.

 

sat-qa
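The same arithmetic can be sketched for the offset case. This is a hypothetical Python illustration, not the figure’s certified calculation; it assumes the electronic offset is stated as the value the controller adds to its sensor input, so that “removing it algebraically” means subtracting it from the furnace reading. Verify the sign convention against your own controller’s documentation.

```python
def sat_difference_with_offset(furnace_reading, corrected_test_reading,
                               electronic_offset):
    """Back the pre-programmed (TUS) electronic offset out of the
    furnace reading before taking the SAT difference."""
    return (furnace_reading - electronic_offset) - corrected_test_reading

# Hypothetical: furnace displays 1001.0 F with a -2.0 F electronic offset;
# the corrected test system reads 1002.0 F.
diff = sat_difference_with_offset(1001.0, 1002.0, -2.0)
# (1001.0 - (-2.0)) - 1002.0 = +1.0 F
```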

Q: I operate a furnace with 2 load sensors. One of them is used to signal the start and end of each soak cycle, the other is reference only. Do I have to perform an SAT on the load thermocouple I use as a reference only thermocouple?

A: Any thermocouple that is not used as product acceptance may be deemed reference only and is not subject to the SAT requirements of AMS2750E. Nadcap requires that the reference only thermocouples be accounted for in internal procedures.

Q: When performing my bi-weekly SAT, I get a difference of +2.6°F on one test and two weeks later I get a difference of -3°F; this constitutes a spread (within two weeks) of 5.6°F. Would this be cause for SAT failure?

A: According to AMS2750E and Nadcap, no, this would not constitute a failed SAT, though it is something to be cautious of. This type of shift in SAT results reflects some sort of change or degradation in the system being tested. A well-established tack in this case is to plot SAT results as part of an SPC (statistical process control) program, which can govern future replacement of system thermocouples and/or leadwire when there is a large difference in SAT results over a pre-determined amount. A documented SPC system for SAT results would also satisfy the requirements of AC7102/8(NA) page 2, paragraph 3.12.
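The spread in that question can be checked programmatically. A trivial Python helper of the kind one might feed into an SPC chart (the 5.0°F review threshold is hypothetical and would be site-defined):

```python
def consecutive_spreads(sat_differences):
    """Absolute change between each pair of consecutive SAT differences."""
    return [abs(b - a) for a, b in zip(sat_differences, sat_differences[1:])]

# The question's example: +2.6 F one test, -3.0 F two weeks later.
spreads = consecutive_spreads([2.6, -3.0])   # spread of 5.6 F
flagged = [s for s in spreads if s > 5.0]    # hypothetical review threshold
```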


Should You Replace Your Aging Heat Treat Technology?

Consideration #1:  Sometimes, Hardware and Technology Force You to Change Your Software.

Nothing is more frustrating than having to change your software systems because of unexpected developments in the underlying and supporting technology. For example, think about the enormous number of software applications that ran fine on Windows XP but now need to be replaced, or undergo a complete remake, because Windows XP is no longer supported by Microsoft and has become a risk to the continuity of your business.

Is your heat treat department running an outdated system, a customized software application, or a conglomerate of pieced-together, homegrown apps still limping along? If so, now is the time to look at making that much needed transition and move to a more flexible environment, preferably one that remains on the cutting edge of technology advancements with continuous monthly updates and new software releases.

Consideration #2: Benefits of Using Browser-based Software

A browser-based application is any application that uses a web browser as its “front end”, or interface. A browser-based application can run in the cloud (SaaS), OR it can run on your own internal (intranet) server(s). Users access the application from any computer (or mobile device, if the software supports it) connected to the Internet using a modern standard browser, instead of installing the application on each individual local computer or device. For example, Microsoft Word is a common word-processing application. Google Docs is also a word-processing application, but all of its functions are performed in a web browser rather than in software installed directly on the user’s computer.

Web applications are easier to upgrade and do not need to be reinstalled if you replace your computer system. They still need to be kept up to date in order to cater to new web browser technologies and information security threats. With browser-based applications, users access the system via a uniform environment, the web browser; however, user interaction with the application needs to be thoroughly tested on different web browsers so the system will function properly in any of the popular modern browsers.

Unlike traditional applications, web systems are accessible anytime, anywhere, via a computer or handheld device that has an Internet connection, putting the end user in charge of where and when they access the application.  Someone on vacation can quickly log into the system from anywhere in the world to see what is happening back at the shop.

Using Internet technologies based on industry-wide standards, it’s possible to achieve a far greater level of interoperability between applications than with isolated desktop systems. For example, it is much easier to integrate two browser-based systems than it is to get two proprietary systems to talk to each other. Browser-based architecture makes it possible to rapidly integrate enterprise systems, improving workflow and other business processes. Installation and maintenance becomes less complicated. Once a new version or upgrade is installed on the host server, all users can access it immediately. There is no need to upgrade each device with the new version of the application.

As for data security and the confidentiality of business-sensitive information, browser-based applications are typically deployed on dedicated servers, which are monitored and maintained by experienced server administrators. This is far more effective than monitoring many client computers or handheld devices, as is the case with traditional desktop applications that require each device to be ‘touched’ every time there is a software change or new upgrade release.

Consideration #3: Most Businesses Know They Need to Update Their System

Various surveys and studies have found that the majority of organizations polled know they need to, or should, update their enterprise software systems. However, they have no plans to do so in the near future, mainly because of cost. Cost is a convenient reason, but it rarely adds up to a legitimate one. Many businesses have no problem spending millions of dollars on new equipment, but don’t want to spend a few thousand dollars on new software that will get them the information they desperately need to make better decisions in the management of the business and to help their employees be more efficient with fewer errors. They are swimming in data but starving for information, usually because the data is held in disjointed silos across the enterprise and isn’t consolidated for better business metrics and knowledge-based reporting.

The problem with not wanting to spend money on software is that it will eventually lead to higher costs, both now and down the road, because the current system(s) require work-arounds, duplication of effort, data redundancy, etc.  There is also the possibility of decreased revenue and business income because the competition is better prepared, and has more to offer their customers. Budgets are important, but they shouldn’t be the only consideration when evaluating the need to modernize your software systems. These considerations should play a large part in the decision-making process.

Nothing Stays the Same:  One thing about technology is that it is always changing. Your business is always changing too, and your software needs to keep up with your business, not hinder it. Staying competitive is essential for nearly all organizations, and aged software applications can be as dangerous as having no application at all. Not only do older systems lack the functionality of newer systems, they become a competitive disadvantage and a hindrance to achieving the goals and objectives of the business.

Winning Is About Staying in the Fight: “If it ain’t broke, don’t fix it.” This common phrase is used in reference to many things, even software applications. However, consider this: how detrimental is a ‘not-broke’ system that is inefficient and demands additional time and effort from the workforce to do their day-to-day jobs? Although modernizing your existing software application system might seem costly, if it is inefficient, you need to answer the question: how much is the existing system costing me?

It is costing you. Inefficiencies are expensive. That is the reason so many manufacturing organizations are working to become lean organizations. That is the reason technology exists: to eliminate inefficiencies. Don’t be fooled. Just because the system was the most efficient thing on the market 10 to 15 years ago doesn’t guarantee that it’s the most efficient solution for your organization today. Take the time to study how much efficiency a new system could create before making the decision to modernize. If you still think modernization is too expensive, add the cost of that inefficiency to your operating budget and look again.

Agedness = Feebleness:  One last consideration when determining whether modernizing your software is the right option for the business is the security of that system. You want your business data and confidential customer information to be secure and readily accessible at the same time. Just as the human body becomes feeble over time, so does the security of your application systems.

In an age when security is paramount to survival, your software applications could actually be putting your organization at risk. Multiple companies have had their corporate servers hacked and sensitive data breached; in some cases they lost many months of valuable company data, not to mention suffering public humiliation, loss of customers’ trust, legal ramifications, and very expensive recovery efforts. A new data breach or malicious hack makes the news almost weekly.

With that said, can you really afford to ignore that your older application systems and databases might be less secure than a new system?

Consideration #4: The Secluded Silo Syndrome

Many companies have purchased various software products over the years to meet some type of business or operational need. These purchases inevitably created silos in different parts of the organization, where it is hard to keep information (much of it duplicated) synchronized. Old software and old technology also mean that newly hired employees have no experience in that technology and require extensive training just to carry out their duties. And reporting is simply so much work that the business doesn’t get the information it wants when it needs it.

Some businesses have considered whether to spend money to develop something like a business intelligence layer over the top of the existing systems, or whether to spend their dollars to upgrade to a modern software system where all the data is stored in one place.  Past Chairman and CEO of General Electric, Jack Welch, once said that “If the rate of change on the outside exceeds the rate of change on the inside, the end is near.”

A key consideration faced by companies in this position is whether their existing software systems can effectively scale up to accommodate the planned growth of the business, or whether the limitations of these old systems will hinder that growth. Given the rate of change in business today, there will come a point at which the existing collection of systems just can’t accommodate the organization’s needs.

For example, can those applications support new governmental regulations or any necessary certification/accreditation requirements? With natural attrition, there is a growing gap between new and existing employees. Does a viable business really want to invest in training new employees on obsolete applications? Usually not. Therefore, the question becomes not “Should we change?”, but “When should we change?”

Everyone understands that there is a cost to replacing outdated systems and software tools, and this is usually the main reason management gives to delay any upgrades. But remember this: you’re also paying a daily price by continuing to rely on those clunkers, and possibly putting the business at undue risk.

Consideration #5: Acknowledge the Objections

Because old software or hardware is still in use, senior management personnel often see no need to replace it.  They think, “If it ain’t broke, don’t fix it.” There are many phrases that management teams have used over the years when defending the continuing use of old software and continuing to delay what they know in their heart really needs to be done, based on the feedback (and sometimes complaints) they are getting from their ‘down in the trenches folks’.  Some of these include:

  • The old system works fine.
  • We’ve always done it that way.
  • New technology costs lots of money.
  • New software will require a big investment in training time as well as money.
  • New software may require some customizations.
  • We’re busy; we don’t have time for this.
  • We’re slow; we don’t have the money for this (please ignore the new Escalade in the parking lot).

Everyone understands that management is simply trying to limit spending in any way it can. However, think about this: if you had never swapped out your old equipment and upgraded to better, faster, more efficient, cost-cutting machines, think about the problems you’d still be trying to solve! Software is a very critical component of your business, so it’s really not a question of if you should switch to better, faster, more efficient software, but when.

Consideration #6: Dangers of the Do-nothing Approach

For the sake of argument, let’s say that your company decides to change nothing, and continues to use the old software or hardware forever. Take a moment to think about what would happen…

At some point in time, your software will not work on newer machines and modern operating systems and browsers, because they are constantly changing. Here’s something on a more personal note to consider: if you oversee your company’s technology investment, you need to understand that when the old computer equipment and/or old software systems fail, and the business is ‘dead in the water’ without them, management is going to blame you. So why wait for a failure when you can be proactive rather than reactive? Start planning now and mitigate the risk. Assistance is available to help you develop a strategy for reporting on the risk of the situation, and to help senior management begin to study the possibility of replacing old technology based on effective risk management techniques.

At some point — and you never know when — some sort of system update or service pack may cause your old software to simply stop working. We have also heard from companies whose customers started complaining because they couldn’t get the right data in a timely fashion in order to continue doing business with them. Why risk losing a good customer? Even if time stands still at your company, it’s still moving right along for your customers (and your competitors).

Summing Up

The challenge now falls upon us, as business owners, to estimate the financial risk that each of our companies is taking when we choose to work with outdated computers, peripherals, operating systems, and the software we use to collect the day-to-day data we need to run the business and stay competitive. By explaining the cost of doing nothing to the rest of your management team, you may be able to persuade them to start doing something instead.

 

Author
Ron Beltz
Director of Sales – Strategic Accounts
THROUGHPUT | BLUESTREAK
