Regulations Specs & Testing

Machine Learning Boosts Oil Analysis


Artificial intelligence and machine learning may conjure dystopian visions from the seemingly infinite series of Terminator movies. Instead, Bureau Veritas' oil analysis business has found these emerging technologies can accurately evaluate high volumes of lubricating oil samples faster, allowing more time for its human analysts to focus on critical samples that most need their attention.

This can help businesses grow without adding employees or overloading their existing workforce.

That's the message Cary Forgeron, Bureau Veritas' North American director for oil condition monitoring, shared at the ICIS-ELGI North American Industrial Lubricants Congress in Chicago. Forgeron has spent more than 15 years working with large industrial clients such as DuPont and Archer Daniels Midland, developing their oil analysis programs.

Forgeron presented a case study of predictive analytics, explaining how artificial intelligence can transform oil analysis. Artificial intelligence takes so-called big data to the next step by allowing computers to learn how to better analyze data as more information is fed into the program.

The challenge for businesses that aren't technology giants, he said, is learning how to best take advantage of what's been termed the Digital Revolution. "What we found out through this project is you don't have to be a Google or Microsoft to take advantage of what's going on," he told attendees at the September meeting.

Bureau Veritas laboratories perform oil analysis, also called oil condition monitoring. The company's customers are typically end users that work with industrial plants and high-efficiency equipment. They send samples of their lubricants, such as greases and metalworking fluids, to the laboratory.

"In our laboratory we perform a battery of tests on those," Forgeron said. The test results are analyzed by a data analyst, and the results are reported back to the customer, along with some maintenance recommendations.

Growing Fast

Forgeron was part of Analysts Inc., which was acquired by Bureau Veritas in 2014. "Prior to the acquisition, we were doing about one million samples a year in four laboratories. Then we had 15 data analysts that reviewed all of those samples, flagged test results and put comments and recommendations on them," he recalled.

Since 2014, the oil analysis operation has expanded to 17 laboratories globally, and the company has increased the number of samples examined per month by about 25,000. "We realized this business wasn't sustainable with our 15 analysts here in the [United States] supporting our operations in Europe, Asia, the Middle East, Australia, etc.," he said. Time zone differences created limitations, making it difficult to ensure analysts returned completed results within 24 hours. Workload was also a challenge.

The process typically started with oil samples arriving at the Bureau Veritas laboratory. The samples were tested, and data analysts examined the test results. They deemed a sample normal, meaning no action was needed; abnormal, meaning some type of corrective action was necessary but the customer had time to plan the maintenance; or critical, meaning the customer was instructed to shut down the equipment for immediate maintenance.
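As a rough illustration, that three-tier triage could be modeled as follows. The severity names come from the article; the choice of iron content as the input and the ppm thresholds are invented purely for the example:

```python
from enum import Enum

class Severity(Enum):
    NORMAL = "normal"      # no action needed
    ABNORMAL = "abnormal"  # plan corrective maintenance
    CRITICAL = "critical"  # shut down the equipment immediately

def classify(iron_ppm: float) -> Severity:
    """Toy classifier; the thresholds are hypothetical, for illustration only."""
    if iron_ppm < 50:
        return Severity.NORMAL
    if iron_ppm < 200:
        return Severity.ABNORMAL
    return Severity.CRITICAL
```

In the real workflow, of course, no such verdict exists until an analyst (or, later, the machine) has weighed dozens of test results together.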

"When samples come into the laboratory, we have no idea if that sample's normal or critical at that time," explained Forgeron. The classification wasn't assigned until a data analyst was able to evaluate test results.

Given the number of samples received, the company discovered that analysts could spend only 3.5 minutes per sample to keep up with the production pace. "So that's reviewing all the results, looking at the trends, putting the flagging on the results and writing the recommendations," Forgeron explained. "You can imagine there's some human error that happens there."

He noted that with 15 individual analysts writing recommendations, some inconsistencies may occur. One analyst may deem a sample abnormal, while another analyst might mark it as critical. Different analysts may also use slightly different terminology in their recommendations.

"In working with our customers, what we realized is they really don't care about the normal samples," he said. But analysts were spending the same amount of time on normal samples as on abnormal and critical samples that needed more attention.

The company saw the opportunity for digitization to assist in the oil sample analysis process.

Bureau Veritas' goal was to implement a system smart enough to determine which samples were normal and push them out the door, freeing up time for analysts to spend with abnormal and critical samples and to interact with customers, Forgeron said.

In its case study, the company discovered that the use of artificial intelligence enabled its human analysts to spend an average of 15 minutes on each sample. "That's essentially five times longer per sample before it goes out to the customer," he pointed out.

Improving efficiency was important to the company, not replacing humans with machines. "I think when a lot of people hear about artificial intelligence or automation, they think of people losing jobs," Forgeron said. "That's not really the case in this."

Consistency was another benefit of digitalization. "We needed these recommendations and the interpretations to be consistently all the same across all laboratories across the globe," he explained.

Machine Learning

The oil analysis business requested funds to wade deeper into artificial intelligence and see whether the concept would pan out. Could machines act like one of the company's data analysts, who are engineers and chemists with an average of 20 years of industry experience? "How do you reproduce that into a machine? How do you get those minds into the system?" Forgeron wondered.

The team on the project included only data analysts, with no one from the company's information technology department. "We did this as a business and operations case to figure it out, and then did the vendor selection," he recalled. "We wound up choosing Microsoft in the process and learning a lot from them. They're the subject matter experts and were able to bring solutions to our business that we didn't really think of."

Artificial intelligence and machine learning can sound intimidating to those who don't work in those fields, but their functions are fairly accessible. "Machine learning is getting computers to learn and act like humans and improve that learning over time," Forgeron explained. "What we want the computer to do is start to learn. But we've got to teach it first."

Machine learning requires teaching the computer, developing models for learning, then continually feeding in data. This learning loop includes a confidence factor. "[If] it's 90 percent confidence that this is normal, we can release it. If it's not 90 percent, it comes back to our data analysts. They review it, and it feeds into the machine."
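The confidence-gated loop Forgeron describes can be sketched in a few lines. The 90 percent threshold comes from the quote; the function name and routing labels are illustrative, not Bureau Veritas' actual system:

```python
CONFIDENCE_THRESHOLD = 0.90  # the "90 percent confidence" gate from the quote

def route_sample(prediction: str, confidence: float) -> tuple[str, str]:
    """Release high-confidence verdicts automatically; queue the rest for a human."""
    if confidence >= CONFIDENCE_THRESHOLD:
        return ("release", prediction)       # goes straight to the customer
    return ("analyst_review", prediction)    # analyst reviews it, then it feeds back in
```

Anything the model is less than 90 percent sure about loops back through an analyst, and that corrected verdict becomes fresh training data for the next round.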

Bureau Veritas' data set for analyzing oil samples is quite large. The company may process up to 1.2 million samples in a year, including elemental analysis for each test.

"We're looking at 27 or 28 individual test results per sample," Forgeron said, adding up to more than 28 million data points. "Start to add in OEM makes and models and fluid types, brands, manufacturers and grades, and you can see these data points increase drastically."
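The scale is easy to verify with back-of-the-envelope arithmetic, using the lower figure of 27 tests per sample:

```python
samples_per_year = 1_200_000  # "up to 1.2 million samples in a year"
tests_per_sample = 27         # "27 or 28 individual test results per sample"

data_points = samples_per_year * tests_per_sample  # 32,400,000
```

Even at the low end, that is comfortably over the "more than 28 million data points" Forgeron cites, before OEM, fluid and brand attributes multiply it further.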

Microsoft worked with Bureau Veritas to map out the oil analysis process so the tech company could gain an understanding of what the data meant. "What does 10 parts per million of zinc mean? What does 10 parts per million of iron mean?" Forgeron cited as examples. Just examining the data took six months.

The next step was data preparation, recognizing that, without training and learning, artificial intelligence is not so intelligent when it comes to, for example, recognizing that subtly different spellings and iterations of a company's name mean the same thing.

"You have to go through and scrub that data," he said. "That's where you start to build some of these models. If you teach the machine that those are the same or very similar, the machine can start to learn or see that in the future."
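A minimal sketch of that scrubbing step, assuming a fuzzy string match against a curated list of canonical names. The company names, the 0.6 threshold and the use of Python's difflib are all assumptions for illustration, not the approach the article describes in detail:

```python
from difflib import SequenceMatcher

# Hypothetical canonical names; a real pipeline would use a curated master list.
CANONICAL = ["Acme Corporation", "Globex Industries"]

def canonicalize(raw: str, threshold: float = 0.6) -> str:
    """Map near-duplicate spellings (e.g. 'ACME Corp.') onto one canonical form."""
    cleaned = raw.strip().rstrip(".").lower()
    best, score = raw, 0.0
    for name in CANONICAL:
        ratio = SequenceMatcher(None, cleaned, name.lower()).ratio()
        if ratio > score:
            best, score = name, ratio
    return best if score >= threshold else raw  # leave unmatched names untouched
```

Once variant spellings collapse onto one identifier, trends per customer, per machine and per fluid become visible to the model instead of being split across near-duplicates.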

The company then fed the machine large sets of data from completed tests that had already been analyzed by hand. "So we knew what the test results were," Forgeron explained. "We knew what the outcome should be before we put it in the machine." This data was used to test the computer's ability to learn how to return the correct analysis. "Round one, it was 3 percent [accurate]. Round two was 25 percent. Round three was close to about 80 percent. So it learned very quickly," he reported.
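The round-by-round scoring Forgeron describes amounts to comparing the machine's verdicts against outcomes already known from hand analysis. A minimal sketch, with illustrative labels:

```python
def accuracy(predictions: list[str], known_outcomes: list[str]) -> float:
    """Fraction of machine verdicts matching the analysts' known-correct verdicts."""
    correct = sum(p == k for p, k in zip(predictions, known_outcomes))
    return correct / len(known_outcomes)
```

For example, if the machine calls one of four samples critical where the analysts had marked all but one normal, three of four verdicts agree and the round scores 75 percent.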

Once test result meanings were established, the company had to teach the machine how to return useful recommendations. As an example, a report may include a paragraph detailing bad overall equipment condition with low oil viscosity caused by fuel dilution. The recommendation might suggest draining the oil and resampling in six months.

"That makes perfect sense to a data analyst that sees the problem," Forgeron observed. "But how does the machine know to take all of those recommendations, use all of those recommendations on this report and then order them in a manner that's meaningful to the user looking at the results?" What made this project complex was all the different things that needed to be figured out: the results' severity, the recommendations and also the data.

All the while, the artificial intelligence continued to learn and evolve. Forgeron explained that part of testing oil viscosity is setting limits within 15 percent of a center point. "That's good to use, but what we're finding is newer oils coming out might not be at that center point," he said.

"Newer oils start at the higher end or the lower end, so those set limits aren't really working where everybody was trained to look from that middle point. This AI system is actually looking at that, seeing it and adjusting those limits; it's already starting to learn with that," he highlighted.
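The shift from fixed to learned limits can be sketched as follows. The 15 percent band comes from the article; the centistoke values and the simple mean used for re-centering are assumptions for illustration:

```python
def fixed_limits(center_cst: float, pct: float = 0.15) -> tuple[float, float]:
    """Classic rule: flag viscosity more than 15% from a nominal center point."""
    return (center_cst * (1 - pct), center_cst * (1 + pct))

def learned_limits(observed_cst: list[float], pct: float = 0.15) -> tuple[float, float]:
    """Data-driven variant: re-center the band on what is actually observed."""
    center = sum(observed_cst) / len(observed_cst)
    return (center * (1 - pct), center * (1 + pct))
```

A newer oil that consistently runs at the top of its grade would trip a fixed band centered on the old nominal value, while the re-centered band tracks where that oil actually lives.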

Even though Forgeron and his colleagues work in laboratories for customers, it's now become all about the data. "I don't talk about test tubes and needles to customers anymore," he said. "I'm talking about the data: what they can do with the data and what we're doing with the data to improve their business."
