Finished chips coming in from the foundry are subject to a battery of tests. For those destined for critical systems in cars, those tests are particularly extensive and can add 5 to 10 percent to the cost of a chip. But do you really need to do every single test?
Engineers at NXP have developed a machine-learning algorithm that learns the patterns of test results and figures out which subset of tests is really needed and which tests they could safely do without. The NXP engineers described the approach at the IEEE International Test Conference in San Diego last week.
NXP makes a wide variety of chips with complex circuitry and advanced chip-making technology, including inverters for EV motors, audio chips for consumer electronics, and key-fob transponders for securing your car. These chips are tested with different signals at different voltages and at different temperatures in a test process called continue-on-fail. In that process, chips are tested in groups and are all subjected to the complete battery of tests, even if some parts fail some of those tests along the way.
“We have to ensure stringent quality requirements in the field, so we have to do a lot of testing,” says Mehul Shroff, an NXP Fellow who led the research. But with much of the actual manufacturing and packaging of chips outsourced to other companies, testing is one of the few knobs most chip companies can turn to control costs. “What we were trying to do here is come up with a way to reduce test cost in a way that was statistically rigorous and gave us good results without compromising field quality.”
A Test Recommender System
Shroff says the problem has certain similarities to the machine learning-based recommender systems used in e-commerce. “We took the concept from the retail world, where a data analyst can look at receipts and see what items people are buying together,” he says. “Instead of a transaction receipt, we have a unique part identifier, and instead of the items that a consumer would purchase, we have a list of failing tests.”
The NXP algorithm then discovered which tests fail together. Of course, what is at stake in whether a buyer of bread will also want butter is quite different from whether a test of an automotive part at a particular temperature means other tests don't need to be done. “We need to have 100 percent or near 100 percent certainty,” Shroff says. “We operate in a different space with respect to statistical rigor compared to the retail world, but it's borrowing the same concept.”
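The paper's code isn't published, but the retail analogy maps onto a simple co-failure analysis. The Python sketch below is an illustration only, with made-up test names and a pairwise rule standing in for whatever NXP actually uses: it flags a test as a removal candidate only when every part that failed it also failed some other test, echoing the near-100-percent-certainty bar Shroff describes.

from collections import defaultdict

# Hypothetical data: each record is (part ID, set of failing tests).
# In the analogy, the part ID plays the role of a retail receipt and
# the failing tests play the role of items bought together.
test_logs = [
    ("part_001", {"vmin_25C", "vmin_125C"}),
    ("part_002", {"vmin_125C"}),
    ("part_003", {"leakage_25C", "vmin_25C", "vmin_125C"}),
    # ...many more parts, most of which fail nothing at all
]

def removal_candidates(logs, min_confidence=1.0):
    """Flag tests whose failures are (almost) always accompanied by a
    failure of some other test -- a pairwise sketch of the
    association-rule idea, not NXP's production algorithm."""
    fail_counts = defaultdict(int)      # how often each test fails
    co_fail_counts = defaultdict(int)   # how often a pair fails together

    for _, failing in logs:
        for t in failing:
            fail_counts[t] += 1
            for other in failing:
                if other != t:
                    co_fail_counts[(t, other)] += 1

    candidates = {}
    for t, n_fail in fail_counts.items():
        for other in fail_counts:
            if other == t:
                continue
            # Confidence of the rule "t fails -> other also fails".
            confidence = co_fail_counts[(t, other)] / n_fail
            if confidence >= min_confidence:
                candidates[t] = other   # 'other' covers every failure of 't'
                break
    return candidates

print(removal_candidates(test_logs))

Lowering min_confidence below 1.0 would trade certainty for more removals, which is exactly the dial the retail world sets far more loosely than chip testing can afford to.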
As rigorous as the results are, Shroff says they shouldn't be relied upon on their own. You have to “make sure it makes sense from an engineering perspective and that you can understand it in technical terms,” he says. “Only then, remove the test.”
Shroff and his colleagues analyzed data obtained from testing seven microcontrollers and applications processors built using advanced chipmaking processes. Depending on which chip was involved, they were subject to between 41 and 164 tests, and the algorithm was able to recommend removing 42 to 74 percent of those tests. Extending the analysis to data from other types of chips pointed to an even wider range of opportunities to trim testing.
The algorithm is a pilot project for now, and the NXP team is looking to extend it to a broader set of parts, reduce the computational overhead, and make it easier to use.