‘Rubbish in, rubbish out’ is a well-accepted fact in the world of simulation. Regardless of the technical capabilities of your thermal simulation tool, the accuracy of its predictions will always be tightly coupled to the accuracy of the input data. The most accurate thermal IC package model is a so-called ‘detailed’ model, in which all of the internal construction is represented explicitly in terms of geometric sizes and material properties. Certain parts of the package construction are easily measured and well characterised. Other parts much less so, especially the highly resistive thermal interface layers. The misfortune is that such layers form the biggest thermal bottleneck to heat flow in the package, so the operating junction temperature (and its prediction) is most sensitive to them.
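To see why a thin interface layer dominates, it helps to do the arithmetic for a 1D series stack of planar layers, where each layer contributes R = t / (k·A). The layer thicknesses, conductivities and die area below are purely illustrative assumptions, not data for any real package:

```python
# Sketch: 1D series resistance stack from junction to case, showing how a
# thin, low-conductivity interface layer can dominate the total resistance.
# All dimensions and material properties are illustrative assumptions.

def layer_resistance(thickness_m, conductivity_w_mk, area_m2):
    """Thermal resistance of a planar layer: R = t / (k * A), in K/W."""
    return thickness_m / (conductivity_w_mk * area_m2)

area = 5e-3 * 5e-3  # 5 mm x 5 mm die footprint (assumed)
layers = {
    "silicon die":      layer_resistance(300e-6, 150.0, area),
    "die attach (TIM)": layer_resistance(25e-6,    2.0, area),
    "copper slug":      layer_resistance(1e-3,   390.0, area),
}
total = sum(layers.values())
for name, r in layers.items():
    print(f"{name:18s} {r:6.3f} K/W ({100 * r / total:4.1f}% of total)")

power = 2.0  # W dissipated (assumed)
print(f"junction-to-case rise: {power * total:.2f} K")
```

With these assumed numbers the 25 µm die attach layer contributes roughly three quarters of the junction-to-case resistance despite being by far the thinnest layer, which is exactly why errors in its characterisation hurt the temperature prediction the most.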

Short of destroying a physical sample of the package, studying it under an SEM and characterising the material properties of its constituent parts (if you even have enough of each material to do that!), what choices do you have?

One non-destructive option is transient thermal measurement: power the die, record the junction temperature response over time and process that response into a structure function. The structure function is a graph of the cumulative thermal capacitance vs. the cumulative thermal resistance that the heat experiences as it travels from the junction (at the origin) out to the surrounding ambient.
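For the simplest case the idea can be sketched directly: if the heat flow path is represented as a Cauer RC ladder ordered from junction to ambient, the cumulative structure function is just cumulative capacitance plotted against cumulative resistance. The ladder values below are illustrative assumptions, not the result of any real measurement:

```python
# Sketch: cumulative structure function of an assumed Cauer RC ladder.
# Each element is (R in K/W, C in J/K), ordered junction -> ambient.
from itertools import accumulate

ladder = [(0.05, 1e-4), (0.5, 5e-4), (0.1, 2e-3), (2.0, 0.05)]  # assumed

cum_r = list(accumulate(r for r, _ in ladder))
cum_c = list(accumulate(c for _, c in ladder))

# Each point is one step along the heat flow path; plateaus in C indicate
# resistive layers, steep rises indicate capacitive (bulky) ones.
for r, c in zip(cum_r, cum_c):
    print(f"R_cum = {r:5.2f} K/W   C_cum = {c:8.2e} J/K")
```

Deriving the ladder itself from a measured temperature transient involves deconvolution (network identification) and is the hard part that the measurement tooling does for you; the point here is only what the resulting curve represents.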
Now consider doing the same thing numerically in FloTHERM. Take your detailed IC package model, simulate its thermal response to a change in its power dissipation and derive a numerical structure function. Compare that with the ‘true’ experimentally derived curve and you can identify exactly where the numerical model is in error … and by how much.
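The comparison step can be sketched as follows: treat both curves as cumulative (R, C) point sets, interpolate the simulated curve onto the measured resistance axis, and report the first resistance at which the capacitances disagree by more than a tolerance. That resistance locates how far from the junction the model first goes wrong. The curve data and the `first_divergence` helper are illustrative assumptions, not any tool’s actual API:

```python
# Sketch: locating where a simulated structure function first departs from
# a measured one. Curve data is illustrative, not from a real package.

def interp(x, xs, ys):
    """Piecewise-linear interpolation of ys(xs) at x (xs ascending)."""
    if x <= xs[0]:
        return ys[0]
    for i in range(1, len(xs)):
        if x <= xs[i]:
            t = (x - xs[i - 1]) / (xs[i] - xs[i - 1])
            return ys[i - 1] + t * (ys[i] - ys[i - 1])
    return ys[-1]

# (cumulative R in K/W, cumulative C in J/K) -- assumed example curves
measured  = ([0.0, 0.1, 0.4, 0.8, 1.2], [1e-4, 3e-4, 2e-3, 1e-2, 5e-2])
simulated = ([0.0, 0.1, 0.5, 0.9, 1.3], [1e-4, 3e-4, 1e-3, 8e-3, 4e-2])

def first_divergence(meas, sim, rel_tol=0.3):
    """First measured R at which the simulated C differs by > rel_tol."""
    for r, c in zip(*meas):
        c_sim = interp(r, *sim)
        if abs(c_sim - c) > rel_tol * c:
            return r
    return None

r_div = first_divergence(measured, simulated)
print(f"curves first diverge near R = {r_div} K/W from the junction")
```

Knowing the divergence resistance, you can then look up which layer of the detailed model sits at that distance from the junction and adjust its properties accordingly.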

Of course this detailed-model calibration process wouldn’t be required if you knew all the input data to an acceptable level of detail. Package suppliers are much closer to that end point than end users, who (in the absence of a detailed model provided by their vendor) have no option but to attempt to reverse engineer such a calibrated model themselves. Either way, both sides of the supply chain would benefit from such a calibration methodology.
If this isn’t done, there is an increased risk of inaccurate input data, inaccurate temperature predictions, bad thermal design decisions and costly business failures. No one wants that.
20th October 2011. Ross-on-Wye