Simulation of thermal pulse evolution during laser debonding
Abstract
Temporary wafer bonding and debonding techniques are becoming ubiquitous in the world of 2.5D and 3D technology. After a decade of research and development, two room-temperature debonding techniques have emerged as industry front-runners: laser-assisted debonding and mechanical peeling, each with its particular strengths. Mechanical debonding depends on proper engineering of the relative adhesion strengths between the handler, the release layer, the adhesive, and the device wafer. Once separation is initiated at a wafer edge, the handler is peeled away, leaving the adhesive layer on either the handler or the device wafer, depending on the location of the release layer. Ultraviolet (UV) laser ablation, using an excimer source in combination with an x-y scanning stage or a solid-state laser paired with an optical scanner, has likewise been shown to be an effective debonding technique. In the case of laser-assisted debonding, questions arise as to the effectiveness of absorption in the material being ablated, the magnitude of the thermal pulse generated at the handler-release layer interface, and its evolution as it transits the adhesive layer to reach the device wafer surface. In this paper, we apply time-dependent thermal finite element modeling to predict the duration of heating and the maximum temperature excursion during the thermal evolution of the bonded structure, including the silicon back-end-of-line (BEOL) and device regions. The laser input is modeled as a distributed thermal pulse load determined by the input pulse energy, its spatial distribution, and the laser absorption depth. We find that, for an optimized release layer, adhesive layer, laser wavelength, and power level, the majority of the heat flux flows into the handler wafer, and heating of the device layers can be kept well below device limits. The effects of laser absorption depth in the release and adhesive layers, as well as their thicknesses and material properties, are discussed.
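The paper's analysis uses a full time-dependent thermal finite element model. As a rough, self-contained illustration of the underlying physics, the sketch below solves the 1-D transient heat equation through a layered handler/release/adhesive/silicon stack, with the laser pulse deposited as a Beer-Lambert volumetric source in the release layer. All layer thicknesses, material properties, absorbed fluence, absorption depth, and pulse duration are illustrative assumptions, not values from the paper.

```python
# Minimal 1-D transient heat-conduction sketch for the bonded stack
# (glass handler / release layer / adhesive / silicon device wafer).
# All numbers below are assumed for illustration, not taken from the paper.
import numpy as np

# Layer definitions: (thickness [m], conductivity k [W/m/K],
#                     density rho [kg/m^3], specific heat cp [J/kg/K])
layers = [
    (50e-6,  1.1, 2500,  750),   # glass handler (laser entry side)
    (0.2e-6, 0.2, 1200, 1500),   # release layer (absorbs the UV pulse)
    (20e-6,  0.2, 1100, 1700),   # adhesive
    (50e-6,  150, 2330,  710),   # silicon device wafer
]

dx = 0.05e-6                                   # grid spacing [m]
x_edges = np.cumsum([0.0] + [t for t, *_ in layers])
n = round(x_edges[-1] / dx)
x = (np.arange(n) + 0.5) * dx                  # cell-center coordinates

# Map material properties onto the grid
k = np.empty(n); rho = np.empty(n); cp = np.empty(n)
for (t, ki, ri, ci), x0, x1 in zip(layers, x_edges[:-1], x_edges[1:]):
    mask = (x >= x0) & (x < x1)
    k[mask], rho[mask], cp[mask] = ki, ri, ci

# Laser pulse: absorbed fluence F deposited per Beer-Lambert law
# within the release layer, over a top-hat pulse of duration tau.
F = 100.0           # absorbed fluence [J/m^2] (assumed)
d_abs = 0.1e-6      # absorption depth [m] (assumed)
tau = 25e-9         # pulse duration [s] (assumed)
absorb = np.where((x >= x_edges[1]) & (x < x_edges[2]),
                  np.exp(-(x - x_edges[1]) / d_abs), 0.0)
absorb /= absorb.sum() * dx                    # normalize: integral = 1 [1/m]

alpha = k / (rho * cp)                         # thermal diffusivity
dt = 0.2 * dx**2 / alpha.max()                 # explicit stability limit
k_face = 2 * k[:-1] * k[1:] / (k[:-1] + k[1:]) # harmonic mean at interfaces
T = np.zeros(n)                                # temperature rise above ambient [K]

t_sim, t_end = 0.0, 200e-9
while t_sim < t_end:
    # FTCS conduction update with adiabatic outer boundaries
    q = k_face * (T[1:] - T[:-1]) / dx         # flux between neighboring cells
    dT = np.zeros(n)
    dT[:-1] += q
    dT[1:] -= q
    src = absorb * (F / tau) if t_sim < tau else 0.0
    T += dt * (dT / dx + src) / (rho * cp)
    t_sim += dt

print(f"Temperature rise at device-wafer surface: {T[x >= x_edges[3]][0]:.1f} K")
```

Even this crude model exhibits the qualitative behavior described in the abstract: the pulse heats the thin release layer strongly, while the temperature rise reaching the device wafer is attenuated by conduction into the handler and transit through the adhesive. The full finite element treatment in the paper additionally captures the lateral spot profile and realistic layer properties.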