Simulated sample heating from a nanofocused X-ray beam
Research output: Journal contribution › Article in scientific journal
Recent developments in synchrotron brilliance and X-ray optics are pushing the flux density in nanofocusing experiments to unprecedented levels, which increases the risk of different types of radiation damage. The effect of X-ray-induced sample heating has been investigated using time-resolved and steady-state three-dimensional finite-element modelling of representative nanostructures. Simulations of a semiconductor nanowire indicate that the heat generated by X-ray absorption is efficiently transported within the nanowire, and that the temperature becomes homogeneous after about 5 ns. The most important channel for heat loss is conduction to the substrate, where the heat transfer coefficient and the interfacial area limit the heat transport. While convective heat transfer to air is significant, thermal radiation is negligible. The steady-state average temperature in the nanowire is 8 K above room temperature at the reference parameters. In the absence of heat transfer to the substrate, the temperature increase at the same flux reaches 55 K in air and far beyond the melting temperature in vacuum. Reducing the size of the X-ray focus at constant flux increases the maximum temperature only marginally. These results suggest that the key strategy for reducing X-ray-induced heating is to improve the heat transfer to the surroundings.

Time-resolved finite-element modelling is used to study the sample heating from intense X-ray irradiation.
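The steady-state findings in the abstract can be illustrated with a lumped heat-balance sketch: absorbed X-ray power is balanced by conduction to the substrate and convection to air, with radiation neglected (the paper finds it negligible). This is a minimal toy model, not the paper's finite-element method; all numerical values below are illustrative assumptions, not parameters from the study.

```python
def steady_state_dT(p_abs, h_sub, a_sub, h_air, a_air):
    """Temperature rise when absorbed power balances conduction to the
    substrate (h_sub * a_sub) and convection to air (h_air * a_air).
    Thermal radiation is neglected, as in the paper's conclusion."""
    return p_abs / (h_sub * a_sub + h_air * a_air)

# Hypothetical parameters for a nanowire on a substrate (assumed, not from the paper):
p_abs = 1e-6    # absorbed X-ray power, W
h_sub = 1e8     # nanowire-substrate heat transfer coefficient, W/(m^2 K)
a_sub = 1e-14   # nanowire-substrate contact area, m^2
h_air = 1e4     # effective convective coefficient to air, W/(m^2 K)
a_air = 1e-12   # nanowire surface area exposed to air, m^2

dT = steady_state_dT(p_abs, h_sub, a_sub, h_air, a_air)
dT_no_substrate = p_abs / (h_air * a_air)  # heat loss through air only

print(f"With substrate contact:    dT = {dT:.2f} K")
print(f"Without substrate contact: dT = {dT_no_substrate:.1f} K")
```

Even with these made-up numbers, the model reproduces the qualitative trend reported in the abstract: removing the substrate pathway raises the steady-state temperature by orders of magnitude, because the interfacial conduction term dominates the denominator of the heat balance.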
|Journal||Journal of Synchrotron Radiation|
|Status||Published - 2017 Sep 1|
|Peer review performed||Yes|