Optoelectronic Device News
Connecting Theory and Practice in Optoelectronics
Computer simulations of the real world can provide valuable insight, but they may also produce unrealistic and misleading results. We need to recognize possible pitfalls if we want to avoid them. All simulations are based on mathematical models. The form of such models varies, from analytical formulas to complex equation systems. However, models always simplify the real world and create a virtual reality in which quite unreal things can happen. Pitfalls loom at all stages of a simulation project.
It’s time again to reflect on my peer review experience over the past year. Supported by the availability of high-end commercial software, the number of journal paper submissions on optoelectronic device simulation keeps rising. However, authors often seem to view such software as a magic tool that instantly delivers realistic results. Mathematical models always simplify reality. But how simple is too simple? Some papers don’t even discuss the underlying theory. Different levels of simplification are possible, each based on specific assumptions, and certain assumptions may be inappropriate in a given situation. That is why high-end software packages offer alternative modeling approaches and let the user decide. In other words, the user should have a detailed understanding of the internal device physics and of the models provided by the software.
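To make the idea of simplification levels concrete, here is a small, hypothetical illustration (not taken from any reviewed paper): the same diode is described once by the ideal Shockley equation and once by a slightly less simplified model that adds a series resistance. The saturation current, thermal voltage, and resistance values are assumed for illustration only.

```python
import math

V_T = 0.02585  # thermal voltage at room temperature (V)
I_S = 1e-12    # assumed saturation current (A), illustrative value

def ideal_diode(v):
    """Simplest model: ideal Shockley equation, no parasitics."""
    return I_S * (math.exp(v / V_T) - 1.0)

def diode_with_rs(v, r_s=1.0):
    """Less simplified model: same equation plus a series resistance r_s.
    Solve I = Is*(exp((V - I*r_s)/Vt) - 1) by bisection; the right-hand
    side is monotonically decreasing in I, so the root lies in [0, V/r_s]."""
    lo, hi = 0.0, max(v / r_s, 1e-12)
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if I_S * (math.exp((v - mid * r_s) / V_T) - 1.0) > mid:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

At low bias (around 0.4 V here) both models agree closely, so neglecting the series resistance is harmless; near 0.7 V the ideal model overestimates the current by roughly an order of magnitude. The same simplifying assumption can thus be perfectly acceptable in one operating regime and inappropriate in another.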
However, this is only the first step of a successful simulation strategy. The next step is the evaluation of the material parameters used in the software. Initial simulation results are typically far off from measured characteristics because key parameters are inaccurate, and published values are quite scattered in some cases. If crucial parameters cannot be measured directly on the device, they should be varied in the simulation until quantitative agreement with measurements is achieved. If such effort fails, or if the fitted value lies outside the published range, the model itself may be inadequate. On the other hand, competing models may deliver nearly identical results (see picture), so that more decisive measurements are needed. Such a calibration process is often difficult and time-consuming, but in my view, it is the only way to accomplish realistic simulations. Otherwise, calculated results are unreliable and may lead to wrong conclusions.
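A minimal sketch of such a calibration loop, with hypothetical numbers rather than data from the post: suppose the diode ideality factor n is the uncertain parameter. We generate "measured" I-V points from a known n (standing in for lab data), then sweep candidate values until the simulated curve matches the measurement.

```python
import math

V_T = 0.02585  # thermal voltage at room temperature (V)

def diode_current(v, i_s, n):
    """Shockley diode equation: I = Is * (exp(V / (n*Vt)) - 1)."""
    return i_s * (math.exp(v / (n * V_T)) - 1.0)

# Hypothetical "measured" I-V points; in practice these come from the lab.
voltages = [0.40, 0.45, 0.50, 0.55, 0.60]
measured = [diode_current(v, 1e-12, 1.8) for v in voltages]  # true n = 1.8

def sse(n, i_s=1e-12):
    """Sum of squared relative residuals between model and measurement."""
    return sum(((diode_current(v, i_s, n) - m) / m) ** 2
               for v, m in zip(voltages, measured))

# Calibrate the ideality factor by a simple grid sweep, mimicking how an
# uncertain parameter is varied in a device simulator until the simulated
# curve agrees quantitatively with the measured one.
candidates = [1.0 + 0.01 * k for k in range(101)]  # n in [1.0, 2.0]
best_n = min(candidates, key=sse)
print(f"calibrated ideality factor n = {best_n:.2f}")
# prints: calibrated ideality factor n = 1.80
```

A real calibration would use a proper least-squares fitter and, as argued above, would then check whether the fitted value stays within the published range before trusting the model.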
Whenever you try to answer a research question using numerical simulation, you start by developing a mathematical model (or reusing an already developed one). Next, you develop (or reuse) software to perform the numerical simulation. This produces data that you then analyse and visualise in order to interpret the results. Finally, you might write it all down in an article and publish it on the arXiv and/or in a scientific journal. In this published form the results consist, apart from a couple of figures or tables, mainly of text. In most cases the mathematical models, software, data, visualisations and so on are not shown, or not shown in full. This makes it difficult for editors, reviewers and readers alike to fully grasp the research and its results (see, e.g., How to get your simulation paper accepted). Moreover, it makes the results difficult to validate and, in many cases, impossible to reproduce.
In a time when scholarly publication was limited to printed journals and books, it was simply not feasible to provide long rows of numbers, let alone interactive 3D figures or animated image sequences. However, with the advent of the digital age and its easily accessible, easy-to-use infrastructures and tools, there is no excuse for not publishing the full research story, which consists of far more than plain text.