Mario Guagliano –
An important part of engineering has always been the ability to define models that can predict a system’s behaviour under its operational conditions. In structural design, for instance, calculation models aim to predict the stresses, strains and displacements caused by the loads applied in service, so that these quantities can be compared with the admissible values dictated by the strength of the components and materials and by the functions of the system. Similar considerations hold for the other fields of industrial design.
The difficulty of developing calculation models that are reliable and yet simple enough to serve as a practical tool for the engineer has, over the years, collided with the mathematical complexity of the phenomena involved. As a result, a suitable solution could be obtained only for a few simple cases, and it was generally necessary to resort to generous empirical correction coefficients to “adapt” the analytical solution to the case at hand.
The advent of computer technology and the growing computational power of processors, combined with the development of increasingly “user-friendly” simulation software capable of handling problems of very different natures and of high complexity, has left the deep mathematics behind the software to its developers, while users concentrate on defining the model and analysing the results.
This approach has greatly refined the results of structural design, making it possible to satisfy the ever-growing demands for lightness, reliability, performance and energy efficiency that characterize today’s industrial landscape worldwide.
What I have just written might suggest that little remains to be done, and that simulation as an engineering tool is now so mature that no great innovations should be expected in the near future.
In fact, several challenges remain open. One is the need for ever greater integration among the various simulation tools, so that models developed for certain purposes and in certain formats can also be used for different analyses and with software based on different logics. How many times, when translating a model from one software package to another, are we forced to rework and modify the original model to make it suitable for the new analysis? How many times do we lose information, sometimes realizing it only afterwards, because of the poor compatibility of the various “neutral formats”? Despite the great strides made, all this still remains one of the critical aspects of the design process and of so-called Product Lifecycle Management, or PLM.
Another important aspect worth underlining is the training of the engineer/analyst, i.e. the professional who performs the simulations: nowadays we simulate complex phenomena “with mouse clicks”, without needing a deep understanding of the phenomenon being simulated. This makes it important, on the one hand, to train operators properly and, on the other, to develop tools able to warn and guide them towards a correct analysis of the results.
In the light of the above, some readers might draw the conclusion opposite to the one just outlined, namely that the time is not yet ripe for a massive, integrated use of simulation in the design phase. Absolutely not! Simulation is already a fundamental and indispensable part of the design process. Much has been done in the directions I have indicated, and much more will be done in the near future. I invite readers not to miss the special issue dedicated to simulation enclosed in this number of the magazine: it offers an up-to-date survey of simulation tools and applications and will provide useful indications about the evolutions expected in this sector.