
BENCHmark January 2005

Data Management - the Theory in Practice

How do we know that our answers are correct? What level of confidence can we place in our results? Is our analysis “fit for purpose”?

The increasingly widespread use of simulation tools such as Finite Element Analysis (FEA) and Computational Fluid Dynamics (CFD) is one of the engineering success stories of the last few decades. All those who are involved with the technology can – justifiably – claim to have developed products with improved quality, increased safety and/or reduced costs. Yet these most fundamental of questions can still cause difficulty.

NAFEMS has always endeavoured to play a leading role in improving the confidence that can be placed in simulation results. In the early years, this focused on developing internationally recognised benchmarks that software developers could use to verify that their algorithms were correctly coded.

In more recent years, although new benchmarks are still being actively developed by the organisation, the emphasis has shifted more towards demonstrating confidence in the entire analysis process. For example, the Knowledge Base article in this issue refers to the SAFESA procedure with which NAFEMS was involved.

However, there are no easy – or complete – answers. It remains a highly topical subject. One of the earliest posts on the recently formed North American discussion group raised precisely this issue, and quickly generated a flurry of replies.

Many of the meetings of the FENet project have spent a considerable amount of time discussing “fitness for purpose” and related topics. For example, the meeting in Glasgow had a lengthy session talking about the definitions of Verification and Validation as well as the responsibilities of both the software developer and the analyst. It was interesting to note that a reasonably representative selection of experts found it difficult even to reach a consensus on defining the terms Verification and Validation. So much so, in fact, that one of the first tasks that the newly formed Analysis Management Working Group has set itself is to agree some standard definitions. (Contributions to this debate are welcome at the Analysis Management Working Group section of the website).

Over the coming years, much time will be spent debating these topics at NAFEMS events and developing material to help engineers provide answers to the questions raised.

Tim Morris Chief Operating Officer
January 2005 

Articles are available for NAFEMS Members to download below.

Knowledge Base 006 - Commercial Analysis Validation


CAE Data Management at Audi AG

AUDI reveal how they have turned the theory into practice.

Simulation of the Effects of Condensation Induced Waterhammer

Fluid Structure Interaction and correlation with test in steam systems

Child's Play

Meeting Stringent Cost, Safety and Quality Issues - Simulation of the manufacturing process in the most juvenile products

Simulation of the Locking Mechanism of an Injection-Moulding Clamp Unit

Multi-Body Simulation, Fluid Power Simulation, and Finite Element Analysis working together

CFD for Fun - Fluid Dynamics used in Soapbox Racing
