Multiscale modeling is a broadly used term for any approach in which a physical problem is solved by capturing a system's behavior and important features at multiple scales, particularly multiple spatial and/or temporal scales. For instance, the picture below is a temporal multiscale representation of the origins of life.
What is considered a "composite" is always changing. Just as there is no single definition, there is also no single analytical method that can safely predict their dynamic behavior. Just as you can't obtain ideal performance by using a single material throughout an entire car, you can't expect a single analytical method to predict the behavior of all composites.
The Rule of Mixtures is probably the best-known and most widespread method of estimating composite properties. Its popularity in composite design circles is also its main problem: the Rule of Mixtures has been overused, applied to cases that do not come close to respecting its original, simplifying assumptions. If you wish to trust your analysis, it is essential to find out when it is OK, and (more importantly) NOT OK, to use the Rule of Mixtures. This article will describe what this rule really says, and will show some consequences of abusing this "rule of thumb" for composite behavior.
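To make the rule concrete, here is a minimal sketch of the two classic forms: the standard Rule of Mixtures (Voigt, iso-strain, an upper bound) and the inverse Rule of Mixtures (Reuss, iso-stress, a lower bound). The E-glass/epoxy stiffness values and 60% fiber volume fraction below are illustrative assumptions, not data from any specific source:

```python
def voigt_modulus(Ef, Em, Vf):
    """Rule of Mixtures (iso-strain upper bound): E = Vf*Ef + (1 - Vf)*Em."""
    return Vf * Ef + (1.0 - Vf) * Em

def reuss_modulus(Ef, Em, Vf):
    """Inverse Rule of Mixtures (iso-stress lower bound): 1/E = Vf/Ef + (1 - Vf)/Em."""
    return 1.0 / (Vf / Ef + (1.0 - Vf) / Em)

# Illustrative values: E-glass fiber (~72 GPa) in epoxy (~3.5 GPa), 60% fiber volume
Ef, Em, Vf = 72.0, 3.5, 0.60
print(voigt_modulus(Ef, Em, Vf))  # longitudinal (fiber-direction) estimate, GPa
print(reuss_modulus(Ef, Em, Vf))  # transverse estimate, GPa
```

Note how far apart the two bounds sit (roughly 44.6 GPa vs. 8.2 GPa here): picking the wrong form, or applying either outside its iso-strain/iso-stress assumptions, changes the predicted stiffness by a factor of five.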
In our last blog, we talked about the importance of learning from failure in materials testing. For better or worse, theoreticians have, in some ways, tried to take the burden of "learning from failure" off the plate of the everyday engineer. Instead, they try to capture the insights gained from failure in flexible analytical theories; theories that (theoretically) allow us to predict a part's behavior without knowing anything more than some material properties and part dimensions. In computer science, this is known as abstraction.
The question is: can composite failure theories sufficiently abstract all the nuance out of composite design? Do you need to understand the origins of a failure theory in order to use it properly?
Collectively evaluating individual failure theories:
If you are like us, you spend a great deal of time keeping up on the latest and greatest publications in your respective fields. When we find a paper we really like, especially one related to the characterization of a novel material, we like to simulate the findings in MultiMech and compare them against our code's predictions. This serves as a way for us to stay in touch with our potential users and to ensure that our tools continue to provide accurate results.
Topics: Composites Engineering
As was mentioned in Part I, the history of Finite Element Analysis is deeply intertwined with the evolution of computing. It seems only fitting that the FEA software used to design the world's most cutting-edge products should have the most cutting-edge computational techniques at its disposal.
From the early punched-card days of the '60s through the 2000s, FEA companies have found unique ways to take advantage of the ever-changing computing landscape.
GUIs - "1984 won't be like 1984":
1983 - The Apple "Lisa" was released. Named after Steve Jobs's daughter, the computer would be a commercial flop, but it paved the way for the graphical user interface and the industry-changing Macintosh.
1985 - The same year that Microsoft unveiled the Windows OS, AutoCAD 2 was released. It was designed to run on "microcomputers," including two of the new 16-bit systems, the Victor 9000 and the IBM Personal Computer (PC). This version consisted of over 100,000 lines of C code and had a list price of $2,000.
1985 - Altair Engineering was founded in a garage in Detroit, MI. Their first product was HyperMesh, followed by the award-winning FE-based topology optimization tool, OptiStruct. A product they would later acquire, the RADIOSS finite element solver, required 20 hours to solve a 20,000-element crash simulation in 1987. Fast forward to 2013, and RADIOSS can parallelize a 15-million-element crash simulation across 128 cores and deliver results in 5 hours. That represents a nearly 4000% increase in computational power. Most of this gain, however, can be attributed to the doubling of computational speed every 18 months.
1991 - NEi Software was founded as Noran Engineering, Inc. Their product, NEi Nastran was a spinoff of the original MSC - NASA codebase, but with a GUI and improved performance.
Topics: Finite Element Analysis