Posted on 28 November, 2022
Since the advent of mainstream computing, simulations have been used across many industries to speed the pace of development and innovation, and to cut down on production errors and costs. Simulations can cover everything from stock market trends to the flow of liquid within a nuclear reactor. To be effective, a simulation must match the system it models as closely as possible. The problem is that the more data you add to a simulation, the more compute power is required to complete it. Given the size of the problem spaces involved in most modern simulations, HPC hardware is now the only reasonable way to get usable results in a timely manner.
Historically, simulations solved problems using a linear, brute-force approach. Nowadays, as problem spaces have grown exponentially, artificial intelligence and machine learning techniques have been developed to help cut down on processing time and increase confidence in results. AI/ML techniques pose problems of their own, however: they still require training with large amounts of data before their answers can be trusted. Once the training phase has been completed, though, AI/ML has been proven to provide consistently reliable answers, and even to uncover truths that would previously have been impossible for a human to find.
In the case of financial markets, historical data is used to try to predict future outcomes and trends within a market sector. However, this data can be affected by factors such as governmental policy changes, currency fluctuations, and other economic forces that are not easily uncovered by looking at the raw numbers alone. To account for these, you would gather data from other sources, such as news articles, and find a common index by which to match the two data types: in this instance, the date and time at which each article was published. The next problem is that the two data types are vastly different from each other, i.e. columns of numbers versus paragraphs of text. This is where a modern technique, natural language processing/understanding (NLP/NLU), can be used to identify the tone of an article (for instance, company X has performed well, or country Y is having financial difficulties) and quantify how that tone relates to the numbers. You are now in a much better position to identify events within your financial market, or to predict how future events might affect it.
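The joining step described above can be sketched in a few lines of Python. Everything here is illustrative: the prices, the articles, and the tiny keyword lexicon standing in for a real NLP/NLU sentiment model are all hypothetical, and the common index is the publication date, as in the text.

```python
from datetime import date

# Hypothetical daily closing prices for a market index (illustrative numbers only)
prices = {
    date(2022, 11, 1): 101.2,
    date(2022, 11, 2): 99.8,
    date(2022, 11, 3): 103.5,
}

# Hypothetical news articles, keyed by publication date
articles = [
    (date(2022, 11, 1), "Company X has performed well this quarter"),
    (date(2022, 11, 2), "Country Y is having financial difficulties"),
]

# A toy lexicon standing in for a trained NLP/NLU sentiment model
POSITIVE = {"well", "growth", "strong"}
NEGATIVE = {"difficulties", "loss", "weak"}

def tone(text):
    """Score an article: +1 per positive word, -1 per negative word."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

# Join the two data types on their common index: the publication date
merged = []
for pub_date, text in articles:
    if pub_date in prices:
        merged.append((pub_date, prices[pub_date], tone(text)))

for row in merged:
    print(row)
```

In practice the tone score would come from a trained language model rather than a word list, and the join would typically be done with a data-frame library, but the principle is the same: a shared date/time index lets numeric and textual data be analysed side by side.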
Case Study: Boeing
Simulations were used by Boeing to help develop the 787 Dreamliner. Overall, more than 800,000 hours of HPC compute time were logged simulating different wing designs for the new plane. In the end, Boeing only needed to physically prototype 11 wing designs, as opposed to the 77 prototypes required for the 767.
Case Study: Callaway Golf Clubs
Callaway models new club designs virtually using high-performance computers. Jobs that once took 40 hours to complete can now be done in just eight hours. The faster processing time means the company can test more design options.
Case Study: Procter & Gamble, Folgers Coffee
P&G switched from metal cans to plastic packaging for its Folgers coffee to improve the freshness of the coffee and to reduce costs. During shipping, however, changes in atmospheric pressure when drivers went up and down mountains could cause the plastic packages to implode. So, P&G used high-performance computers to design packaging that can withstand the pressure.
Boston Ltd have been supplying HPC solutions for over 30 years. We are here to guide you on your HPC journey, helping you to uncover truths about your data and empowering you to make your next business-critical decision.