As we head into a week of industry-specific conferences, predictions about the rise of high-performance computing (HPC) are becoming reality. HPC for Wall Street and Bio-IT World are not your typical supercomputing conferences attended by staff of supercomputing centers tied to academia or national laboratories. These conferences are filled with systems engineers and IT professionals working in HPC environments that were built only recently to support new demands.
What is HPC?
HPC aggregates the computing power of many processors, directed by specialized software to work in parallel, so that large calculations can be completed in a practical timeframe.
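The idea above can be sketched in a few lines: split one large calculation into chunks, compute the chunks on separate processors, then combine the results. This is a minimal illustration using Python's standard library, not production HPC code; the function names and the toy summation are invented for the example.

```python
# Minimal sketch of the parallel-computing idea behind HPC: divide a large
# calculation across worker processes, then combine the partial results.
from multiprocessing import Pool

def partial_sum(bounds):
    """Compute one chunk of a large summation (a stand-in for real work)."""
    start, end = bounds
    return sum(i * i for i in range(start, end))

def parallel_sum_of_squares(n, workers=4):
    """Split [0, n) into one range per worker and sum the pieces in parallel."""
    chunk = n // workers
    ranges = [(i * chunk, (i + 1) * chunk if i < workers - 1 else n)
              for i in range(workers)]
    with Pool(workers) as pool:
        return sum(pool.map(partial_sum, ranges))

if __name__ == "__main__":
    # Same answer as a serial loop, but the work is divided among processes.
    print(parallel_sum_of_squares(1_000_000))
```

Real HPC schedulers and MPI libraries handle this division of labor across thousands of nodes rather than a handful of local processes, but the principle is the same.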
With HPC, leading companies gain faster discovery, greater accuracy, and deeper insight into their science, engineering, and business solutions.
Over the last several decades, industries have been transitioning to HPC environments to support modeling and simulation work of many kinds, from financial quantitative analysis to vehicle design.
Complexity Drives Cloud HPC Growth
Many industries must now manage increasingly complex applications that depend on large datasets. The infrastructure to support these workflows was once costly to purchase and expensive to maintain, but the cloud has made HPC attainable for companies where it was previously out of reach. Cloud HPC makes virtually unlimited parallel computing resources available on a pay-as-you-go basis. With this flexibility, companies both large and small are finding that HPC workflows are within easy reach.
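The economics behind that shift can be sketched with a back-of-the-envelope comparison. Every figure below is an invented placeholder; real prices vary widely by provider, hardware, and usage pattern.

```python
# Hypothetical comparison of an upfront on-premises cluster purchase versus
# pay-as-you-go cloud HPC. All dollar amounts are made-up illustrations.
def on_prem_cost(capex, annual_opex, years):
    """Total cost of owning a cluster: purchase price plus yearly upkeep."""
    return capex + annual_opex * years

def cloud_cost(price_per_node_hour, nodes, hours_per_year, years):
    """Total cost of renting equivalent capacity only while it is in use."""
    return price_per_node_hour * nodes * hours_per_year * years

# Example: a bursty workload that needs 100 nodes for only 500 hours a year.
owned = on_prem_cost(capex=2_000_000, annual_opex=200_000, years=3)
rented = cloud_cost(price_per_node_hour=1.50, nodes=100, hours_per_year=500, years=3)
print(owned, rented)
```

The design point is utilization: a cluster that sits idle most of the year still incurs its full capital and maintenance cost, while pay-as-you-go billing scales with actual hours of use.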
The HPC Infrastructure as a Service (IaaS) offerings of cloud providers such as Google Cloud Platform and Amazon Web Services are the fastest-growing segment of the market. While government, academia, and research account for the majority of the expected growth, many other industries have entered the market and are expected to continue doing so through 2020, according to a study by MarketsandMarkets. As shown in the graph below, no geographic region will be excluded from this rapid increase.
HPC for Wall Street
As mentioned, financial quantitative analysis is a growing HPC segment. The industry depends on a wide range of quantitative techniques that increasingly require high-performance computing to run at the necessary speed and scale. On Monday, April 4, engineers will gather in New York City for one of the few one-day conferences dedicated to discussion and education on HPC for financial services organizations. Avere will be there not only as an exhibitor but also with a presentation by Scott Jeschonek, Avere’s Director of Cloud Products. Now in its 13th year, HPC for Wall Street covers big data, cloud technology, low latency, networks, data centers, APIs, scalability, and cost savings for the global financial markets.
April 4, 2016
New York, NY
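Many of the quantitative techniques in this space are embarrassingly parallel. As one hypothetical illustration (not drawn from any conference material), a Monte Carlo estimate of a European call option's price can be split across processes, with every parameter value below invented for the example.

```python
# Illustrative sketch of a parallel quantitative-finance workload: Monte Carlo
# pricing of a European call option under the Black-Scholes model. All of the
# parameter values here are made up for demonstration purposes.
import math
import random
from multiprocessing import Pool

def simulate_payoffs(args):
    """Average discounted payoff over one batch of simulated price paths."""
    n_paths, s0, strike, rate, vol, maturity, seed = args
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_paths):
        z = rng.gauss(0.0, 1.0)
        # Terminal stock price under geometric Brownian motion.
        st = s0 * math.exp((rate - 0.5 * vol ** 2) * maturity
                           + vol * math.sqrt(maturity) * z)
        total += max(st - strike, 0.0)
    return math.exp(-rate * maturity) * total / n_paths

def price_call(n_paths=100_000, workers=4, s0=100.0, strike=105.0,
               rate=0.01, vol=0.2, maturity=1.0):
    """Distribute the simulation across processes and average the batches."""
    batches = [(n_paths // workers, s0, strike, rate, vol, maturity, seed)
               for seed in range(workers)]
    with Pool(workers) as pool:
        results = pool.map(simulate_payoffs, batches)
    return sum(results) / len(results)

if __name__ == "__main__":
    print(round(price_call(), 2))  # a rough estimate; varies with sampling
```

Because each simulated path is independent, doubling the number of workers roughly halves the wall-clock time, which is exactly the property that makes cloud HPC attractive for this class of problem.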
HPC Cloud at Bio-IT World
In this HPCwire article, Ari Berman, GM of Government Services at Bioteam, is quoted as saying that roughly 25% of life scientists require HPC to do their work. He goes on to explain: “Bench science is changing month to month while IT infrastructure is refreshed every 2-7 years. Right now IT is not part of the conversation [with life scientists] and running to catch up.”
To support those scientists, Bio-IT World has expanded its Cloud HPC coverage over its 14-year history. This year, attendees can choose from 260 speakers discussing big data, smart data, cloud computing, trends in IT infrastructure, genomics technologies, high-performance computing in the cloud, data analytics, open source, and precision medicine, from the research realm to the clinical arena.
Avere will be there in force and can be found in booth 536.
April 5-7, 2016
Seaport World Trade Center