Knowledge is power. In the information age, we can accumulate, create, disseminate, and leverage knowledge in ways we could not have imagined twenty years ago. As a result, we can find answers to massively complex questions that were once beyond our reach: genomic sequencing, the physics of the universe, the intricacies of neuroscience, even the ability to predict the future. In much the same way that PCs and smartphones revolutionized the computing and information-gathering capabilities of individuals, advances in cloud computing are set to democratize high performance computing (HPC), bringing the power of supercomputers to a greater number and wider variety of users. As more people and organizations gain access to HPC capabilities, we can answer ever-bigger questions and develop even more complex tools, treatments, systems, and services for a broad range of commercial, scientific, and academic applications.
Virtually any of the major global challenges we face today can be addressed through information technology, but most are far beyond the means of end-user computing. As revolutionary as personal computing and mobile technology have proven to be, they simply can’t handle the volume or complexity of the data and calculations required to address phenomena like climate change, smart cities, digital manufacturing and supply chains, bioinformatics and pharmacogenomics, national defense, cyber security, and the IoT.
The digital transformation of commerce and communication has led to a Big Data explosion: we are creating and collecting massive amounts of data that lack meaning in a raw state but carry limitless potential for insight and innovation. We must process and analyze these data troves, and do so quickly, to extract the valuable answers and predictions that will help us combat global challenges and design better products and structures, ultimately preserving and enhancing the quality of human life.
Given the gravity and scale of the threats and opportunities we collectively face, it is imperative that we enable wider and easier access to our best problem-solving tools, HPC chief among them. After all, miracles of modern science and technology do not arise exclusively from places like Livermore Labs or CERN. Sometimes they originate in the proverbial garage or basement — tomorrow’s equivalent is HPC in the cloud.
How Cloud HPC Democratizes Big Compute
In much the same way that cloud infrastructure and everything-as-a-service offerings have made it possible for entrepreneurs, start-ups, and SMEs to leverage many of the same technology solutions as larger enterprises, HPC in the cloud is opening up access to supercomputer-level processing power. Purpose-built HPC cloud offerings are more affordable, more scalable, and easier to use. By empowering a broader audience of users with HPC capabilities, including those who aren’t IT specialists, we can accelerate innovation.
More creative individuals asking important questions driven by their particular experience, context, or culture, and able to explore those questions with powerful tools previously reserved for large, well-funded institutions, means more problems solved and more topics explored, including ones that don’t traditionally attract big dollars. Through collaboration across disciplines and organizations, the democratization of HPC helps build the critical mass needed to crowdsource ideas and approaches and to maintain momentum toward discovery and results. In turn, those results will be more readily shared, having been obtained more affordably: without multi-million-dollar investments in proprietary infrastructure, the resulting intellectual property is less likely to be tightly protected.
When infrastructure set-up and debugging tasks are minimized through outsourcing to an HPC cloud provider, more time and resources can be directed to discovery and analysis. With upfront investments greatly reduced by pay-as-you-go access, smaller groups and initiatives can afford to test theories, simulate conditions, and conduct pilot projects before making larger commitments or going after more funding. By nature, many inquiries requiring Big Compute operate at the cutting edge of their fields, and therefore carry significant risk. Being able to conduct their work on a purpose-built platform with the help of HPC experts, with compute resources that can scale with the project, reduces the risk to manageable levels.
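The pay-as-you-go economics described above can be sketched with simple break-even arithmetic. The figures below (cluster cost, hourly cloud rate) are purely hypothetical and chosen only to make the comparison concrete:

```python
# Back-of-the-envelope comparison of an upfront HPC cluster purchase
# versus pay-as-you-go cloud HPC. All dollar figures are hypothetical.

def breakeven_hours(cluster_cost, hourly_cloud_rate):
    """Hours of cloud usage at which cumulative cloud spend
    equals the upfront cluster investment."""
    return cluster_cost / hourly_cloud_rate

# Hypothetical: a $500,000 on-premises cluster vs. $25/hour
# for an equivalent cloud HPC allocation.
hours = breakeven_hours(500_000, 25)
print(f"Break-even at {hours:,.0f} cluster-hours")
# → Break-even at 20,000 cluster-hours
```

A group running pilot projects well below that break-even point pays only for the hours it uses, which is exactly why smaller initiatives can test theories before committing to larger investments.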
In similar fashion, the ability to extract real-time insight from Big Data repositories using HPC also contributes to democratization of knowledge and discovery. With easier access to machine learning, predictive analytics, modeling and simulations, and other forms of advanced analytics, the power dynamic shifts to those who ask the most timely questions and curate the most valuable data, not just those with the most resources.
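Many of the modeling and simulation workloads mentioned above are "embarrassingly parallel": independent batches of work that scale almost linearly with added compute. A minimal sketch, using a Monte Carlo estimate of pi split across worker processes (the worker count and sample sizes are illustrative, not a real HPC deployment):

```python
# Minimal sketch of an embarrassingly parallel simulation: a Monte
# Carlo estimate of pi, split across local worker processes. An HPC
# cloud platform automates scaling this same pattern to many nodes.
import random
from multiprocessing import Pool

def sample_batch(n):
    """Count random points in the unit square that fall inside
    the quarter circle of radius 1."""
    hits = 0
    for _ in range(n):
        x, y = random.random(), random.random()
        if x * x + y * y <= 1.0:
            hits += 1
    return hits

def estimate_pi(total_samples, workers=4):
    """Distribute the sampling across workers, then combine.
    Each batch is independent, so batches could just as easily
    run on separate cluster nodes."""
    batch = total_samples // workers
    with Pool(workers) as pool:
        hits = sum(pool.map(sample_batch, [batch] * workers))
    return 4.0 * hits / (batch * workers)

if __name__ == "__main__":
    print(estimate_pi(1_000_000))  # typically prints a value near 3.14
```

The same divide-and-combine shape underlies much larger jobs, which is why on-demand access to more cores translates so directly into faster answers.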
In the end, enhancing access to HPC resources for start-ups and SMEs is a boon to enterprise players as well. Global challenges require global collaboration, so the participation of groups from developing nations is essential. Likewise, multinational corporations depend on robust supply chains and smaller partners. As the digital transformation of manufacturing and the IoT ecosystem continue to evolve, downstream suppliers will need to raise their game in Big Data analytics, modeling and simulation, prototyping, cyber security, and risk management: all HPC-enabled capabilities they would be hard-pressed to develop without HPC-as-a-service options.
In periods of political and economic upheaval, research can be disrupted when government funding and support shift to more favored agendas. The democratization of HPC capabilities will create a buffer for organizations that suddenly need to get by on smaller private grants. Fields of inquiry suddenly deemed controversial, such as climate change science, can continue and develop more independence and resilience despite shifts in funding. In fact, analyzing and predicting weather patterns and disruptions is one of the up-and-coming uses of HPC; there are obvious practical limits to storm-chasing and studying storms in progress. Harnessing the power of HPC to model and predict major global events (weather, disease, economic crisis) has far-reaching consequences and accelerates life-saving discoveries when tackling emergent phenomena like megastorms, water and energy shortages, and global epidemics. Accessible HPC platforms also make it possible to study historically overlooked and underserved populations: recent award-winning examples include groundbreaking global research on the African diaspora and the hidden writings of women.
HPC and Cloud, the Future of Computing
The emergence of HPC as a mainstream force was recognized at the recent SC16 conference in Salt Lake City. IDC supercomputing experts noted the remarkable rise in ROI for HPC initiatives, particularly in high performance data analytics (HPDA), the intersection of Big Data and HPC. According to IDC findings from 673 installations across government, industry, and academia, each dollar invested in HPC generated $551 in revenue and $52 in profits or cost savings. These returns are sure to increase with wider adoption of HPC cloud offerings. IDC projects robust market growth for HPC and HPDA servers through 2020 and notes that a third of commercial HPC workloads are HPDA-related, with academic and government HPDA workloads climbing to between 16% and 20%.
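Taken at face value, the IDC per-dollar figures cited above imply striking multiples. A minimal illustration, applying those averages to a hypothetical project budget (the $100,000 figure is invented for the example):

```python
# Illustrative application of the IDC averages cited above:
# $551 in revenue and $52 in profit or cost savings per $1 of
# HPC investment. The project budget below is hypothetical.
REVENUE_PER_DOLLAR = 551
PROFIT_PER_DOLLAR = 52

def projected_returns(hpc_investment):
    """Project revenue and profit/savings from an HPC spend,
    assuming the IDC sector-wide averages hold."""
    return (hpc_investment * REVENUE_PER_DOLLAR,
            hpc_investment * PROFIT_PER_DOLLAR)

revenue, profit = projected_returns(100_000)
print(f"Revenue: ${revenue:,}; profit/savings: ${profit:,}")
# → Revenue: $55,100,000; profit/savings: $5,200,000
```

These are averages across very different installations, of course, so any single project's returns will vary widely around them.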
Cloud computing plays an important role in delivering the next generation of accelerated platforms and architectures. Accelerated computing is now embedded in cloud platforms, and supercomputing GPU- and FPGA-based workflows and tools are available from top vendors. Developers have access to a variety of GPU-based environments, including machines powered by NVIDIA Pascal GPUs. HPC infrastructure is increasingly compact thanks to advances in server density and cooling. Cloud platforms can automate bare-metal provisioning and workflow deployment on these systems, delivering the fastest performance and shortest “time to value” available anywhere.
The cloud is undoubtedly one of the primary drivers of HPC innovation, providing on-demand delivery of HPC to a broader range of users and applications. Simulation customers, developers, researchers, and investors alike should be excited about the world of opportunity being generated by the convergence of Big Data and Big Compute. As cloud offerings mature and become more widely available, confidence will build in HPC’s ability to power innovation and problem-solving at a scale we’ve only begun to imagine.