While personalized medicine shifts from experimental to mainstream, many health organizations are realizing not only that the IT infrastructure to support it is lacking, but that the infrastructure required looks very different from what typically supports a patient-centered health system. In a recent Healthcare IT News article covering a Black Book Market Research survey, respondents described precision medicine as “very hard to implement.” The data required differs from what feeds EHR-driven and connected systems, as it depends on “fine-toothed data that most hospitals simply don’t have.”
Last week in San Diego, leaders in life science research came together to discuss the convergence of science and information technology. The Converged IT Summit, hosted jointly by Cambridge Healthtech and BioTeam, presented opportunities not only to network with peers and carefully selected vendors, but also to discuss how data is being used to deliver on research missions. Not surprisingly, with $1 billion in play, the hot topic was preparing to act on the United States' initiative to transform cancer from a life-threatening disease into one that can be managed without harsh, immune-system-impairing treatments.
When one thinks of healthcare IT, visions of doctors and nurses typing away on laptops mounted on wheeled carts come to mind. Lately, to improve personalized service, hospitals have shifted these stations into “charting rooms” that look like a cramped college computer lab from the 90s, before every kid had a computer in their dorm room. However, healthcare IT that supports research looks very different. As innovative research into personalized and precision medicine increases, how these two very different arms of the same system operate becomes an increasing focus.
Independent Research Institutes play a key role in everything from discovering new vaccines to better understanding the natural world to providing both protection and new resources for better living. Mostly non-profit organizations, these centers of science are supported primarily through grants. This reliance on grants comes with unique pressures and requires strategic management of resources to deliver on missions within funded timeframes.
These institutions also rely on federal policy, which directly impacts funding and research initiatives. The Association of Independent Research Institutes (AIRI) works to keep its members informed of developments in Congress and across the federal government on issues, policy, and risk related to biomedical and behavioral research and health care. Beyond policy, the association provides learning opportunities through its conferences and webinars.
In early 2015, we worked with Cambridge Healthtech Institute (CHI) to conduct a survey to understand where research organizations stood with regard to building a scientific IT infrastructure that supported their research goals. The responses showed disparity in many areas, but allowed us to identify five areas where these life science organizations held relatively uniform opinions:
At Bio-IT World this week, attendees had more interest in the cloud than ever. The life sciences industry is seeing an opportunity to do more with less infrastructure and correct the mismatch between research demands and IT budgets.
As we head into a week of industry-specific conferences, the predicted rise of high-performance computing becomes reality. HPC Wall Street and Bio-IT World are not your typical supercomputing conferences attended by those who work for supercomputing centers married to academia or national laboratories. These conferences are filled with systems engineers and IT professionals living in an HPC environment that was only recently built to support new demands.
The pharmaceutical and drug discovery industry brings life-saving medicines that can dramatically improve patient care. While globalization and advances in science secure the industry's future for decades to come, it is not without hurdles that threaten to slow advancement. Recently, changes in regulations coupled with new healthcare payer models have challenged past business models. But perhaps one of the more controllable threats is taking place within the company's own walls.
According to the PwC report From vision to decision: Pharma 2020, productivity has flatlined for some time, yet internal research and development processes have been slow to change. On the development side, the macroenvironment is positive, which should be a catalyst for improvement: processing power has become more plentiful and affordable, advances in genomics have made new research possible, and computer technology has enabled more data, from analysis to patient outcomes, to be efficiently used to test, collaborate, and discover more.
Challenged by time-sensitive, collaborative projects that require large data sets, the scientific community has welcomed cloud computing as a possible panacea for big, difficult IT challenges. The use cases span all types of science, from space exploration to geology and from genomics research to artificial intelligence. All of these use cases have something in common: an ever-increasing demand for compute and storage resources. Today, meeting that demand may mean turning toward public clouds.
Since cloud computing is just moving into the mainstream, use cases for cloud bursting are still taking shape. Two early-adoption bursting use cases already being tested and approved are in the life sciences. First, research and clinical environments using genomics data for discovery and treatment are bursting applications to the cloud to gain compute capacity and accommodate demand from researchers (cloud for genomics research). Second, bioinformatics companies are looking to move workloads to the cloud to help develop complex analytical algorithms that better interpret large genomics data sets (cloud for scientific computing).
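At its simplest, cloud bursting routes work to on-premises resources first and spills over to a public cloud only when local capacity is exhausted. The minimal Python sketch below illustrates that routing decision; the `Job` type, the core counts, and the `route_job` function are hypothetical, invented for illustration, and not taken from any particular scheduler or cloud provider.

```python
from dataclasses import dataclass


@dataclass
class Job:
    """A hypothetical unit of work, e.g., a genomics alignment run."""
    name: str
    cores: int  # CPU cores the job requests


def route_job(job: Job, free_on_prem_cores: int, burst_enabled: bool = True) -> str:
    """Prefer the local HPC cluster; burst to public cloud when it is full.

    Returns "on-prem", "cloud", or "queued" (when bursting is disabled
    and the local cluster cannot accommodate the job).
    """
    if job.cores <= free_on_prem_cores:
        return "on-prem"
    return "cloud" if burst_enabled else "queued"


# Example: a small alignment job fits locally; a large assembly bursts.
print(route_job(Job("alignment", 64), free_on_prem_cores=128))   # on-prem
print(route_job(Job("assembly", 512), free_on_prem_cores=128))   # cloud
```

Real schedulers weigh far more than a core count, such as data locality, egress costs, and queue wait times, but the overflow-when-full decision above is the essence of bursting.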