HPC Harvard’s Way


By Allison Proffitt, Digital HealthCare & Productivity staff

October 14, 2008 | High performance computing (HPC) at Harvard Medical School isn't just about IT; it's about bridging clinical and research needs to support research, applications, and storage across the enterprise. John Halamka, Chief Information Officer at Beth Israel Deaconess Medical Center, painted a picture of collaboration and growth around HPC that he believes will be transformative for Harvard Medical School.

Halamka gave the second morning keynote address last week at the Harvard Biomedical HPC Leadership Summit 2008, held October 6-7 at Harvard Medical School. "To start with, HPC at Harvard is really a story of Field of Dreams: build it and they will come. Instead of a baseball field we built clusters, and it worked really well," Halamka said.

When faced with the challenge of supporting not only a medical center but also a research institution, Halamka found that the historic model of decentralized storage and processing was not working. "Every lab wants to do its own thing," he says of the old approach. "Every lab is building a cluster in a closet; before you know it we're going to end up with dozens of small data centers scattered around the [campus], and clearly that's not a scalable or sustainable architecture."

The solution was a jointly built high performance computing service supporting research and the medical school. "If you build central shared research infrastructure, everybody wins," Halamka enthuses. "You avoid building dozens of local data centers, you avoid the problem of having orphan systems that aren't supported when people leave, but you also understand the needs of the local research."

Since the HPC service was put together in 2004, its foundation has been a centralized infrastructure that the entire community can benefit from. Individual labs buy the nodes and storage, but contribute them to the community cluster. "And then of course we'll use scheduling software to give you priority on those nodes, but when you're not using them, the whole community is benefiting from those nodes," he explains.
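
In practice, this is the "condo" model of cluster scheduling: labs that contribute nodes keep priority on them, and idle cycles go to the rest of the community. The sketch below illustrates the policy in Python; the node and lab names and the assign function are illustrative assumptions, not Harvard's actual scheduler configuration, which would live in a production tool such as LSF or Slurm.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    name: str
    owner: str                     # lab that bought and contributed the node
    running: Optional[str] = None  # lab whose job currently occupies it

def assign(node: Node, lab: str) -> bool:
    """Grant the node if it is idle, or preempt a guest job when the
    requesting lab owns the hardware. Owners always get priority on
    the nodes they contributed; idle nodes serve the whole community."""
    if node.running is None:
        node.running = lab         # idle node: any lab may use it
        return True
    if lab == node.owner and node.running != node.owner:
        node.running = lab         # owner reclaims the node from a guest job
        return True
    return False                   # busy, and the requester has no claim

node = Node("compute-01", owner="lab-a")
assert assign(node, "lab-b")       # community lab borrows the idle node
assert assign(node, "lab-a")       # owner preempts and reclaims it
assert not assign(node, "lab-b")   # guests wait while the owner computes
```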

The system works. “Over the course of the last couple of years, the school, sure, has provided some storage infrastructure, the data center, and the networks, but the nodes have been largely contributed by the community.”

In the medical school today, 500 active users are spread across all of the labs. The system houses almost 1,000 cores and 130 terabytes of network cached storage. With such buy-in, of course, Halamka's next big problem is storage. "Storage is one of the things that's keeping me up at night," he jokes. "Who would have thought a couple of years ago that petabytes would not be enough?"

Halamka sees storage at the heart of many of the challenges for Harvard Medical School and other clinical and research facilities, especially as researchers and clinicians are generating massive amounts of data each day. “We’ve put in de-duplication compression storage devices that have a 1:20 compression ratio, so that we’re able to archive this data very effectively. And it de-dupes at the block level, and so we’ve found a really quite effective method to store, compress, and if necessary retrieve relatively rapidly.”
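
Block-level de-duplication achieves ratios like this by splitting incoming data into blocks, fingerprinting each block, and storing only one physical copy per unique fingerprint; a file is then kept as a list of block references. The following is a minimal sketch of the idea, assuming fixed-size blocks and SHA-256 fingerprints; production appliances typically use variable-size chunking, add compression on top of de-duplication, and keep far more compact indexes.

```python
import hashlib

BLOCK_SIZE = 4096  # fixed-size blocks for simplicity

def dedupe_store(data: bytes, store: dict) -> list:
    """Split data into blocks, keep one physical copy of each unique
    block, and return the 'recipe' of block hashes needed to rebuild it."""
    recipe = []
    for offset in range(0, len(data), BLOCK_SIZE):
        block = data[offset:offset + BLOCK_SIZE]
        digest = hashlib.sha256(block).hexdigest()
        if digest not in store:    # each unique block is stored only once
            store[digest] = block
        recipe.append(digest)
    return recipe

def rebuild(recipe: list, store: dict) -> bytes:
    """Reassemble the original data from its recipe of block hashes."""
    return b"".join(store[digest] for digest in recipe)

store = {}
original = b"A" * 40960 + b"B" * 40960   # 80 KB, but only 2 unique blocks
recipe = dedupe_store(original, store)
assert rebuild(recipe, store) == original
print(f"logical: {len(original)} bytes, physical: "
      f"{sum(len(b) for b in store.values())} bytes")  # a 10:1 reduction here
```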

The success doesn't stop at Harvard. "We'll continue to expand the existing high performance computing cluster," Halamka says, "trying to build it into a regional computing resource."
