With powerful supercomputing abilities and a support team in place to help users, NIU’s new Center for Research Computing and Data (CRCD) is poised to usher in a new era of big-data scientific research on campus.
Just consider some of the center’s ongoing projects:
- Assembling and annotating the genomes of two species of petunia.
- Modeling the impact of urbanization on regional climate in East Asia.
- Developing applications for next-generation nuclear physics experiments.
- Aligning genome sequences of humans, primates and other mammals to obtain evolutionary relationships.
- Developing image reconstruction software using Proton Computed Tomography, which tracks the pathways of protons as they travel through a human body.
The center is available for use by faculty, staff and students whose research requires high performance computing. Typically, that means projects that involve complex modeling and simulations or parsing of huge data sets that would be unwieldy for ordinary computers.
“By offering high performance computing to our campus, the CRCD will strengthen NIU research, scholarship and artistry across all disciplines,” says Jerry Blazey, interim vice president of Research and Innovation Partnerships. “I urge the NIU community to take advantage of this tremendous resource.”
“If you have a research idea or project that might benefit from high performance computing, we want to be there to assist you in any way possible,” Karonis says. “Supercomputers already play a large research role in engineering and the natural sciences, and their usefulness is extending into the arts, humanities and social sciences as well.
“For students who are pursuing research, access to this center can be an invaluable part of their education, regardless of the discipline,” he adds. “Students will get hands-on experience learning high performance computing applications. These skills are no longer a luxury. They’re a necessity to be competitive in research and in the labor force.”
The center is operated through a joint effort between the divisions of Research and Innovation Partnerships and Information Technology. It grew out of the acquisition of a high-performance computer cluster in February 2012, which was used to ramp up on-campus capabilities to sort and analyze large quantities of research data.
The hybrid GPU/CPU supercomputing cluster has a capacity of more than 35 teraflops, meaning it can do more than 35 trillion calculations per second.
Scientists have already used the computing cluster, dubbed “Gaea” (pronounced GUY-uh) after the mythological Greek goddess who was the mother of all, on research that has resulted in about 50 publications, Karonis says. Another 49 projects involving 96 users are ongoing.
In addition to Gaea, the new center also offers use of a cloud computing cluster and a Field Programmable Gate Array (FPGA) system, a type of hardware frequently used to make sub-second calculations in the financial industry.
“In a broad sense, high performance computing makes possible a wide variety of projects that were previously impractical because the time required to find a solution was prohibitive or the size and scale of the project was too large to fit in the memory of a laptop, desktop or workstation,” says NIU physics professor Bela Erdelyi, acting deputy director of the center.
NIU computer science professor Kirk Duffin, research associates Caesar Ordoñez and John Winans, and Karonis are all members of the high performance computing support team, which can help students, faculty or staff determine how supercomputing can assist their research needs.
NIU physics chair Laurence Lurio can attest to its utility. His research team members used the supercomputer cluster for real-time analysis of X-ray images that they were measuring at the Advanced Photon Source at Argonne National Laboratory.
The experiment involved taking rapid X-ray movies of scattering from protein suspensions. The goal was to better understand the origins of eye diseases such as presbyopia by gaining insight into why the fluid properties of the eye’s lens change with age.
“We were measuring a thousand images per second, and without a supercomputer we would not be able to understand the results as they were generated in real time,” Lurio says.