Next month, the Astronomy Department will begin using a new telescope camera that stores a terabyte — 1000 gigabytes — of images per day on a high-tech supercomputer installed last November at West Campus.

The machine, called the “BulldogM” cluster, is part of the University’s Science Cores Initiative, launched in the fall of 2008 to expand Yale’s high-performance computing facilities. Such facilities link multiple computers, or use computers with multiple processors, allowing scientists to solve a given problem faster than a standard computer could and to tackle more complex problems, explained Andrew Sherman, an associate research scientist in the Department of Computer Science. Yale’s 16 machines have been used in a variety of applications — from astrophysics to economics — and Yale scientists said they hope to continue the University’s acquisitions, though the recession may pose an obstacle.

“It’s very difficult to do large-scale science these days without high-performance computers,” Sherman said.

Scientists use the supercomputers to store and analyze the large amounts of data generated around Yale, as well as to model complex physical, chemical and biological systems, said Steve Girvin, deputy provost for science and technology.

Yale purchased its first supercomputer, “BulldogA,” in 2004, followed by eight more through 2008. The Science Cores Initiative then enabled the University to purchase three more clusters in fall 2009, as well as to increase the computing capacity of some of its existing machines.

In total, the University has bought 15 such machines since “BulldogA,” including “BulldogM,” which was installed in November 2009. The University’s 10 operational supercomputing clusters are currently located at one of two Yale facilities: West Campus and the Biomedical High Performance Computing Center at the W.M. Keck Foundation Biotechnology Resource Facility at 300 George St.

The University recently bought a petabyte — a billion megabytes — of storage, largely for DNA sequencing at West Campus, Girvin said. In addition, a new temporary data center is being built at West Campus, though Girvin said long-term expansion will require a permanent center. He said he is also creating a faculty committee to advise on the University’s development of scientific computing resources.

Supercomputers now cost around $1 million each, Sherman said, with supercomputing making up about 5 percent of the Information Technology Services budget. And Sherman said the Keck Foundation facility, which is affiliated with the School of Medicine, has applied for grants from the National Institutes of Health to add more processors to its existing grid.

“We have … project[ed] forward the rapidly growing hardware hosting needs for the University and committed the funds needed to build this infrastructure,” Girvin said in an e-mail Wednesday. “We have been putting in some Yale money for the hardware itself but still need to rely heavily on grant funding, especially in the current budget climate.”

Currently, the West Campus clusters work on projects such as models of weather and climate patterns and simulations of galaxy formation, as well as other computer science applications. The clusters at George Street, installed five years ago, focus on analyzing DNA data to sequence human and animal genomes.

“It’s a very exciting development,” Paolo Coppi, professor of astronomy and physics, said of the telescope camera and its West Campus processor. “As we gather more and more information about the universe, we need technology to keep up with our capacity for discovery.”

Although the Physics Department remains the single greatest user of the University’s supercomputers, their use is by no means limited to the hard sciences, said Robert Bjornson GRD ’93, the co-director of the Biomedical High Performance Computing Center.

“We’re seeing a democratization of high-performance computing,” he said. “It’s definitely spreading in popularity across disciplines.”

Sherman and Bjornson listed geology and geophysics, ecology and evolutionary biology, economics and the Institution for Social and Policy Studies as examples of departments and programs outside the traditional hard sciences that have recently used or expressed interest in using Yale’s supercomputers.

The University’s high-performance computing resources also played a significant role in several recent faculty hires in the sciences, Sherman noted.

“Part of the process of hiring new faculty members is making sure that they’re satisfied with their access to technological resources,” he said. “That includes an acceptable amount of personal time on one of the University’s machines.”

This semester, Sherman is teaching the computer science course “Parallel Programming Techniques,” which will make heavy use of the high-performance computing clusters. Students in a similar course at Howard University, taught by computer science professor Wayne Patterson, will be able to tune in to Sherman’s lectures remotely via a live video and audio feed and will have limited access to Yale’s high-performance computing resources.