Texas Advanced Computing Center

Using the computing power of the Texas Advanced Computing Center (TACC) at UT-Austin, researchers are making progress in understanding the biological mechanisms of West Nile virus. The computational heavy lifting was done on TACC's Stampede2, Jetstream and Lonestar 5 systems.

“Humans get infected with West Nile virus by the bite of an infected mosquito,” said Margo Brinton, lead author and professor of biology at Georgia State University. “The virus replicates primarily in white blood cells in the blood. In a few people, it can cross the blood-brain barrier and replicate in the brain’s neurons, causing (inflammation of the brain).”

Today the virus is found in Africa, southern Asia, Australia, the Middle East, North and Central America and Europe, according to Brinton.

West Nile virus has an RNA genome, or genetic material, which it replicates in the cells it infects. Certain conserved structures, including a stem loop on the RNA strand, have been shown to play an important role in regulating the virus's ability to replicate its genome.

The researchers had previously discovered a protein called TIAR that interacts with this stem loop structure. Now, in a paper published in Analytical Chemistry in January 2017, they report that TIAR proteins bind the RNA stem loop at a ratio of 4-to-1.

The researchers said the finding is important because it shows that viral RNA interactions inside the cell can be far more complex than previously thought.

“We think the cellular protein TIAR acts to enhance viral genome production inside infected cells,” Brinton said. “We are continuing to define this complex viral RNA-cell protein interaction by using other physical chemistry and biochemical techniques.”

Borries Demeler, a co-author on the paper and researcher at the University of Texas Health Science Center, works with a technique called analytical ultracentrifugation, which played an important role in the study.

“Analytical ultracentrifugation is a separation technique in which high centrifugal forces are used to separate molecules of different sizes and shapes mixed in a solution,” Demeler said. “It can then measure the size and shape of different kinds of molecules and indicate if they are interacting, and if so, how strongly.”
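
As a concrete illustration of the quantity such an experiment measures, the sketch below estimates a sedimentation coefficient from how far a molecular boundary moves between two snapshots of the spinning sample, using the textbook relation between boundary position, rotor speed and time. It is a minimal Python example with invented numbers (the function name, radii, times and rotor speed are all hypothetical), not the researchers' actual analysis software.

```python
import math

def sedimentation_coefficient(r1_cm, r2_cm, t1_s, t2_s, rpm):
    """Estimate a sedimentation coefficient (in Svedberg units, S).

    r1_cm, r2_cm: radial positions of the sedimenting boundary (cm)
    t1_s, t2_s:   times at which those positions were observed (s)
    rpm:          rotor speed in revolutions per minute

    Uses the textbook relation s = ln(r2/r1) / (omega^2 * (t2 - t1)),
    where omega is the angular velocity in radians per second.
    """
    omega = 2.0 * math.pi * rpm / 60.0              # rad/s
    s_seconds = math.log(r2_cm / r1_cm) / (omega ** 2 * (t2_s - t1_s))
    return s_seconds / 1e-13                        # 1 Svedberg = 1e-13 s

# Hypothetical example: a boundary moving from 6.0 cm to 6.3 cm
# over 30 minutes at 40,000 rpm comes out to roughly 15 S.
print(round(sedimentation_coefficient(6.0, 6.3, 0.0, 1800.0, 40000), 1), "S")
```

Larger, more compact molecules sediment faster, which is how the technique reports on both size and shape; real experiments fit entire concentration profiles rather than two points, which is where the heavy computation Demeler describes below comes in.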

Demeler developed analysis software for research that uses analytical ultracentrifugation, but processing the resulting data can require significant computational power.

“The separation methods used in (analytical ultracentrifugation) are computationally extremely complex and require a demanding algorithm for modeling,” Demeler said. “Due to the very large data amounts, large computers are needed to extract this information, which was done on supercomputers at the TACC.”

Demeler also said researchers used a new technology, a multiwavelength detector, which can separate molecules based on their optical properties in addition to size and shape. Because different types of molecules absorb light at different wavelengths, biological molecules like DNA, RNA, carbohydrates and lipids all show different absorbance patterns.

Using the high sensitivity of the detector, the researchers were able to resolve the different types of molecules in their biological samples.

“It gives us a new way to look at complex interactions occurring in a living cell,” Demeler said.
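
The idea behind the optical part of that separation can be sketched simply: because absorbances add (the Beer-Lambert law), a mixture's spectrum is a weighted sum of the pure-component spectra, and those weights can be recovered by least squares. The wavelengths, spectra and concentrations below are invented for illustration only; the real multiwavelength analysis uses measured spectra at many more wavelengths and far more sophisticated fitting.

```python
import numpy as np

# Hypothetical extinction coefficients (arbitrary units) for pure RNA and
# pure protein at four wavelengths; nucleic acids absorb most strongly
# near 260 nm, proteins near 280 nm.
wavelengths_nm   = np.array([240, 260, 280, 300])
rna_spectrum     = np.array([0.55, 1.00, 0.50, 0.05])
protein_spectrum = np.array([0.30, 0.55, 1.00, 0.10])

# Beer-Lambert: the mixture's absorbance is a weighted sum of the pure spectra.
true_amounts = np.array([2.0, 5.0])                 # [RNA, protein], arbitrary units
basis = np.column_stack([rna_spectrum, protein_spectrum])
mixture = basis @ true_amounts + np.random.normal(0.0, 0.01, size=4)  # add a little noise

# Least-squares deconvolution recovers each species' contribution.
estimated, *_ = np.linalg.lstsq(basis, mixture, rcond=None)
print("estimated [RNA, protein] amounts:", np.round(estimated, 2))
```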

Demeler said he expects these computational techniques to find a wide variety of applications in biological research, especially in studying interactions between dissimilar biological molecules such as DNA, RNA and proteins.

“The problem we tackled with the West Nile virus is just one example of a nearly infinite range of interactions and systems that could be studied using this technique,” Demeler said. “I expect many new discoveries will result from this ability to examine mixtures in greater detail and help answer important questions in biochemical research, contributing to developing cures for cancer and many other diseases.”

The UT-Austin Texas Advanced Computing Center (TACC) is providing data storage and security for Children's Optimal Health (COH) to help assess the quality of child health at the community level in the greater Austin area.

The COH is a non-profit organization that collects and analyzes data on the geographical distribution of factors affecting childhood health, said COH representative Susan Millea. Its analyses aim to improve policy decisions affecting children's well-being.

“Children’s Optimal Health was formed to serve as an independent, trusted third party that could receive data from community partners … to envision and assess what the needs are at the neighborhood level,” Millea said.

The COH analyzes community assets and risk factors, such as health care coverage, medical clinic locations, disease incidence, parental employment, school district assignment, housing costs, parental commute times and more, Millea said.

“We look at data that would not only reflect the geographic distance (to health care services), but also … the social variables that would impact access to care,” Millea said.

The COH also analyzes data regarding a wide range of childhood health issues, such as obesity and behavioral issues, according to Millea.

“When we’re working on projects, we’re looking at health disparities as experienced by children,” she said. “And we’re looking at equity issues associated with those disparities.”

The TACC is providing the COH with both a data storage platform and much-needed security, said Chris Jordan, a TACC representative.

“There’s a reliability component,” Jordan said. “We apply data protection measures within the data center that allow us to have high confidence … that there are backup copies of data stored (and) nothing’s going to corrupt it.”

Much of the educational and health care information being processed by the COH is confidential under the Health Insurance Portability and Accountability Act (HIPAA) and the Family Educational Rights and Privacy Act (FERPA), Jordan said. In addition to ensuring that the COH's data won't be lost, the TACC is also providing access-related security measures for these sensitive personal records.

“The essential part is that we have both a HIPAA and FERPA-compliant secure data environment,” Millea said. “(An environment) that can be trusted by all of those who would choose to share their data with us.”

In addition to storage and security, the partnership may also allow the COH to more easily collaborate with UT researchers, the Dell Medical School and the Population Research Center, Millea said.

“What the TACC system is setting up for us is an increased potential to work more closely with university systems … and the data that they’re using,” she said. “They are also working hard to address these issues that affect children and families.”

Analyzing data to better inform social policy is a new and expanding area of study, Jordan said.

“This is a good example of how we see the ubiquity of data and statistical analysis,” Jordan said. “It’s extending the reach of institutions like TACC and the way that we’re able to have an impact in areas like the social sciences.”

The Texas Advanced Computing Center will undergo a $20 million expansion to its facilities at the J.J. Pickle Research Campus over the next year. 

“We were asked what our highest priority was for the center, and one of those was expanding our space and research capabilities,” TACC spokeswoman Faith Singer-Villalobos said.

With the UT System Board of Regents giving the project final approval Thursday, construction is set to begin later this year and is projected to be complete by January 2016. The UT System will fund $10 million of the project, while an anonymous donor put forth the other $10 million. The new building, located on the northeast quadrant of the J.J. Pickle campus, will include office space and a “visualization lab.” The 1,500-square-foot lab will consist of large, flat-panel monitors for researchers to observe data.

“It will be a state of the art facility, and it will allow researchers to study large-scale data analysis and visualizations at extremely high quality,” Singer-Villalobos said.

According to Singer-Villalobos, the facility is classified as a “comprehensive cyber-infrastructure,” which is a technological environment dedicated to research and science. Singer-Villalobos said that there has always been a visualization lab on the UT campus, but TACC recently expressed the need for one of its own on the J.J. Pickle campus.

“TACC’s primary goals are to provide the office space that would consolidate staff from multiple locations into one,” senior project manager Jim Shackelford said.

The building will house about 70 new employees and 20 students, according to Singer-Villalobos, and there will also be an auditorium for 260 people and a “flexible training” room for 50 people. 

Shackelford said TACC provides research capabilities to those at UT and researchers from around the world.

“TACC has really put UT-Austin on the map as a technological hub,” UT spokesman Gary Susswein said. 

Singer-Villalobos said the TACC has been on campus since 2001 and conducts research in the field of advanced computing while also providing resources such as data-driven computing, data analysis and cloud storage. She also said TACC supports more than 3,000 active and funded research projects.

“I think you’re going to see a lot more research being done with this expansion,” Susswein said.

The University appointed Dan Stanzione as executive director of the Texas Advanced Computing Center, or TACC, on July 1, ahead of what is expected to be a busy year for the center.

Stanzione has served as TACC’s deputy director for more than five years, and his recent appointment to the executive director position comes after serving as acting director since January. The center’s stated mission is to design and provide extremely powerful computing capabilities for use by the open scientific and engineering research community.

Stanzione has supervised the creation and implementation of several of TACC's computing systems, and, in addition to his duties as director, he will serve as principal investigator for Wrangler, the center's upcoming data analysis and management system. Wrangler's primary focus will be memory- and data-intensive applications.

“We want to let people solve their problems and solve these problems faster,” Stanzione said.

Santiago Sanchez, a biochemistry and Plan II junior in the Freshman Research Initiative, said he has used TACC resources to streamline computational problems in his research. Sanchez said some simulations the group has run would be too large in data size for the initiative’s internal computers to run efficiently.

“TACC allows us to supercharge our simulations and then transfer the vitally important information to our own disks,” said Sanchez.

Charles Jackson, a research scientist at the UT Institute for Geophysics, said TACC resources have allowed him to run a variety of experiments simultaneously and scale up his climate models, which produce huge terabyte-scale data sets.

“TACC is really good at running hundreds of experiments at a time,” Jackson said.

The growing bearing of large data sets on the STEM fields, as well as on areas such as the social sciences and business, is not lost on Stanzione and TACC. He stresses that these data sets, or "big data," represent a collection of problems and technologies people will need to grapple with in all areas of life in coming years.

“There has been an explosion of data in science, as well as outside of science,” Stanzione said.

Sanchez said big data has the potential to become central to decision-making in multiple fields, including business analytics and healthcare. 

“I feel we’re moving into an era where no decision will be made without petabytes of information behind it,” Sanchez said. 

At the helm of TACC, Stanzione has an expansive plan for growing the center's technology and reach.

“My main goal is to diversify what we do,” Stanzione said of his concept for TACC’s future.

Stanzione's plans for the near future include expanding staffing, launching Wrangler next January, creating event space for public exhibitions and opening a new building with more meeting areas for the groups of scientists and engineers using TACC resources in their work.

A new supercomputer at UT will transform numbers into pictures, an intuitive way of sharing information. 

Beginning in January 2014, students and faculty will have access to a new supercomputer called “Maverick,” which specializes in visualization and data analysis.

The Texas Advanced Computing Center, a national supercomputer facility, recently announced the anticipated introduction of Maverick, a supercomputer that will replace the center's current system, “Longhorn.”

“This will be a whole new system with … [a] faster processor, significantly more memory for each processor and top of the line graphic processing cards,” said Niall Gaffney, the center’s director of data intensive computing.

Gaffney said in the research process, visualization is essential to reveal patterns and trends in data that scientists may otherwise have missed.

“Visualization can show you things you weren’t explaining, which is important when you’re doing research,” Gaffney said. “I call that the ‘aha’ process. You look at something and say, ‘Oh that’s funny.’ Often the only way you find things is looking at things differently than the way you normally look at them.”

Computer science senior Bo Chean said transforming data into pictures, Maverick’s specialty, makes analyzing information easier.

“When you have words and numbers, there’s an extra step your brain has to go through,” Chean said. “Pictures are more intuitive.” 

In the past, supercomputers at UT have been funded by the National Science Foundation, which has required they be available for scientists across the country. Because funding for Maverick came from the O’Donnell Foundation, a private donor and longtime supporter of UT and the center, Gaffney said the center would be able to reserve more of Maverick’s use for students and faculty at UT.

“This is a system not being funded by National Science Foundation, so we’re running this for the folks we will be working with,” Gaffney said. “About 50 percent [of its use] will be reserved for people here in the UT system.”

Social media has recently become a source of large amounts of data, and scientists and statisticians are beginning to explore applications of mining it, Gaffney said.

“You could use this information real time from social media sites to do things very powerfully you wouldn’t be able to do otherwise,” Gaffney said. “We want to push forward on that from the data mining side and explain to people what’s going on.”

The center’s Deputy Director Dan Stanzione said in the digital age, people can easily generate large amounts of information, but the difficulty is finding significance in large data sets.

“Our ability to generate data is huge,” Stanzione said. “It’s easy to generate trillions of bytes of information. That’s way too much information to read. It’s one thing to say you have a hundred terabytes of information about cancer and it’s another thing to say you know what that means.”

A $6 million grant will go to the Texas Advanced Computing Center at UT and its partners to fund the development and production of Wrangler — a new data analysis and management system for the national open science community.

The computing system is scheduled for production in January 2015 and has already been designed in principle, according to Jay Boisseau, the director of the computing center. Boisseau said Wrangler's storage system will be large enough to store hundreds of national research projects in a safe and reliable way. Indiana University, a partner in the project, will host a replica of the storage system so researchers will be able to access data from both sites.

“[Wrangler] will be the most replicated, secure storage [system] for the national open science community,” said Dan Stanzione, the deputy director at the computing center. “Wrangler will be one of the highest performance data analysis systems ever deployed.”

Boisseau said once the system is running, researchers from any university or government lab can access it. He said Wrangler will be free to those who apply and compete for use of the system, and he hopes UT researchers will use it frequently.

“We hope that UT will embrace and play a large role in the sciences that develop,” Boisseau said. “We’re very excited to get a chance to represent the saying ‘What starts here changes the world.’”

Dell Inc. and DSSD Inc. are partners of the computing center for this project.

“Not all the technology for the system has been developed yet,” Boisseau said. “The two partners are crafting the system on site so it can go into production in early 2015.”

However, Boisseau said the computing center is the leader in the project because it has the high-end analysis site.

“We’re showing leadership in creating the most capable storage system with a unique analysis system,” he said. “We hope this will help establish TACC as a leader in the data intensive sciences.”

The National Science Foundation granted the initial $6 million award for the deployment of Wrangler, but Boisseau said representatives of the center have requested an additional $6 million for after the system goes into production. He said the funds will be split among the partners contributing to the development of the system.

Bob Chadduck, of the National Science Foundation's Division of Advanced Cyberinfrastructure, said Wrangler advances the foundation's vision of tackling complex data-intensive challenges and problems.

“The National Science Foundation is proud to support the community-accessible, data-focused resources to advance science, engineering and education,” Chadduck said. 

Hundreds of Xeon Phi coprocessors fill tables in the Texas Advanced Computing Center. The newly introduced coprocessor, designed by Intel, is the innovative component of TACC's Stampede supercomputer.

Photo Credit: Pearce Murphy | Daily Texan Staff

The University is trying to stake a claim as a leader in interdisciplinary science research with the recent installation of the world's most powerful academic supercomputer at the Texas Advanced Computing Center (TACC).

The system, named Stampede, became operational on Jan. 7. The TACC staff and Dell engineers installed and tested the supercomputer during a six-month period, said Tommy Minyard, director of advanced computing systems at TACC. The National Science Foundation (NSF) funded the initial $27.5 million cost as part of its “eXtreme Digital” program and will continue to fund Stampede operations for four more years. 

President William Powers Jr. said the addition of Stampede to the University’s facilities only augments its prestige as a premier research campus. 

“Stampede is a game-changing supercomputer that reinforces UT’s role as a supercomputing hub and a world-class research university,” Powers said in an emailed statement. “It will help scientists solve some of the world’s most pressing problems and it will promote collaboration across campus and across the country. Jay Boisseau and the faculty and staff at TACC are at the heart of something very big.”

TACC was founded in 2001 and is located on the J.J. Pickle Research Campus, according to the TACC website. It is one of the top centers for computational science used by researchers nationwide. 

Stampede's power comes from 6,400 Dell servers that each contain two Intel Xeon processors and a Xeon Phi coprocessor, according to Minyard. He said Stampede's high-speed network lets applications run efficiently across many processors simultaneously.
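
As a rough illustration of the many-processor model Minyard describes (illustrative only, not code specific to Stampede), the sketch below uses MPI through the mpi4py package: every process works on its own slice of a problem and the partial results are combined over the interconnect.

```python
# Run with, e.g.:  mpirun -n 64 python sum_squares.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()        # this process's ID
size = comm.Get_size()        # total number of processes

# Each process sums the squares of its own slice of 0..999,999.
N = 1_000_000
local_sum = sum(i * i for i in range(rank, N, size))

# Combine the partial sums on process 0 over the network.
total = comm.reduce(local_sum, op=MPI.SUM, root=0)
if rank == 0:
    print("sum of squares:", total)
```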

Minyard said Stampede is available to many U.S. researchers who can apply for time on the system through the NSF.

“Most of the time is allocated to NSF researchers,” Minyard said. “However, 10 percent of the system is available to UT researchers [since UT is] hosting the system. The system will be used to solve a wide range of problems from almost all science disciplines, such as computational chemistry and physics, astrophysics, computational fluid dynamics, weather and climate modeling, computational biology, etc.”

Reuben Reyes, senior systems administrator for the Bureau of Economic Geology at the Jackson School of Geosciences, said Stampede is great to work with because of its capability to rapidly solve large problems. 

“It has its advantages and disadvantages,” Reyes said. “The biggest advantage is you can scale up very complicated problems at a very high level.”

Despite the system's ability to solve problems quickly, there are some issues with access to the computer, Reyes said. He said waiting in the queue for Stampede to start a job can sometimes take longer than the computation itself.

“You’re in a queue waiting for your process to take off,” Reyes said. “Once it takes off, it’s solved really, really fast, but let’s say I need larger resources and in the queue I’m in, it may take longer for the supercomputer to get to it.”

Printed on Thursday, January 17, 2013 as: UT computing center installs world's most able supercomputer 

A team of six UT students won the SC12 Student Cluster Competition by building a high-performance computing system while competing against teams from around the world in Salt Lake City, Utah.

The competition required six-person teams of undergraduate or high school students to partner with vendors to build a computer cluster from commercially available technology that could draw no more than 26 amps, roughly what three standard coffeemakers would use (about 3,100 watts at standard U.S. line voltage). A cluster is a network of computers that are wired together to distribute data and process information.

This was the third time UT participated in the competition, which is in its seventh year and is part of an international computing conference. The competition included students from around the country as well as teams from Europe, Canada, China, Costa Rica, Germany, Russia and Taiwan.

The students prepared for the competition with help from the faculty at the Texas Advanced Computing Center, which covered the cost of the equipment. John Lockman, statistics graduate student and team mentor, said the students practiced their skills at the center before competing.

“A lot of the preparation was here at [the center] so they could work with a lot of the equipment to understand how they would use it,” Lockman said. “They also ran some of the standard benchmarks that they would during the competition.”

The competition had two parts: building the cluster, then a 48-hour challenge designed to stress the cluster's capabilities. The system ran applications involving large amounts of scientific data, including a program that could predict weather patterns.

Physics junior Julian Michael, who participated in the competition, said it required strategic planning because it was designed to give the teams more applications than the system could actually run.

“Through the whole competition, they had a cap on our power usage. More than anything else, this was a test of how efficiently we could build and run our system,” Michael said.

Computer science senior Anant Rathi, another participant, said after months of preparation the team decided to make a few last-minute technical changes to its plans, which helped it win the competition.

“In addition to learning about computing science and system administration, it taught me a lot about teamwork and innovative problem solving,” Rathi said. “We had to make some game-changing decisions. It was great to know our judgment calls were correct.”

Computer science senior Michael Teng said aside from the competition the conference demonstrated the latest innovations in the industry.

“It was a great opportunity to learn more about the field and use state-of-the-art technology in the process,” Teng said. “There are a lot of new technologies displayed and announced at the conference. The growth of the industry is key to scientific and technological advances for the foreseeable future.”

Printed on Wednesday, Nov. 28, 2012 as: Students snag first place in computing challenge

The Texas Advanced Computing Center (TACC) will receive a $10 million donation to advance its data-driven science. Astronomy professor Karl Gebhardt, a member of one team using the center's systems, says his research will benefit from the added capacity to process much larger amounts of data.

The Texas Advanced Computing Center will receive a $10 million private donation to advance the supercomputer center’s data-driven science.

The Peter O’Donnell Foundation has donated in past years to support the University’s research efforts at the computing center, which will use part of the money to construct a computing system to handle and analyze large amounts of data. Alison Preston, an assistant psychology and neurobiology professor, said the donation will help the Preston Lab, where studies are being conducted to show how the brain implements human memory.

“When we test individuals in the CAT scanner, we take thousands of pictures of their brain across the one to one-and-a-half hours that they are in there,” Preston said. “You need a lot of space to store large quantities of data, and that is one thing the new resources offered by the funding will provide.”

Preston said the donation provided to the computing center and its advanced data system will help analyze data more quickly.

“Using a personal computer could take several days to fully analyze an individual subject,” Preston said. “It will allow us to speed up the analysis of data, and we can therefore answer the scientific questions that we’re interested in.”

Astronomy professor Karl Gebhardt said he and his team will use the computing center as a data storage and analysis base for their observation of the expansion of the universe.

Gebhardt and his team will trace the detailed expansion of the universe through the Hobby-Eberly Telescope Dark Energy Experiment, he said.

“Our project will have an enormous amount of data, about three to five years total, and we simply do not have the funds within our project and department to handle such an amount,” Gebhardt said.

Printed on Thursday, February 23, 2012 as: Computing center receives $10 million

Video gamers gathered at UT’s Applied Computational Engineering and Sciences Building on Wednesday evening to discuss every player’s dream — infinite resolution and zero latency. The conference, called “The Future of Video Games in Austin,” showcased innovations made by the University, local video game companies and big-name hardware makers including AMD, Dell and Microsoft, said Rob Turknett of the Texas Advanced Computing Center (TACC), which hosted the event.

UT began offering classes this year aimed at preparing students for the video game industry, Turknett said, adding that the event fit into that curriculum by “bringing the gaming industry together with UT.”

The event opened with industry figures speaking about their visions for the future of video games, said TACC spokesperson Faith Singer-Villalobos. Jon Jones, of the development firm Smartist LLC, said the future of gaming would be more flexible and nimble, with “mercenary agencies of developers that would move nimbly from one project to another.” Radio-television-film professor Bruce Pennycook said he saw a shift from PC and console games to more “rapid turn-around casual games” on mobile devices.

Similarly, Mike McShaffry, director of product development at Red Fly Studio, said the next generation of Microsoft's Xbox would fail.

“I think people will be playing on these,” McShaffry said, waving his smartphone.

Computer science and radio-television-film sophomore Wilson Villegas said he hoped to use the event to network. He said he transferred to UT because of Austin’s video game industry.

He and computer science sophomore Andrew Sharp said they were certain they and other UT graduates would find jobs in video game development if they remained agile and independent.

“My friends are graduating without jobs and they’re not even worried about jobs,” Sharp said.

“They’re confident they can start their own startups.”

Industry veterans said those interested in video game development should be able to find careers.

“Things have really started to tilt in favor of small independent game developers,” said Dan Magaha, the executive producer of Seamless Entertainment.