
Carly Fiorina announces presidential bid

Former Hewlett-Packard CEO and John McCain advisor Carly Fiorina announced her presidential bid on Good Morning America on Monday morning.

Fiorina is best known for her tenure as CEO of HP from 1999 to 2005, during which she merged the company with Compaq and laid off 30,000 employees. She ran unsuccessfully for a U.S. Senate seat from California in 2010 and has never held elected office.

She is the first Republican woman to declare her candidacy. She cited her business experience as one of her strongest qualifications for the presidency.

“I think I’m the best person for the job because I understand how the economy actually works,” Fiorina said on Good Morning America. “I understand executive decision-making, which is making a tough call in a tough time with high stakes.”

Though her tenure at HP was controversial, she said she believes it gave her the kind of business experience a president needs.

As the only Republican woman in the race, Fiorina is one of the few women ever to seek the GOP presidential nomination.

Fiorina is the sixth candidate to enter the race, after Sen. Ted Cruz (R-TX), Sen. Rand Paul (R-KY), former Secretary of State Hillary Clinton, Sen. Marco Rubio (R-FL) and Sen. Bernie Sanders (I-VT), who announced his candidacy last Thursday and is running as a Democratic challenger to Clinton.

The University’s Texas Advanced Computing Center will release a new high-performance visualization and data analytics system known as Maverick in February.

Originally slated for deployment in January, Maverick is the result of the center’s partnership with technology companies Hewlett-Packard and NVIDIA. Maverick will replace Longhorn, the collection of software visualization systems currently operating at the center.

According to Kelly Gaither, principal investigator of the Maverick project and the center’s director of visualization, Maverick will be used for scientific research and was designed as an interactive, remote visualization and analytics tool. The system will assist in analyzing massive amounts of scientific data alongside the University’s supercomputer Stampede, which launched last March to provide an interactive environment for researchers.

The research and information collected and analyzed with Maverick will be publicly accessible to the scientific and engineering community. Though initially designed for researchers, Maverick will be available to students through their advisers, according to astrophysics professor Karl Gebhardt.

Gebhardt said systems like Maverick let users obtain huge amounts of data, access it quickly and manipulate it efficiently with software tools. Maverick will be an improvement on Longhorn in all of these facets.

“I have been using [the center’s] resources to study black holes, including the largest black holes in the universe, and dark matter around galaxies,” Gebhardt said. “Maverick will be essential for our future work with HETDEX, the Hobby-Eberly Telescope Dark Energy Experiment. We will generate many petabytes of data, with the goal of understanding how the universe expands over time. These results will allow us to understand the formation, evolution and long-term fate of the universe.”

Michael Teng, a computer science graduate student, said Maverick’s ability to collect and analyze large amounts of data is useful during scientific investigations.

“[Maverick] utilizes a lot of graphics processing units to accelerate the visualization of large amounts of data,” Teng said. “A lot of scientific problems have to do with the movement of particles, or something that has millions of parts. The best way to demonstrate what happens in the simulation is to play a video of what happens using the system.” 

Maverick contains 132 NVIDIA Tesla K40 graphics processing units, or GPUs, according to Scott Misage, high performance computing engineering director for Hewlett-Packard. Computer science senior Craig Yeh said the large number of GPUs speeds up data analysis.

“[GPUs] are mostly used for problems that are easily parallelized, allowing for faster calculations,” Yeh said. “Additionally, you can use the GPUs to render the data into videos or stream live visualizations to the researchers and allow for interactivity.”
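The “easily parallelized” problems Yeh describes are ones where the same operation is applied independently to every element of a dataset. A minimal Python sketch of a hypothetical particle update (illustrative only, not code from Maverick) shows the property that makes such work GPU-friendly:

```python
# Hypothetical particle update: each particle's new position depends only on
# its own state, so every update is independent of the others.
def step_particles(positions, velocities, dt):
    # The same arithmetic runs for every particle; on a GPU, each element
    # could be assigned to its own thread and computed simultaneously.
    return [p + v * dt for p, v in zip(positions, velocities)]

positions = [0.0, 1.0, 2.0]
velocities = [1.0, -0.5, 2.0]
positions = step_particles(positions, velocities, dt=0.1)
print(positions)  # → [0.1, 0.95, 2.2]
```

Because no particle’s update depends on another’s result, the calculation scales across the thousands of threads a GPU provides, which is why a system with 132 GPUs can render such simulations quickly.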

Meg Whitman recently made one of her first public appearances as CEO of Hewlett-Packard in a video conference with a group of college technology leaders. The main topic of this meeting was to announce HP’s participation in a “community cloud” for higher education institutions.

The idea is to establish a pool of high-performing computers and servers in one location. Then, researchers anywhere would be able to access the pool through the Internet. This project could be a game-changer for college campuses, but it must be extended so that everyone, especially students, can access it. The community cloud is certainly not the first foray into cloud computing, as other services have been around for the better part of the past decade. However, this project could become one of the trend-setters that popularizes the concept.

“Cloud computing” is a general term used to describe the delivery of a service over the Internet. The major caveat of cloud computing is that nothing is stored locally — all data and software are stored on external servers. For example, have you ever wondered why you are able to access Facebook from any device anywhere? That is because all your information is stored and managed by Facebook itself, not by your computer or iPhone. The only thing you need is a browser to access this data.
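The model can be sketched with a toy server and client using only Python’s standard library; the document name and its contents here are invented for illustration, and a local process stands in for a real cloud service:

```python
# Toy illustration of the cloud model: the "data" lives only in the server
# process, and a thin client retrieves it over HTTP. (Hypothetical document
# name and contents; a real service would run on remote servers.)
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

DOCUMENTS = {"essay.txt": "Stored on the server, not on your laptop."}

class CloudHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Serve the document named in the URL path, e.g. /essay.txt
        body = json.dumps(DOCUMENTS.get(self.path.lstrip("/"), "")).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # silence per-request logging

server = HTTPServer(("127.0.0.1", 0), CloudHandler)  # port 0: pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# The "client" needs nothing locally except the ability to make HTTP requests.
url = f"http://127.0.0.1:{server.server_port}/essay.txt"
essay = json.loads(urlopen(url).read())
server.shutdown()
print(essay)  # → Stored on the server, not on your laptop.
```

The client keeps no copy of the document: every access goes back to the server, which is exactly why any device with a browser — and only a device with a working connection — can reach the data.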

The community cloud in which HP will participate would work in much the same way. This service would allow professors, researchers and, in the future, students to choose from a large variety of web-based applications to accomplish a specific task. One of the big advantages of a collective pool is that colleges would be able to collectively bargain with software companies to drive down costs.

Something like this would be especially helpful at UT in times of budget cuts, as software licensing costs the University a significant amount.

Cloud computing has the potential to dramatically change the way students interact with their computers.

Implementing cloud computing means that individual computers can be stripped down to just the browser.

Students would no longer have to worry about installing software or storing data on individual machines. Instead, they could connect to an online service that would provide the same functions. And because everything is tied to an external server, students would never have to worry about losing data, applications crashing or updating software.

Everything is externally managed and can be accessed with a click of the mouse.

A cloud computing framework greatly simplifies the computing experience while maintaining functionality. It offers flexibility in that your data is no longer tied down to an individual machine. Your data can be accessed anywhere at any time. There is also no longer a need for extensive storage space or high-end computer features.

The most visible example of this at UT is Blackboard.

Students are able to access class documents, submit assignments and participate in discussions through this system. Now, imagine expanding this idea to everything else. Students could write and store their essays online and would no longer have to buy and use Microsoft Word. They could play games without having to install them on their computers. The potential for the cloud is unmatched, yet there are also several pitfalls that must be addressed in the coming years.

The first challenge is that the nature of cloud computing calls for a constant Internet connection. Since everything is stored externally, a connection must exist to access anything, including personal documents. A dead Internet connection means no work, period. Another knock on cloud computing is the lack of features in comparison to its desktop counterparts. This situation is bound to change in the future, but today’s web-based applications simply aren’t as full-featured because of development limitations. Finally, the biggest problem with cloud computing lies in its security.

Because cloud computing is externally managed, universities and students cannot control the number of possible security holes. Companies say that stored data is secure, but in the past year, companies such as Sony have had large troves of confidential user data stolen.

Cloud computing could become the next big thing to hit consumer markets, but only time will tell whether or not this trend will catch on.

Shi is an electrical and computer engineering junior.