Learn, Observe, Control: UChicago To Commercialize A Generalized Software Optimization Framework

On its face, the Ryerson Physical Science Laboratory seems like an unlikely place to find a new technology with the potential to change the world. Sitting on the main quad of the University of Chicago, the building's hallways and classrooms could double as the set of a film from the 1950s. There is a wood-paneled student lounge, and the whole building smells a bit like an old box of pencils. It's part of a dwindling breed of academic buildings.

Ryerson is currently home to UChicago's computer science department, which, at least when I was an undergraduate there, had a reputation for being a bit stodgy and overly theoretical. What may have been true just a few years ago is changing very quickly today. The department has made a few important hires, including Intel's former VP of research as a professor and director of the CERES Center For Unstoppable Computing. Michael Franklin, formerly the chair of Berkeley's top-rated computer science department, was hired as UChicago's department chair. This has set the university's computer science department on a path toward being, as Franklin puts it, strong in both theory and applied science.

In 2013, UChicago's computer science department also brought on another person: Hank Hoffmann. Michael Franklin told me that Hoffmann was brought into the computer science department when it built out its computing systems research group.

Among other earlier projects in his career, Hoffmann helped modernize billion-dollar radar systems at MIT to detect and intercept hypersonic ballistic warheads. He also built a programming interface at Tilera, a startup also spun out of MIT. But Hank's most important work to date is his framework for self-aware computing.

Hoffmann's framework for so-called "Self-Aware Computing Systems," known as SEEC, attracted significant recognition, and now in 2018 this idea may change the face of the tech business.

"Hoffmann's work is one of those all-too-rare examples of something that pushes a field forward, academically speaking, that also has significant commercial promise," said Michael Franklin.

"I feel like I could never call it ['the Hoffmann Optimization Framework'] in an academic setting because I think people would just laugh at me. […] As long as I'm not specifically quoted as using my own name to refer to the framework, we're good," Hank said in an interview with me. To be clear, Hoffmann doesn't seem like the type of person to brand something with his own name. That name has been attached to the framework by his commercial partners, mostly in the form of its initialism: HOF.

Why Hoffmann鈥檚 Optimization Framework Matters

Machine learning is eating the world. It is so prevalent in Silicon Valley tech startup pitches today that it鈥檚 basically become an inside joke. Seemingly everything needs to be controlled by a neural network, the deeper the better. But machine learning techniques are being used in all kinds of applications that are less flashy than some whiz-bang tech startup.

Neural nets are optimizing everything from cancer diagnosis to pneumatic braking on freight trains, and everywhere in between. All because of some fancy linear algebra, a computer is able to develop and evolve statistical models of how systems operate and then perform actions based on those models. But that learning process takes computational resources, lots of high-quality data, and, most critically, time.

Re-training a system in the face of new information (a shock to the system, or something completely different from what the neural net has been trained to understand) is not always feasible, and certainly not in situations where speed is important.

"The Hoffmann Optimization Framework is the world's only AI 'insurance policy' that can optimize and supervise any legacy or new system to guarantee performance to your goals, fully responsive to what is both known and unknown," said Lester Teichner, a partner at the Chicago Group, which has partnered with the University of Chicago to bring Hoffmann's work to market.

So, if neural nets are good at developing a statistical understanding of a system, why not turn that gaze inward and learn about the system itself and how it works best, such that when something changes, it can respond?

Making Machine-Taught Systems "Think" On The Fly

The Hoffmann framework is able to extract additional performance from complex systems that have already been optimized using machine learning, narrowing the gap between the absolute best-case scenario and what you can actually achieve in practice.

It does so dynamically and in real time. HOF ingests data produced by a given system and makes on-the-fly adjustments to how that system operates in order to maintain the best level of performance, even in adverse conditions. The framework is able to deliver performance guarantees formally proven with cold, hard math.

And, perhaps most importantly, the framework is abstract enough to be applied to basically any complex system to make it perform better. And it does so rather unobtrusively, by simply sitting atop an existing system.
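Since the article describes the framework only at a high level, here is a minimal sketch of what "sitting atop an existing system" can look like. This is an assumed illustration, not Hoffmann's actual SDK: a supervisor that needs only an observe callback, an actuate callback, and a goal, which is what makes such a design system-agnostic.

```python
# A minimal sketch of the supervisory loop described above. This is an
# assumed illustration, not Hoffmann's actual SDK: the supervisor never
# looks inside the system it controls -- it only needs a way to observe
# performance and a knob to actuate, which is what makes it generic.

from dataclasses import dataclass
from typing import Callable

@dataclass
class Supervisor:
    observe: Callable[[], float]      # measured performance of the system
    actuate: Callable[[float], None]  # apply an adjustment to some knob
    goal: float                       # the performance target to hold
    gain: float = 0.1                 # how aggressively to correct errors

    def step(self) -> float:
        """One control iteration: measure, compare to goal, adjust."""
        measured = self.observe()
        self.actuate(self.gain * (self.goal - measured))
        return measured

if __name__ == "__main__":
    # Toy "existing system": throughput scales linearly with one knob.
    state = {"knob": 1.0}
    sup = Supervisor(
        observe=lambda: 5.0 * state["knob"],
        actuate=lambda delta: state.update(knob=state["knob"] + delta),
        goal=20.0,
    )
    for _ in range(30):
        sup.step()
    print(round(sup.observe(), 2))  # settles at the goal: 20.0
```

The point of the sketch is the interface, not the controller: because the supervisor is written against callbacks, the same loop could wrap a video encoder, a database, or a drone without knowing anything about their internals.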

The framework can be implemented by software engineers with relative ease through a software development kit (SDK) developed by Hoffmann and his research group. Hoffmann said, "One of the things I've wanted to do is remove as many parameters as possible from users, so that users could benefit from the combination of control systems and machine learning without having to learn how the control mechanisms are working at the operating system and hardware level. At that point you're just swapping one problem for another."

Self-Aware Computing In Practice

I've been privy to a few results from tests of the framework, so here's what that looks like in practice. This is where the "general" in "generalized optimization framework" really comes through: it has been implemented in many different kinds of application areas with generally impressive results.

Here are a few examples of its proven results today:

  • An interesting implementation in autonomous vehicles, described later in this article.
  • Hoffmann’s framework was recently implemented on Argonne National Laboratory’s Cray XC40 supercomputer running a popular (at least in academic circles) molecular dynamics simulator called LAMMPS. The optimized system produced “average increases in analyses completed of about 30% compared to the current state of the art,” according to Hoffmann. (Let that one sink in a moment.)
  • Implemented in a generative adversarial neural network learning a set of training data, the framework helped to produce results that were materially more accurate than the control, according to the tester.
  • In laboratory conditions, on phones with multicore ARM architectures, researchers using HOF achieved “2x” energy savings over the control sample by changing how the phone’s operating system allots resources during computationally-intensive tasks.
  • In work presented at one of the most prestigious academic AI conferences, an implementation of the framework eliminated out-of-memory failures on test installations of Cassandra, HBase, HDFS, and Hadoop MapReduce. The framework produced a set of configuration settings which delivered better performance than even expertly-tuned settings, and this was all accomplished by changing as few as eight lines of code.
  • A chipmaker said that the framework’s results are equivalent to a generational upgrade, delivering next year’s performance expectations today.

"We have dramatically improved performance and lowered energy consumption in every instance we have implemented so far, seeing improvements of 20% or more across the board," said Lester Teichner. "The Framework is widely applicable and has been deployed on mobile and server CPU and GPU chipsets, and on autonomous vehicle platforms."

Ghost In The Machine

For a DARPA-backed trial, and in collaboration with researchers from Rice University and MIT, Hoffmann's framework was installed on a computer encoding video from a camera mounted to the top of a car. The encoder was set to output 20 frames per second at a specific resolution, and these targets were displayed on charts below the video for me to watch. To test the effectiveness of the framework at delivering performance guarantees in adverse conditions, the demonstrators began to kill off CPU cores and slow the fan speed on the computer to simulate the degraded performance one might encounter "in the wild."

The lines on the charts began to deviate from their targets, but very quickly (within a second or two) returned to basically normal performance as the framework re-allocated its remaining computing resources on the fly. At first the framework overshot the target, then undershot it, but it quickly converged on the pre-set performance target: 20 frames per second. It wasn't until the computer was severely crippled that the video began to pixelate and skip frames, but despite the juddering and the boxiness, the video encoder kept working. Without the framework's supervisory and control functions, the system simply crashed at the first sign of trouble.
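That dip-and-recover shape (deviation when resources disappear, then convergence back to the target) is characteristic of a feedback controller. As a toy reconstruction with made-up numbers, not the demo's actual code, a few lines reproduce it:

```python
# Toy reconstruction of the demo's behavior: an integral-style
# controller holds a 20 fps target; halfway through the run, per-unit
# speed collapses (simulating killed cores and throttling), fps dips,
# and the controller reallocates until the target is met again.
# All numbers are illustrative.

def run(target=20.0, steps=30, gain=0.15):
    alloc = 1.0          # resources the controller has assigned
    speed = 4.0          # fps delivered per unit of resource
    trace = []
    for step in range(steps):
        if step == 15:
            speed = 2.5  # hardware degraded mid-run
        fps = alloc * speed
        trace.append(fps)
        alloc += gain * (target - fps)   # steer back toward the goal
    return trace

if __name__ == "__main__":
    trace = run()
    # Holds 20 fps, dips to 12.5 when degraded, then recovers to 20.
    print(round(trace[14], 1), round(trace[15], 1), round(trace[-1], 1))
```

The real framework is doing far more than this single-knob loop (it has formal convergence guarantees and many interacting knobs), but the observed overshoot-undershoot-settle pattern is exactly what this kind of control produces.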

In that demonstration of the framework applied to video encoding, I was able to see the ghost in the machine, the mechanistic "self" in self-aware computing, turning the virtual dials to keep things running. It was both fantastically interesting and very spooky to watch.

Hoffmann affectionately described the successful test as one of his "proudest moments" as an academic.

If you'll forgive the automotive pun, imagine a few years down the road what this capacity to self-heal will mean for autonomous vehicles. If a patch of sensors in the LIDAR device on top of the car malfunctions or a tire pops on the road, Hoffmann's framework could detect the failure and reallocate system resources to keep the vehicle up and operating at its best, even in adverse conditions. And the fact that this adjustment happens so quickly matters, particularly for autonomous vehicles, which may be moving very fast.

Rather than asking a machine-taught system to re-learn everything when something unexpected happens, HOF takes what the system already "knows" about its limits, its theoretically optimal behavior, and the variables (the "virtual knobs") that affect its performance to find the best way to operate given the context of its environment.

Our hypothetical car wouldn't need to re-learn how to drive with three functional tires, or how to "see" with partial blindness. A drone far from its base could operate at a different speed or trajectory to ensure that it has sufficient battery life to make it home; it wouldn't need to re-learn how to fly on half a battery. Multi-core computers, or servers in a data center, could reduce energy expenditure by allocating computational resources more efficiently or adjusting fan speeds under heavy load. In a complex system with many layers of subsystems, having HOFs all the way down the stack would make the whole system run more efficiently.
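To make the drone example concrete, here is a hedged sketch of the "use what the system already knows" idea: instead of re-learning flight, pick among pre-characterized operating points the one that still satisfies the goal. The operating points and numbers below are hypothetical, invented for illustration.

```python
# Choosing among "virtual knob" settings the system already knows,
# rather than re-learning: pick the fastest pre-characterized operating
# point that still gets the drone home on its remaining battery.
# All names and numbers are made up for illustration.

OPERATING_POINTS = [
    # (name, speed in m/s, power draw in watts)
    ("eco",    8.0,  90.0),
    ("cruise", 12.0, 150.0),
    ("sport",  16.0, 260.0),
]

def best_point(distance_home_m: float, battery_wh: float) -> str:
    """Fastest operating point whose energy cost fits the battery."""
    feasible = []
    for name, speed, power in OPERATING_POINTS:
        hours_needed = (distance_home_m / speed) / 3600.0
        if hours_needed * power <= battery_wh:   # enough energy to land?
            feasible.append((speed, name))
    if not feasible:
        return "eco"  # degrade gracefully to the thriftiest mode
    return max(feasible)[1]  # fastest mode that still makes it home

if __name__ == "__main__":
    print(best_point(10_000, 40.0))  # moderate battery -> "cruise"
    print(best_point(10_000, 20.0))  # low battery -> fall back to "eco"
```

A real self-aware system would interpolate between far more knobs and re-check continuously, but the principle is the same: the decision is a fast lookup over known behavior, not a slow re-training pass.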

A Generalized Approach To Commercializing A Generalized Optimization Framework

UChicago, in partnership with the Chicago Group, has begun the process of taking the framework to the broader market.

Hank said he was "lucky" to have previous startup experience working on Tilera, which showed him the value of focusing on one specific application of a particular technology. But for the optimization framework, he said, "I wasn't even sure that starting a company was the right idea, because I thought that this technology was useful in a lot of different contexts. The idea of spinning up a company and having to pick just one of those contexts didn't seem right. This technology could be in phones, or in data centers, or in a bunch of different areas. What if we picked the wrong one?"

So, in turn, this broad licensing approach is itself a generalized, abstracted way to commercialize a generalized optimization framework. "We eat our own dog food," quipped Hank, who concluded, "We believe in generalization."

The partnership is in discussion with several major internet companies, automakers, semiconductor producers, and electronics manufacturers in the US and abroad. It has also approached a number of venture capital firms to make the optimization framework available to their portfolio companies.

After seeing what this seemingly simple framework is capable of, we're also left wondering what other technologies are hiding in the messy corners of professors' offices and research labs, both in dusty old Ryerson and at other institutions. In the struggle between research institutions and private industry to create the technologies of the future, academia scored a point today.

Update: We corrected the spelling of a surname following original publication.
