From the CTO

21 November 2019
Janet Scannell

I love the phrase “it takes a village” and the sense of collaboration it evokes. We are better off when we share responsibility, whether in raising children, addressing societal issues, or managing ever-changing technology. For those of us in higher education, the sharing of knowledge is core to our mission and beliefs. In that spirit, I want to share a few things I’ve learned recently and reflect on the importance of supporting each other as we manage and leverage these innovations.

Self-driving cars

We’re all aware of the coming of self-driving cars. Actually, they’re here now, in small numbers. I’ve been waiting for this day for 25 years, since I first heard about self-learning computer models in grad school in the early ’90s. At the time they were called “neural networks,” and they were magic to me. What’s now called machine learning (“ML”) and artificial intelligence (“AI”) is still magic in many ways, but it’s becoming real in terms of its influence on our lives.

At a recent national IT conference, I went to a talk that described advances in machine learning in terms of chess playing. “Deep Blue,” the IBM supercomputer that defeated Garry Kasparov in 1997, was programmed with decades of chess knowledge (and between-game tweaks), which it drew on in determining which moves to make. A newer system, AlphaZero, was programmed with only the rules of chess. Nothing else. Within nine hours, AlphaZero had built its understanding of chess strategy to a grandmaster level by playing 400 million games against itself, far more games than a typical grandmaster would play in a whole lifetime!
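To make that idea a little more concrete, here is a tiny sketch in Python of the same broad principle: a program that is told only the rules of a much simpler game (single-pile Nim) and improves purely by playing against itself and learning from the outcomes. This is an illustrative toy, not AlphaZero’s actual method; the game, the learning rule, and every parameter below are my own simplifications.

import random
from collections import defaultdict

PILE = 15            # starting stones; the player who takes the last stone wins
ACTIONS = (1, 2, 3)  # a move removes 1, 2, or 3 stones
EPSILON, ALPHA = 0.1, 0.2

# Value table learned from self-play: (stones_left, action) -> expected outcome for the mover.
Q = defaultdict(float)

def legal_moves(stones):
    return [a for a in ACTIONS if a <= stones]

def choose(stones, greedy=False):
    moves = legal_moves(stones)
    if not greedy and random.random() < EPSILON:
        return random.choice(moves)                   # explore
    return max(moves, key=lambda a: Q[(stones, a)])   # exploit what has been learned so far

def self_play_game():
    """Play one game against itself; return the (state, action) pair for every move."""
    stones, history = PILE, []
    while stones > 0:
        action = choose(stones)
        history.append((stones, action))
        stones -= action
    return history  # whoever made the last move took the last stone and won

for _ in range(100_000):
    history = self_play_game()
    reward = 1.0  # the last mover won; the reward alternates sign back through the game
    for state, action in reversed(history):
        Q[(state, action)] += ALPHA * (reward - Q[(state, action)])
        reward = -reward

# The greedy policy learned from self-play should approximate optimal Nim play
# (take stones % 4 whenever that is a legal move), discovered from the rules alone.
print({stones: choose(stones, greedy=True) for stones in range(1, 10)})

The point of the toy is the shape of the loop: the program is given nothing but the legal moves and the final outcome, and a strategy emerges from sheer repetition.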

Self-fixing wireless 

So how is that relevant to higher education? Here’s an example. A company named Mist has applied machine learning to support wireless networks. They’ve developed custom access points (APs) that each monitor 150 attributes and send that data to Mist’s computer systems. Machine learning algorithms then analyze trends across the thousands of APs in use by all of their customers. (This far surpasses the monitoring and trend analysis we are able to do at Carleton on our 895 APs.) As a result, when a particular access point is malfunctioning, Mist’s algorithms can automatically correct many problems, or can verify that the device has failed and drop-ship a replacement within hours, dramatically shortening response time.
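The underlying pattern is a familiar one: collect telemetry from every device, compare each one against a fleet-wide baseline, and act on the outliers. The Python sketch below shows that pattern at its simplest; the metric names, thresholds, and remediation steps are illustrative assumptions, not Mist’s actual pipeline.

import statistics

# Hypothetical per-AP telemetry samples: ap_id -> {metric: value}.
telemetry = {
    "ap-101": {"retry_rate": 0.04, "clients": 23},
    "ap-102": {"retry_rate": 0.05, "clients": 31},
    "ap-103": {"retry_rate": 0.41, "clients": 2},   # behaving badly
    "ap-104": {"retry_rate": 0.06, "clients": 27},
    "ap-105": {"retry_rate": 0.03, "clients": 19},
    "ap-106": {"retry_rate": 0.05, "clients": 25},
}

def zscore(value, values):
    """How far a value sits from the fleet average, in standard deviations."""
    spread = statistics.pstdev(values) or 1e-9
    return (value - statistics.mean(values)) / spread

def flag_outliers(telemetry, metric, threshold=2.0):
    """Return the APs whose metric deviates from the fleet baseline by more than threshold sigma."""
    values = [sample[metric] for sample in telemetry.values()]
    return [ap for ap, sample in telemetry.items()
            if abs(zscore(sample[metric], values)) > threshold]

for ap in flag_outliers(telemetry, "retry_rate"):
    # A real system would try remote fixes first (reboot, reconfigure, adjust radio settings)
    # and only escalate to shipping a replacement once the hardware is confirmed dead.
    print(f"{ap}: anomalous retry rate; attempt auto-remediation, escalate if that fails")

The leverage in the real product comes from scale: baselines built across thousands of APs from many customers can catch problems that a single campus’s own data cannot.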

Jobs at risk

Software-driven automation has so far primarily affected blue-collar jobs, like factory assembly and farming operations. However, it is now becoming clear that jobs that typically require a college degree could also be at risk. For example, some institutions are starting to use “chat bots” in place of employees to answer community questions about financial aid or computer support. There is tremendous power in these examples, and a tremendous amount of data is being gathered and analyzed to take actions that augment the work that humans do.

Data is exploding, and yet I have a sense that our societal engagement with it is lagging. So I was delighted to return from Educause and attend a talk titled “AI for Whose Social Good?”, which was part of the AI lecture series. Among other interesting topics, Luke Stark talked about the metaphors we use for data, including industrial ones (“data mining”), natural resources (“raw data”), and data as a trend (“data is the new bacon”). To say that data is now as ubiquitous and beloved as bacon is a strong statement.

Too much bacon 

Just as too much bacon can be bad for our health, so can unmanaged data or runaway uses of machine learning and artificial intelligence. I am excited about the possibilities, and I’m delighted by the way Carleton is engaging in the interdisciplinary, collaborative conversations we need about a future that will be here soon.
