Thinking about digital dæmons, disruption, and marching back into the future


I’ve been trying to write this post, or more accurately a version of this post, for the last couple of weeks. It all started when I attended the City of Glasgow College Digital Education Symposium. It was an interesting day with a good mix of speakers from industry and academia. Joe Wilson has written a good summary of the day.

In the morning we heard most of the usual buzzwords and predictions around disruption, next-generation learning, AI, work 4.0 . . . One of the presentations gave an overview of the future – 2037, to be precise.

A world where every child will be chipped to provide them with “vital” health monitoring. A world where everyone will have a digital assistant to help with all those annoying little things we have to do ourselves just now: remembering “stuff”, going to a library, activating communication channels. These personal assistants will be able to take you on a virtual shopping spree, link to your personal 3D printer to print out your clothing selection, book your shared, automated transport device, know where you are and what you are doing all the time, share your data with . . . well, actually that part wasn’t mentioned.

As this very positive spin on a potential future was being shared, my mind was drawn to Philip Pullman’s notion of dæmons in the His Dark Materials trilogy. In that world everyone has a dæmon, which takes the form of an animal and is constantly by your side. These dæmons are more like a soul or conscience, a vital part of everyone.

In contrast, what I was hearing was more like an oppressive digital dæmon: one that would dictate where you could go, what you could wear, what health care you could get, based on your demographic and economic profile. One controlled by “the market”, one that produced a homogenised population. One that is being developed and spoken about with such certainty from a very western/global North outlook. Education, in this context, is personalised through these digital dæmons and their AI capabilities, all built on a foundation of already biased algorithms. What’s good for your data profile might not necessarily be good for your soul. I suspect some futurologists are not too concerned with our souls.

But as was pointed out in a paper released by Contact North last week, prediction is very difficult, particularly if it is about the future.

In their “Big Shifts are coming! Looking back from 2035, a day in the life of a student in 2035, building the roadmap to 2035 — let’s do some imaginative scenario building” paper there is another round of predictions for the future, some background on why organisations should plan for the future, and ideas about how to build scenarios. There’s also a day in the life of a student, this time in 2035.

Again, a vision of perfect harmony between humans, teaching bots, and personalised, collaborative (global) learning. All underpinned by new, harmonious relationships between educational providers and big business. All beautifully data driven, with no mention of who owns the data or how much this all costs.

To be fair, there are a number of alternative scenarios given in the paper too, such as The Eloi and Morlock, The Job-Stealing Robot Apocalypse, and The Siege of Academe. Not wanting to give away any spoilers, I won’t mention how they turn out for today’s liberal academic.

It’s easy for me to take a lightweight swipe at this; I couldn’t have written this paper. I do like the way it explains approaches to scenario building and looks at the recent past too. I enjoyed reading the scenarios. They’ve provoked a response, which is the point of any paper like this, so I thank the author for that.

The paper gives a list of things to think about to start future scenario planning, including:

  • What will the future of work look like in an AI-fueled world?

  • How might the impacts and risks of climate change shape education investments and thoughts about campuses and student safety?

  • How might heightened longevity affect workforce expectations and training needs?

  • How will technology’s benefits be divided and/or shared?

At least one other exogenous variable stands out:

        • The state of global security.
. . . The need for physical security has slowed the march of globalization. Subsequent malware attacks and the use of social networks to spoof elections throughout the world have added cybersecurity to our concerns. For post-secondary education’s future, the key exogenous planning question might be: will nations encourage the movement of people, goods, jobs and ideas across borders, or will mobility be defined by fear and terror and truncated by walls, barriers and concerns over security?

In the UK, Brexit is causing increasing anxiety around this last point. This isn’t the future – this is now. Our current government doesn’t seem to have any clear ideas around this. Our university leaders are rightly concerned; in fact, they are probably more vocal around this issue than anything else just now. Technology is one part of a wider solution, but being able to physically work in and travel to other countries is still a vital part of not just academic, but everyday life.

So where are the scenarios for the refugees fleeing from Syria? From Africa? Where and how are they going to access (and pay for) their digital assistants? Where is the infrastructure going to come from for them to participate in this future? What about China? Who really owns the power infrastructure that our technologically driven present and future relies on? Who is actually paying for all of this?

Where are the scenarios where a vital part of the education system is educating everyone about data and data ownership (could Malta and its plans for blockchain be shaping the future here?), the ethics of data use, and opening up and exploring algorithms?

The paper ends with a quote from William Gibson: “the future is already here, it’s just not very evenly distributed.” As I look around the world and see the rising advance of right-wing politics, global business and misogyny, all based on the reactions of “the market”, I fear for all our futures unless we can voice alternative scenarios based on the values we hold dear. Ones where notions of the commons of education are central to developing equality and to understanding and embracing diversity. Most importantly, a future that isn’t owned and controlled by a few big businesses and complicit political structures, and that is human-centric, not data-centric.
