We live at a time when the very fabric of objective reality is under attack, slowly eroding with each passing election, IPO, political scandal, and 24-hour news cycle. Information, once scarce and cherished regardless of its quantity or form, has grown so abundant, and is produced at the expense of truth with such reckless, profit-driven abandon, that we humans are on the verge of losing control entirely, if we haven’t already.

We can no longer discern fact from fake. We don’t know what or who to trust. Increasingly, our senses are being numbed by small screens and infinite feeds full of falsities and digitized dopamine. So, as I sit and ponder this new (un)reality having recently attended the EDUCAUSE Annual Conference—a premier U.S. education event—I can’t help but wonder whether such communities, the ones I assume I should turn to for clarity and truth in troubled times, are surfacing signals amongst the noise or just leading me further down the deepfake rabbit hole.

Mission Critical

The mission of EDUCAUSE is to “advance higher education through the use of information technology.” It’s this word ‘advance’ that I find problematic and cause for pause. Advance to where? To what end? Simply advancing, especially when fueled by largely unregulated and corporatized IT, is a dangerous and irresponsibly singular mission in an age when data has become the most abundant and valuable resource on earth. For massive organizations like EDUCAUSE, the irony and contradictions have become increasingly palpable, and the lack of honest acknowledgment has only deepened my angst and unease.

On one hand, you do find pockets (albeit minority ones) of skepticism and criticism from courageous practitioners and scholars like Michael Caulfield, Chris Gilliard, Autumm Caines, and Erin Glass: long-time educators who sense the existential dangers that accompany a data race to the bottom and who, rather than silently acquiescing, have been vigilant in offering a critical counterbalance to the prevailing narrative.

On the other hand, you also find overlit, logo-littered vendor halls packed full of devious smiles from data-driven IT giants like Amazon, Google, Microsoft, and many others, all looking to “seal a deal” and extract one more consumer data point, which downstream, all too often, means from students. A quick scan of the EDUCAUSE conference hashtag on Twitter uncovers the other side of the contradiction coin: a stream of promotional punchlines intended to satisfy my (or perhaps my CIO’s) insatiable hunger for the latest SaaS solution, API, algorithm, or AI bot (i.e., more data, more noise).

With #AI and analytics, this #HigherEd institution was able to look for patterns in students’ behavior. Discover their findings using GCPcloud. #EDU19
Leverage the ⚡ #power ⚡ of data to drive innovation & student success, accelerate research and improve campus life. #EDU19

Do we want more data or less? Is education about critical inquiry or corporate interests? As you might expect, I think it depends. While the benefits that can surface from events like EDUCAUSE are unquestionable (e.g. novel insights, lasting relationships, and bottomless coffee), in aggregate, I question whether the contradictions and cognitive dissonance at scale are helping to turn the tides and mend the fraying fabric of our obscured reality.

Toward New Signals, New Symbols

So, while this paints a potentially grim picture of a shared and inescapable deepfake reality, I do see signs of a hybrid future where big data, privacy, truth, and trust can intersect and coexist in harmony. Just as data has proliferated exponentially, so too has a suite of parallel innovations in cryptography, distributed networks, and new infrastructure-layer protocols at the root of the internet itself. The invention and rapid evolution of blockchains, oracles, decentralized identity, differential privacy, zero-knowledge proofs, and related protocols have cracked open a door to a previously unimaginable future: one in which IT and data are used to expose objective reality rather than obfuscate it.

Blockcerts, an open standard developed by the MIT Media Lab and Learning Machine, supports creating, issuing, viewing, and verifying blockchain-based certificates: digital records that are registered on a blockchain, cryptographically signed, tamper-evident, and shareable. This project has led to a wave of global innovation that gives individuals the capacity to possess and share their own official records.
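To make the mechanics concrete, here is a minimal, hypothetical sketch of the idea behind blockchain-anchored records: hash a credential, anchor the hash on a ledger (mocked here as a plain list), and later verify the record by recomputing its hash. None of the names below come from the actual Blockcerts API; this only illustrates why a single changed byte makes tampering detectable.

```python
# Toy sketch of blockchain-anchored credentials (NOT the Blockcerts API).
import hashlib
import json

def credential_hash(credential: dict) -> str:
    # Canonical JSON so the same record always hashes identically.
    canonical = json.dumps(credential, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode()).hexdigest()

mock_chain = []  # stands in for an immutable public ledger

def issue(credential: dict) -> str:
    """'Anchor' a credential by recording its hash on the ledger."""
    digest = credential_hash(credential)
    mock_chain.append(digest)
    return digest

def verify(credential: dict) -> bool:
    # Any altered byte produces a different hash, so tampering is detected.
    return credential_hash(credential) in mock_chain

cert = {"name": "Ada Lovelace", "degree": "BSc Mathematics", "year": 2019}
issue(cert)
assert verify(cert)                                   # authentic record
assert not verify(dict(cert, degree="PhD Mathematics"))  # tampered record
```

A real deployment would also sign the record with the issuer's private key and anchor the hash in an actual blockchain transaction; the hash-and-verify loop above is the core tamper-evidence idea.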

The Sovrin Foundation, an open-source project and global public utility for self-sovereign identity, is pioneering a suite of new digital identity standards. Built on an emerging standard called Decentralized Identifiers (DIDs), the Sovrin Network allows digital credentials to be privately issued, controlled, managed, and shared using a cryptographic technique called zero-knowledge proofs (ZKPs). This project opens new doors to a range of intersectional projects and networks and allows us to reimagine digital identity for the 21st century.
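For a concrete flavor of the selective disclosure that ZKP-based credentials enable, here is a toy sketch using salted hash commitments: the holder reveals one attribute (plus its salt), and a verifier checks it against previously published commitments without learning anything about the other fields. Real credentials on networks like Sovrin use far more sophisticated cryptography; every name in this sketch is illustrative.

```python
# Toy selective disclosure via salted hash commitments (an illustration,
# not a true zero-knowledge proof and not the Sovrin/DID API).
import hashlib
import secrets

def commit(value: str, salt: bytes) -> str:
    # A salted hash hides the value but binds the committer to it.
    return hashlib.sha256(salt + value.encode()).hexdigest()

attributes = {"name": "Ada Lovelace", "birth_year": "1990", "degree": "BSc"}
salts = {k: secrets.token_bytes(16) for k in attributes}
commitments = {k: commit(v, salts[k]) for k, v in attributes.items()}
# An issuer would sign `commitments`; a verifier only ever sees those hashes.

def verify_disclosure(key: str, value: str, salt: bytes,
                      commitments: dict) -> bool:
    """Check a single revealed attribute against its public commitment."""
    return commit(value, salt) == commitments[key]

# Holder proves their degree without revealing name or birth year:
assert verify_disclosure("degree", "BSc", salts["degree"], commitments)
# A forged value fails against the same commitment:
assert not verify_disclosure("degree", "PhD", salts["degree"], commitments)
```

The design point is that the verifier checks only the disclosed field; the undisclosed attributes stay behind their commitments, which is the "share less, prove enough" property these identity networks aim for.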

Learning Economy, a non-profit organization with a mission to accelerate the world’s transition to 21st century education and workplace infrastructures, is designing a new protocol leveraging many of these nascent innovations to connect the fragmented learning ecosystem into a unified value chain. This would allow schools, employers, researchers, and any number of institutions to translate information between multiple standards and systems without changing behavior or sacrificing proprietary data or privacy.

This is just the tip of the innovation iceberg in a new deep-truth reality that is here today. It’s possible right now. We just need the collective courage and will to band together and make it so.

If, as a species, we intend to emerge from the current era of data gluttony and surveillance capitalism, we need to take a longer view and be thoughtful about the systems we’re building for future generations. The institutions that make up democracy are not static and predestined. They bend, morph, meander, and if we’re not careful, they can break. Let’s find new signals among the deepfake noise.

Taylor Kendal
Taylor is a Diplomatic Courier contributor focused on Web3, privacy/digital ethics, bridging cultures of entrepreneurship and education, infusing agility and intellectual honesty into bureaucracy, and exploring the future of education on the blockchain.
The views presented in this article are the author’s own and do not necessarily represent the views of any other organization.