
a global affairs media network

www.diplomaticourier.com

Update Education for Technological Realities, An Officer’s Perspective

August 9, 2018

What do the successful military officers of today think are the most important issues impacting global affairs? Serving military officers answer here in a new series of features in collaboration with Military Leadership Circle (MLC).

Educating and developing a 21st-century workforce is the most important challenge facing those who wish to influence global affairs now and in the near future. It is also a daunting challenge, one that will require leaders to adapt cultural norms to keep pace with and master the technological developments of our era. Such adaptation means overcoming an oddly stubborn reality: we still rely on Industrial Age culture despite having long ago entered the Information Age.

An instructive case in this regard is Artificial Intelligence (AI). Both the private and public sectors are increasingly seeking out new applications for AI. The Department of Defense (DOD), for instance, formed the Joint Artificial Intelligence Center (JAIC) on June 27, 2018 to spread this powerful tool throughout the department. Finding the right balance of AI augmentation and process replacement, however, will require a shift in military education and culture.

It will be no small shift, just as those of centuries past were not: the transformations of the Industrial Age, from the 18th century into the 19th, and of the Information Age, from the 19th into the 20th, brought painful adjustment. But just as those changes defined their times, the shift from the modern world of the 20th century to the post-modern world of the 21st will be dominated by AI. Two industries still in flux, transport and assistant services, offer valuable insight into how this shift may unfold. Just over 100 years ago, Ford Motor Company rolled out its first Model T, then fielded its first cargo truck in 1917.
The cultural norms and supply chains shaped around horse and steam power resisted the internal combustion engine's introduction to transport, slowing but not stopping changes that were inevitable. A similar paradigm shift is taking place today, equally productive for those who adapt to it and vexing for those who do not. Tesla will produce its Model 3 this year; it will likely be the Model T of our time, shifting the transport market in profound ways if not supplanting current mobility models entirely. Able to drive themselves (augmented at first, ultimately driverless altogether), such automobiles will lead the way as AI-assisted transport reforges supply chains, transforms fuel systems and markets, and alters labor services in connected industries.

Even if it takes a decade or two before humans are completely removed from the wheel, we are now training and directing young people into jobs (drivers, supporting mechanics, logisticians) that will not lead to sustainable long-term careers. Assuming a 30-40-year career for someone entering the workforce today, we are looking not just at the last generation of truck drivers and transport companies, but at individuals already in those jobs and companies whose skills will have no application by the time they reach their prime earning years. Indeed, the transport industry and its unions are forging new lobbying groups to buffer against these developments. Some companies, such as Ford, ironically, will follow Tesla or already have (in other words, whether Tesla survives or not, the shift in the industry will occur). Other transport companies will inevitably inherit the rewards of Tesla's risk. The United States must shift its educational models to embrace smart transport design, management, and the accompanying regulation.
Personal services, too, are seeing a major and seemingly unstoppable shift. AI assistants like Siri may at first have seemed like gimmicky products, yet assistant services have rapidly proven their utility and appeal in the new age. This usefulness is not found only on the user end. Machine learning, which shapes AI, requires massive inputs to find patterns in metadata, and assistant services supply machine learning algorithms with a steady input stream. This feeds a user-metadata loop that increases viability on both ends. In recent holiday seasons, products like Amazon Alexa and Google Home have reached market operationalization. Many people have given these devices to family and friends, particularly as life aids for the elderly. (Late-night comedy shows even remarked on this trend with a parody "Echo Silver" edition.) As the sample size 'n' grows, an AI system's decision-making and search-answering ability grows to the point of anticipating a request based on previous inputs and patterns.

While it may be many decades before AI reaches the point of creativity or full service-industry replacement, it is certain that, within a decade, AI assistants will at least saturate markets and culture to handle data optimization, much as the smartphone did.

These two simple examples depict the extreme shifts that educational and training models must take into account. We must more quickly integrate into our schools and training programs not only an appreciation for how to use AI, but also an understanding of how learning algorithms work. We must not only adapt our infrastructure for self-driving vehicles and AI-leveraged office spaces, but also anticipate the displaced workforces of teamsters and reception staff that will come with them. It is telling that, as of yet, we do not have a version of Asimov's four laws of robotics for our current revolutionary technologies.
This is, perhaps, an outgrowth of a lack of foresight, planning, and education suited to the current age. Humans can still create data, shape data, and discern AI outputs. Although AI will assist in these tasks, there will always be a need for people to determine initial moral frameworks and associated policies. A decade ago, some think tanks thought such frameworks would not be necessary; the astounding progress in machine learning since then should occasion some reconsideration. Not only do we now need laws, policies, and strategies tailored to current realities, but we also need educational systems and programs capable of preparing individuals to craft, implement, and adjust those facets of culture and politics as necessary.

We are already facing the serious implications of this issue: moral quandaries abound over AI integration into lethal systems, so much so that some private-sector companies have removed themselves from defense projects. Controversial unmanned drones have come to dominate military reconnaissance, yet they still require remote human operators and analysts. In this striking present-day example, one sees how the military would benefit from developing robust operator and analyst education in AI strategy, ethics, and design, as would the civilian counterparts who work with them. Big tech companies cannot hire the talent they need because the present education system is not teaching what is needed. We should not wait and react to the coming decade's change. We must adapt to AI now, because it is the most important thing we can do to be effective in the coming age.

About the author: David Escobar is a Lieutenant Colonel in the United States Army and a member of the Military Leadership Circle. The views expressed here are the author's own and do not represent the positions of the Department of Defense, United States Army, or any government agency.
More information on the Military Leadership Circle can be found at https://militaryleadershipcircle.com.  

The views presented in this article are the author’s own and do not necessarily represent the views of any other organization.