Stratfor, a RANE company, regularly supports educational projects with high schools and colleges. In the 2020 spring semester, we presented five challenges to undergraduate STEM students at the University of Texas at Austin, in cooperation with the university's Inventors Program and the LBJ School of Public Affairs Intelligence Studies Project. The program presents students early in their STEM studies with real-world challenges that draw on their science training but require a multidisciplinary approach to define, assess and solve.
The student teams we worked with addressed food security; water security in rural areas (shifting water patterns can dramatically affect jobs, food security and migration); a system to identify early warning in the scientific literature of intent or capability to develop biological weapons; and the energy and infrastructure requirements and impacts for cities seeking to shift toward electric vehicles.
Each topic linked issues that Stratfor deals with (food security; terrorism; state-to-state conflict; environmental impacts on economic, social and political systems; energy supplies), placed them in the context of national security and policy, and asked for concrete solutions.
One of the first challenges for the students was to better define the question, identify its subcomponents, and then explore a solution that would likely end up being part of a matrix rather than a single all-encompassing answer. This mirrors the basic intelligence cycle, and in devising solutions the students faced factors well outside their traditional areas of study. They had to engage in multidisciplinary thinking, drawing on several fields to formulate a solution and to anticipate the new challenges it would raise.
One challenge, Red-Flagging Bio-Warfare Risks through Scientific Research Papers, asked students to establish a system to detect risky research in the scientific literature that might suggest intent or capability to develop biological weapons.
Kush Patel, a rising sophomore at the University of Texas at Austin who participated, said of the experience:
"I'm interested in the intersection of science and policy so this was a perfect fit."
The background: Advances in tools for biological manipulation (for example, CRISPR and its successors) are putting biological engineering within the reach of numerous academic institutions. While much of this research is for "good," the same capabilities can be used for nefarious purposes. The massive growth of scientific literature in multiple languages, and the collation and storage of that literature in databases, may provide a space for "early warning" of risky or dangerous applications of these tools. Machine learning and AI may be able to sort through the scientific literature and red-flag areas for deeper investigation.
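As a rough illustration of that last point, a first pass over paper abstracts might be nothing more than keyword screening that queues items for human review. The watchlist terms and threshold in the sketch below are illustrative assumptions, not a vetted biosecurity lexicon or anything the students built.

```python
# Minimal sketch of a first-pass screen over paper abstracts.
# The watchlist terms and the flag threshold are illustrative assumptions;
# a real system would layer trained models and human review on top.

WATCHLIST = {
    "gain of function",
    "aerosol transmission",
    "enhanced virulence",
    "host range expansion",
}

def screen_abstract(abstract: str, threshold: int = 2) -> bool:
    """Flag an abstract for deeper review if it mentions enough watchlist terms."""
    text = abstract.lower()
    hits = sum(1 for term in WATCHLIST if term in text)
    return hits >= threshold

# Example: two watchlist phrases appear, so the abstract is flagged.
sample = ("We report aerosol transmission experiments examining "
          "enhanced virulence in a seasonal influenza strain.")
print(screen_abstract(sample))  # True
```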
Patel says he and team members Nicole Garza, Nehaa Dambala and Nithin Valetti worked to define a red flag:
"So if some scientist is studying the flu for some reason but the results of their study might be a bit dangerous to publish ... Let's say some terrorist could create a new, better version of that flu because of that publication. So Rodger asked 'What's a way to find the type of research that you would not want published out there?'"
The team proposed a solution built on a framework created by the National Academy of Sciences, then modified it with a more quantitative analysis: 20 questions, each scored on a scale of 0-5 based on the response. Summing the points produces a total out of 100 (20 questions at 5 points maximum each) that indicates the relative level of risk.
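As a back-of-the-envelope sketch of how such a rubric works, the arithmetic is simply a sum of 20 per-question scores. The example answers and risk bands below are hypothetical, not the team's actual questions or thresholds.

```python
# Minimal sketch of the rubric arithmetic: 20 questions, each scored 0-5,
# summed into a 0-100 total. The example answers and risk bands are
# hypothetical, not the team's actual criteria.

def total_risk_score(answers: list[int]) -> int:
    """Sum 20 per-question scores (each 0-5) into a 0-100 total."""
    if len(answers) != 20:
        raise ValueError("the rubric assumes exactly 20 questions")
    if any(not 0 <= a <= 5 for a in answers):
        raise ValueError("each answer must be scored 0-5")
    return sum(answers)  # 20 questions x 5 points = 100 maximum

def risk_band(score: int) -> str:
    """Map the 0-100 total onto coarse risk levels (thresholds assumed)."""
    if score >= 70:
        return "high"
    if score >= 40:
        return "moderate"
    return "low"

# Example: a paper with mostly low scores and a few concerning answers.
answers = [1, 0, 2, 5, 4, 0, 1, 0, 3, 0, 0, 2, 0, 1, 0, 0, 4, 0, 0, 1]
total = total_risk_score(answers)
print(total, risk_band(total))  # 24 low
```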
Patel said he was inspired by the challenge:
"I decided to apply for a fellowship… and received a grant to continue this kind of research next summer."
Another challenge called for a different group of students to develop a system to detect and/or disable biological disease agents on human or animal vectors before they are able to infect food stocks.
CONTEXT: Food animal diseases, such as African swine fever and mad cow disease, periodically threaten food security in the United States and elsewhere. Such diseases can spread via human agency (movement of infected material, accidentally or intentionally) or through contact with wild animals (potentially how African swine fever recently spread across the DMZ in Korea). The use of human or animal vectors provides a path for terrorists or competing state actors to undermine domestic food production and security.
Aileen Hu, who worked on this problem with Henry Trentham, Megan Rae and Catherine Valdez, said:
"The biggest takeaway from this project is how difficult policy can be to make. It's very bureaucratic and there can never be a policy that will have no negative effects."
As a former STEM student myself, I recognize the value both of thinking outside a single field and of gaining experience working collaboratively across research areas. Foundational to our geopolitical process at Stratfor is a concept put forward by Sir Halford Mackinder in the late 1800s, when he was defending geography as a field of study:
"The more we specialize the more room and the more necessity is there for students whose constant aim it shall be to bring out the relations of the special subjects."
In short, while there is a need for specialization, there is also a need to stop and synthesize the discrete and disparate fields to see how they fit together. Animal disease is a biological issue, but it is also an economic and a sociological issue; it relates to terrorism and interstate conflict, to urbanization and transportation, and to detection technologies. A single-discipline approach may overlook potential solutions and fail to recognize unintended impacts.
Working with students and universities is one of the pleasures of the job, whether through the Inventors Program, sponsoring a capstone project or providing geopolitical analysis training.