Anduril is Undermining American Security for Profit
I first heard about Anduril from direct messages I received on LinkedIn from recruiters looking to hire experienced software engineers. My current employer, JPL, grew out of the aerospace and defense industry that is so prominent here in Southern California. As a result, most people leaving JPL go to one of the other big defense contractors such as Lockheed, Boeing, or Raytheon. So when I heard about an unknown startup in this hyper-competitive industry, my interest was piqued.
Anduril is a relatively new startup that is making waves in the software and defense industries. Run by Palmer Luckey, the founder of Oculus, Anduril's ethos is an unflinching embrace of applying Artificial Intelligence (AI) to all sorts of military problems. Only a few years ago it was considered taboo for Silicon Valley venture-backed startups to openly partner with the DoD. Today, Anduril is bucking this trend by competing for top government contracts against the likes of Boeing, Lockheed Martin, and General Dynamics.
But Anduril isn’t cut from the same cloth as those old-school defense contractors. Whereas Boeing and Lockheed’s bread is buttered by contracts to build the next F-22 fighter jet or super heavy lift aircraft, Anduril is laser-focused on selling autonomous systems for military and security applications. Anduril’s publicly known products include an autonomous sentry tower now deployed along the U.S.–Mexico border, autonomous submersibles, and more. Anduril criticizes both its competitors and American policymakers for being asleep at the wheel by not embracing more AI in military applications. It claims that failing to develop and deploy these technologies is ceding the technological upper hand in future conflicts to states like China, Russia, and others that may not share the same ethical and practical qualms about building and using such systems in their militaries.
But what are people’s reservations about using more autonomous technologies in military applications? The reservations are many, and they vary depending on whom you ask. In an interview with Bloomberg, Luckey claimed that at least part of this reservation stems from employees of consumer technology companies not being told upfront that they would be working on military applications. These employees would see employers like Google or Amazon openly embracing such technologies as a “bait and switch” and a betrayal of the promise made to them when they were hired. Other objections stem from more nebulous moral concerns about purity and not wanting to be associated with the military-industrial complex in any way.
Putting aside the merit of these reservations, one thing that ties them together is that they are typically based in some moral, ethical, or emotional appeal: working on projects that kill people is evil, full stop. If this were the extent of the objections to Anduril’s crusade, then I would have to join Luckey in criticizing them as somewhat naive. All of us living in American society benefit from, and have our very livelihoods constituted by, acts of violence, no matter how abstracted away those acts remain. From teachers, to cops, to software engineers, to film directors, if you go back far enough, violence played a pivotal role in creating space for the civil society that makes all of our present-day lives possible. This truth, however inconvenient, undercuts any appeal to not “participating” in acts of violence simply by declining to work directly on the projects that enact them.
But what if there were a deeper, more grounded, much more practical reason that applying autonomous systems to military ends is and will always be a misguided venture? What if there were grounds to believe that Anduril’s tactical approach actually directly undermines American national security? Well, that’s exactly what is happening. The more autonomous, disconnected, and inhuman we make our military, the less safe Americans will feel and become.
What value does Anduril claim it is providing to the DoD? Largely, it claims to be modernizing America's aging national security infrastructure in order to better guarantee the safety and security of the United States and its citizens.
What is Anduril actually selling to the DoD? Yes, it is selling sophisticated hardware and software systems capable of performing all sorts of functions in all sorts of scenarios. But what is the fundamental value proposition it is offering the military? It is promising to make war cheaper. The reason there is so much drive for automation across every major industry, not just defense, is that the promise of automation is reduced costs. With more and more automation, fewer people need to be on the payroll to complete the same amount of work. But automation doesn’t merely reduce financial costs; it promises to minimize all sorts of costs: complexity costs, cognitive costs, time costs, and so on.
Once upon a time in American society, there was a deep connection between the general public and the military. Prior to today’s professional, all-volunteer fighting force, we had the idea of the citizen soldier: the idea that each and every citizen had a responsibility to contribute to, and even sacrifice for, the common defense. This was the foundation of selective service, or the draft. But this all changed after Vietnam, when we effectively abolished the draft and ushered in today’s age of the few doing the fighting for the many.
In the bygone age of the citizen soldier, nearly every person in American society was either in the military, a veteran, or had a close family member or friend connected with the armed forces. The consequence was that when the United States went to war, American society wasn’t just aware of it intellectually; it felt it viscerally. When war was declared, everyone understood, on a deep emotional level, that their closest loved ones could lose their lives in the execution of that war. This stands in radically stark contrast to today. It is not an accident that the war in Afghanistan was the longest continuous war the United States has been involved in. It is precisely because America didn’t feel this war in its bones the way it had with so many wars in the past.
This connection between American society and the armed forces had both moral and practical dimensions. Society was able to shoulder the moral responsibility of conducting war in a much more responsible and informed manner. Additionally, the knowledge that your own friends and family might die fighting in a war worked as an emotional bulwark against the country getting bogged down in unending conflicts abroad.
So then, what is Anduril’s effect on the relationship between the American people and the armed forces? Anduril effectively accelerates this alienation, further driving a wedge between those who reap the benefits of American empire and war, like myself, and those who actually enact that violence on our behalf.
This further hides the true cost of war: the cost in lives, tax dollars, and the mental torment experienced by our soldiers and drone operators. The more these costs are hidden and outsourced, the less skin we have in the game, and the easier it becomes for us to enact violence across the globe. This wanton use of violence for our own ends only emboldens our enemies and further alienates us from the rest of the global community. In the end, Anduril’s promise of increased national security is an outright lie. The more we move toward automation and AI in the conduct of war, the less secure we will become.