Getting to Know Ryan Gariepy

Interviewed by Jefferson Morley

Ryan Gariepy, shown in this 2014 photo, is co-founder of Clearpath Robotics. (Courtesy of Ryan Gariepy)

"Getting to Know" is an occasional series that introduces Arms Control Today readers to interesting people active in the world of arms control.

Many people are worried about the prospect of lethal autonomous weapons systems, popularly known as "killer robots," entering the battlefields of the world's many armed conflicts. But not many of the critics of lethal autonomous systems know how to design and build high-tech robots. Enter Ryan Gariepy, the 28-year-old co-founder and chief technical officer of Clearpath Robotics, based in Kitchener, Canada. Last year, the fast-growing firm, with a workforce of 100, established a company policy of not accepting any work related to building robots for lethal autonomous weapons systems. In June, Gariepy was among 1,000-plus people from the world of artificial intelligence research and design, including famed physicist Stephen Hawking and inventor Elon Musk, who signed a statement calling for a ban on lethal autonomous systems. Gariepy is perhaps the first high-tech entrepreneur to make that position part of his business plan.

Arms Control Today caught up with Gariepy by telephone at his office in Kitchener. The interview, conducted by Jefferson Morley, has been edited for length and clarity.

Where did Clearpath come from?

From the University of Waterloo, a very strong engineering and math school in Ontario. Three of us were on a student robotics team, and we just saw a lot of potential in the things we were building. So after we graduated, we started a company.

It says on your website “The Future is Autonomous.” What are you getting at there?

The promise of unmanned systems, or autonomous systems or artificial intelligence systems, is to really free people from what is known, in the robotics community, as “dull, dirty, and dangerous” jobs.

Is the future of warfare autonomous?

We recognize the value that unmanned systems can bring to service members and keeping people out of harm's way. The question is, can we not make sure that there is always an element of human involvement when a system decides to use lethal force?

How did you get interested in this issue?

We are always out there trying to extol the benefits of autonomous systems to companies and governments worldwide. But sometimes you run into a use case where you say, "Maybe this isn't the best place for robotics." It might not make business sense; or, as in this case, it may not be a humane thing to do, and it may be an incredibly risky way to deploy autonomous systems. Just as we look at the positive impact of our systems, we also look at the negative impact.

We saw the issue coming up more and more. In 2012 there was the "Losing Humanity" report by Human Rights Watch. Before that, you had Peter W. Singer's book "Wired for War." Earlier still, you had Ron Arkin's textbook, "Governing Lethal Behavior in Autonomous Robots."

We are also seeing that robotics research is no longer being primarily driven by the military. People are approaching the issue as if governments have the most say over what research is done. But robotics these days is quickly being dominated by private industry.

So if governments do not recognize the potential, good or bad, of this technology, then the technology is going to exist in a short time frame, certainly shorter than the time it takes the international community to act.

Does Clearpath have contracts with the U.S. or Canadian military?

We do work with the Canadian government and with the research arms of the U.S. Navy, Army, and Air Force.

What do you think of the term “killer robots”?

The term certainly brought the public’s eyes to the issue. The impact of that should not be underestimated. Personally, I prefer to say “lethal autonomous weapons systems,” but that’s because I’m an engineer.

What has been the reaction to Clearpath’s position on lethal autonomous systems?

Many of the skilled people, the engineers applying for jobs with us, have expressed interest in the issue, and we weren't expecting that. We do expect that the company might lose some military contracts at some point. But we didn't expect the level of public support we've gotten.