You walk into your shower and find a spider. You are not an arachnologist. You do, however, know that any one of the four following options is possible:

A. The spider is real and harmless.

B. The spider is real and venomous.

C. Your next-door neighbor, who dislikes your noisy dog, has turned her personal surveillance spider (purchased from “Drones ‘R Us” for $49.95) loose and is monitoring it on her iPhone from her seat at a sports bar downtown. The pictures of you, undressed, are now being relayed on several screens during the break of an NFL game, to the mirth of the entire neighborhood.

D. Your business competitor has sent his drone assassin spider, which he purchased from a bankrupt military contractor, to take you out. Upon spotting you with its sensors, and before you have any time to weigh your options, the spider shoots an infinitesimal needle into a vein in your left leg and takes a blood sample. As you beat a retreat out of the shower, your blood sample is being run on your competitor’s smartphone for a DNA match. The match is made against a DNA sample of you that is already on file at EVER.com (Everything about Everybody), an international DNA database (with access available for $179.99). Once the match is confirmed (a matter of seconds), the assassin spider outruns you with incredible speed into your bedroom, pausing only long enough to dart another needle, this time containing a lethal dose of a synthetically produced, undetectable poison, into your bloodstream. Your assassin, who is on a summer vacation in Provence, then withdraws his spider under the crack of your bedroom door and out of the house and presses its self-destruct button. No trace of the spider or the poison it carried will ever be found by law enforcement authorities.

Gabriella Blum, from the essay “Invisible Threats”


Does this scenario sound like science fiction? According to Harvard Law Professor Gabriella Blum LL.M. ’01 S.J.D. ’03, “It’s the future.” In her essay “Invisible Threats,” Blum builds on themes from a joint book project with Benjamin Wittes of the Brookings Institution.

Blum’s interest in those themes developed when targeted killings involving drones began in the early 2000s. Unlike military aircraft, that kind of technology is available to everyone, she noted. “Realistically speaking, there is just no way to contain the technology,” she said. “Making the drones lethal is just the next step. … It’s going to be very hard to control.”

Her research has focused on how the proliferation of such technology will affect society. In the essay, she contends that “three features of new weapons technology—proliferation, remoteness and concealment—make violence more likely.” The nature of those threats, and how they should be addressed, became more visible during a roundtable discussion convened by Blum at HLS in February. Called “Technology and the Future of Violence,” it brought together representatives from the military—such as Gen. James Cartwright, former vice chair of the Joint Chiefs of Staff, and Richard Danzig, former secretary of the Navy—with experts from the fields of law, security, finance, and science and technology.

Although the possibly lethal spider scenario may seem futuristic, the discussion raised concerns about threats that are present in the world today and those we may face soon, particularly involving cyber and biological attacks. These threats are magnified by the fact that individuals and small groups may have the capacity to render more harm than ever before and from more remote locations. With available computer technology, nonstate actors could, for example, unleash viruses to disrupt faraway industrial systems and cause widespread damage. The capacity for biological terror has also increased, with greater access to deadly materials available outside regulated facilities. Even the cosmetics industry was cited as a possible avenue for biological weapons.

Of course, governments need to address these threats. Some fear that current efforts may be insufficient and that the law is not keeping up with fast-paced technological innovation. Sessions covered options for both domestic and international responses, including ways to improve international cooperation and whether surveillance rules are outdated. As one strategy to enhance defense, participants pointed to the possibility of governments and private companies sharing information about cyberthreats, although such efforts may raise concerns about civil liberties and privacy.

The proceedings will be catalogued in a white paper written by Susan Hennessey ’13, an editor on the Harvard National Security Journal, one of several students involved in the invitation-only roundtable. She said the event showed the importance of different people from diverse disciplines working together to address the problem.

“We need to develop response frameworks that anticipate technologies that don’t exist yet,” Hennessey said. “We’re going to have to change the way we think about these things.”

As for Blum, a native of Israel, she grew up conditioned to expect threats. But the stakes could become even greater, with increased vulnerability no matter where someone lives. If something like the spider story came true, and anyone could kill anyone else anywhere without getting caught, it would essentially mean the end of civilization as we know it, she said. Yet at the same time, technology can provide immeasurable benefits to society. In the end, said Blum, technology “will come with a certain level of threat. We’ll have to meet some of it, and get used to what we can’t handle and just accept that as a fact of life.”