So you can’t wait for a self-driving car to take away the drudgery of driving? Me neither! But consider this scenario, recently posed by neuroscientist Gary Marcus: Your car is on a narrow bridge when a school bus veers into your lane. Should your self-driving car plunge off the bridge, sacrificing your life to save those of the children?

Obviously, you won’t make the call. You’ve ceded that decision to the car’s algorithms. You’d better hope you agree with its choice.

This is a dramatic dilemma, but it’s hardly an isolated one. The truth is, our tools increasingly guide and shape our behavior, and even make decisions on our behalf. By taking human judgment out of the equation, we’re slowly stripping away deliberation: the moments when we reflect on the morality of our actions.

Not all of these situations are so life-and-death. Some are quite prosaic, like the welter of new gadgets that try to “nudge” us into better behavior. In his new book To Save Everything, Click Here, Evgeny Morozov casts a skeptical eye on this stuff. He points to one recent example: a smart fork that monitors how much you’re eating and warns you when you’re overdoing it.

Fun and useful, you might argue. But for Morozov, tools like the fork reduce your incentive to think about how you’re eating, and about the deeper political questions of why today’s food ecosystem is so enfattening. “Instead of regulating the food industry to make food healthier,” Morozov says, “we’re giving people smart forks.”

Or as Evan Selinger, a philosopher at Rochester Institute of Technology, puts it, tools that make hard things easy can make us less likely to tolerate things that are hard. Outsourcing our self-control to “digital willpower” has consequences: use Siri constantly to get instant information and you erode your ability to be patient in the face of incomplete answers, a crucial civic virtue.

Efficiency isn’t always a good thing. Tech lets us do things more easily, but that can also mean doing them less reflectively. Sometimes tools should do the opposite: they should introduce friction. For example, new parking meters reset when you drive away, so another driver can’t draft off any remaining time. The city makes more money, obviously, but the design also forecloses a choice you might have made. What if a “smart” meter instead offered you that choice, letting you gift remaining time to the next driver or to the city? It would foreground the tiny moral trade-offs of daily life: city versus citizen.

Or consider the Caterpillar, a power strip that detects when a plugged-in device is in standby mode. Instead of turning the device off, the traditional efficiency move, the Caterpillar leaves it on but starts writhing. The point is to draw attention to your power usage, forcing you to turn the device off yourself and to reflect on why you’re using so much.

These are kind of crazy ideas, of course. They’re not tools that solve problems; they’re tools that make you think about problems, which is precisely the point.

Clive Thompson is a contributing writer at The New York Times Magazine. In 2002, he was a Knight Science Journalism Fellow at MIT.