
The internet of things is the most human of digital identity technologies. Unlike cloud services, things can be touched, heard, smelled, and seen. They are real, like kittens and roads. Damian Glover wrote yesterday about institutional blockers to IoT adoption. Our interviews revealed personal resistance to IoT adoption, too.
Cultural blockers of IoT are classic fears in new clothes.
Governments worry about public safety, cyberwarfare, and consumer protection. Laws coming into force this year and next will mandate IoT security measures under new security standards.
But we were surprised by how widely IoT is experienced as the introduction of surveillance.
- Clinical technology as workplace surveillance. Hospital providers describe their frustration with connected technologies: it feels like their every motion is monitored and tracked, then used by managers to evaluate their speed and cost efficiency.
- Civic technologies as government surveillance. From Oakland’s corner traffic cameras sparking mass rallies to Boston Police trials and the NYPD’s robot dogs, civic IoT sits deep in the uncanny valley.
- Consumer technology as commercial surveillance. Alexa, Google, and Apple know too much about you and use that knowledge to sell adjacent services.
Why these feelings?
- Devices project power into physical spaces where people live and work.
- Devices are opaque: they hide what happens downstream with device data and upstream with device control.
- Devices don’t put nearby humans at the center of the experience. The “user experience” isn’t designed for them; it is designed by and for absent institutions. When did Amazon Alexa last ask for your consent as you walked into a room? When did Google Nest ask permission to send your picture to the cloud? What happened to the gigabytes of data produced during your colonoscopy? Who is looking and listening? What bots are judging your behavior or speech?

Ignore that man behind the curtain.
Toto: Look!
WiderPoV: Devices are feared and distrusted as proxies for our distrust of the people and organizations behind them.
Why are we feeling it now?
- Now, because the human-device experiences are more common, more connected, more intense.
- Now, because surveillance effects are better understood and more widely feared.
Slapping a friendly mask on devices isn’t going to work. So…
Ecosystem Agenda:
- Research via the social sciences into how design fosters trust and understanding through new human-computer interfaces.
- Hire human research specialists such as anthropologists, behavioral psychologists, and industrial designers.
- Explore user experience of biometrics, identity theater, labeling for disclosure, ambient and pervasive identity of things, identity of avatars and virtual things.
- Test approaches broadly, across regions, cultures, generations.
- Challenge consent assumptions to form new consent models that reflect emerging technical realities and diverse/conflicting public policies.
- Pilot frequently to learn. Share findings.
- Drive toward ecosystem-wide problem clarity.
- This is a collective problem with collective solutions. Engage allies, labor, supply chains, regulators, patients, and other end users. The sooner everyone agrees on the problem, the sooner we can work together.
- Climb the Higher Assurance Digital Identity of Things Maturity Model.
- Trusted interactions start with humans believing what they are told. Digital identity underpins that whole relationship.
- Your enterprise can take stock of its deployed quality and levels of assurance, and those of its partners. Climb from there.
If that goes well, put Wider On Call.