Data In A Fallen World

Something I wrote five years ago about the problems of so much data being collected by organizations that can't really be trusted.


Can one person be trusted with power over another?

In answering that question, a great deal hinges on the presuppositions we make about the trustworthiness of other human beings. Our generation is far from the first to ask this question, but we are grappling with it in the midst of technology-driven changes to the nature of what constitutes power. We are also in the midst of losing any common cultural consensus regarding the essential nature of man.

I was thinking about this on the treadmill this morning because my "cousin-in-law", Patrick Wilson, posted a link to this article about a creepy sort of wifi Barbie that records a child's voice and sends it for processing, I suppose, to the official Barbie data center. (If Barbie now has a data center, Silicon Valley has officially jumped the shark.)

As an engineer, I entirely understand that the design of wifi Barbie is driven by physics and the current limits of portable power. Sufficient power exists in a small package to record and forward the audio, but the computational power to process the audio in meaningful ways far exceeds anything that could be put into a package a little child could lug around. So, from an engineering perspective, the motivations are probably entirely innocent. But Patrick raises a proper question in asking whether Barbie's maker can be trusted with all of this recorded audio. The engineering choices may have been innocent, but the happy side effect, from the toymaker's perspective, is that they will have unprecedented insight into your child's private play life.

The technological shift that is taking place, where power over human beings is concerned, is rooted in the massive scale of data collection enabled by the Internet. Most people do not understand that data carries with it unusual properties in that, as the scale of data grows, the qualitative aspects of what can be done with that data change dramatically. What I mean to say is that with more data you don't just have quantitatively more data, but the essence of the whole is fundamentally different at sufficient scale.

One of the earlier companies to recognize this was Google. They discovered that when data sets reached a certain size, you were suddenly able to solve very thorny problems with much simpler approaches to computation. Sufficient data presents the opportunity to implement solutions based on statistical analysis rather than actual understanding. That may sound strange, but consider the fact that Google Translate doesn't actually translate anything; it performs word and phrase substitution based on the statistical frequency with which phrases co-occur in an enormous body of pre-translated documents.
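To make that concrete, here is a minimal sketch of translation-by-statistics. The corpus, the scoring rule, and every word pair in it are invented for illustration; real systems use far more sophisticated phrase alignment over billions of sentence pairs. The point is simply that nothing in the code "understands" either language, yet with enough aligned data it produces passable substitutions:

```python
from collections import Counter, defaultdict

# A toy parallel corpus standing in for the "enormous body of
# pre-translated documents". (Invented pairs, purely illustrative.)
parallel_corpus = [
    ("the house", "la maison"),
    ("the blue house", "la maison bleue"),
    ("the car", "la voiture"),
]

# Count word co-occurrences across aligned sentence pairs.
cooccur = defaultdict(Counter)   # cooccur[source_word][target_word] -> count
tgt_freq = Counter()             # how often each target word appears at all
for src, tgt in parallel_corpus:
    tgt_words = tgt.split()
    tgt_freq.update(tgt_words)
    for s in src.split():
        for t in tgt_words:
            cooccur[s][t] += 1

def translate(sentence):
    """Substitute each word with its most strongly associated target word.
    No grammar, no meaning -- only co-occurrence statistics."""
    out = []
    for word in sentence.split():
        if word in cooccur:
            # Normalize the co-occurrence count by overall target frequency,
            # so ubiquitous words like "la" don't win every substitution.
            best = max(cooccur[word],
                       key=lambda t: cooccur[word][t] / tgt_freq[t])
            out.append(best)
        else:
            out.append(word)  # unknown words pass through untranslated
    return " ".join(out)

print(translate("the car"))       # -> "la voiture"
print(translate("the blue car"))  # -> "la bleue voiture"
```

Note the second result: the words are right but the order is wrong, because the system has no notion of grammar, only frequency. That is exactly the "statistics rather than understanding" trade-off described above.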

We are reaching the point where an entity's actual power is closely related to the amount of data it retains and its ability to turn that data into actionable information. Edward Snowden's revelations should, if nothing else, remove any illusions about how those with their hands on the levers of power view the criticality of having our private information.

The problem is not, of course, limited to Barbie. Everything from thermostats to power meters, televisions, and appliances is now reporting private information back to its manufacturer. Google's purchase of the Nest thermostat business ought to give us pause in this regard. Google is in the data business. They rightly perceive that thermostats are a means to that end. If you use Google Maps, Google already knows what restaurants you frequent, what brand of gasoline you buy, and who your friends are. If you don't use Google Maps, Google at least knows by the behavior of your thermostat whether you're home or away. They also know whether you're properly complying with their environmental preferences regarding the way you run your own home. If you use Gmail, Google knows every purchase you make online and, for many people, Google knows every time you withdraw money from the bank.

Now it may seem like I'm picking on Google, but I'm really not. They just happen to be the most egregious collector of private data, along with Facebook (with Amazon a distant third). But make no mistake: every connected device you own is going to be forwarding information about you to its manufacturer. Some of it will be of benefit or convenience to you, but we should be under no illusions about the potential for abuse.

Which gets me back to my original question: can any person be trusted with power over another?

America's founders held to the tragic, fallen view of man regarding his wielding of power. They believed that no man was trustworthy enough to wield unconstrained power. And even though they recognized the need to entrust the administration of justice to flawed human beings, they developed an elaborate system within which no one was given power who wasn't at the same time shackled by the countervailing power of another entity. And one might even argue that the entire jury system was devised as a last-ditch bulwark for the people to counteract the abuses of those entrusted with governmental authority.

Now, someone may suggest that we have less to worry about with these large companies having our data because, unlike the government, they do not have a legitimate claim to the exercise of lethal force. The problem with this line of thinking is that it presupposes a sharply distinct dividing line between the government we have and the interests of these companies. But both Google and Facebook are powerful enough to shape our perceptions of informational reality. They are able, if they so choose, to skew our understanding of candidates, suppress or expand voter turnout, and generally have an influence on electoral outcomes.

And it is no comfort to say that, well, these are American companies. They're actually not American in any elevated sense of the word. They are legally, or perhaps legalistically, American. But in substance they are multi-national companies without clear allegiances outside of themselves. Their employee base is largely international, and they are without the kinds of concerns for America's welfare that people sort of just presume exist within American companies. (As an aside, I have long thought that if I were al Qaeda or ISIS, I would be planting engineers inside these companies to gain access to the data they possess. Such information can be used for a great deal of mischief. From wreaking economic havoc to tracking the physical location of stateside American military personnel, all manner of trouble could be made if a person with access to this data was so inclined.)

There isn't some vast conspiracy related to the collection of data. Technology has simply evolved to the point at which this kind of collection is practically feasible. The technology is outpacing the moral considerations and, sad to say, many of the people building the technology are not operating within a Judeo-Christian moral context.

One of the tactical things that bothers me about the vast Hoovering up of so much data is the lack of disclosure and the lack of visibility we each have into what's being reported. Companies are largely operating in the shadows with their data collection, politicians understand little about what's going on, and you and I have data flying out of our private lives that we don't even know about. It seems to me that, more than anything, we need ways of shining a light on what is going on in the shadows.

Technology is just a tool. For every harmful use, there is usually a corresponding technology that can provide some sort of self-defense. Lots more to think about.

Written by

Keith Lowery

Follower of Christ. Husband. Father. Grandfather. Maker. Consumer of Data. Reader of Books. Writer of Code.


Tags: Technology, Culture