Veteran software developer David A. Kruger offered some thoughts on computer security recently at Expensivity and we appreciate the opportunity to republish them here as a series. Yesterday, we looked at how online human data collectors get free from legal responsibility. Today we look at how the current system punishes small businesses for data breaches that they could not have prevented.
A Poke in the Eye
Furthermore, in the domain of unintended consequences, deterrence policies are based on the technological symptomatic point solution fallacy. Businesses are assumed to be negligent if they have a data breach. That’s correct in some cases, but in others, businesses, particularly small and medium-sized businesses, suffer increased compliance costs or have been bankrupted by data breaches that they had no ability to prevent. Basing deterrence policy and penalizing businesses on the mistaken belief that symptomatic point solutions can reliably prevent data breaches makes about as much sense as fining the pedestrian injured in a hit and run because they failed to jump out of the way.
It is wrong to punish businesses for harms caused by software makers, harms that the businesses themselves have no way to prevent. Punishing a business for data breaches provably caused by its own negligence is appropriate; punishing it for software makers’ negligence is not. Policymakers should distinguish between the two to avoid punishing the victim.
Possession is Nine-Tenths of the Law
The term “raw material” as applied to human data in this article is meant literally. Human data is “raw” at the point of collection. Raw human data has intrinsic economic value, but after it’s further processed by HDCs, its refined value is much higher. Think of an individual’s raw human data as you would crude oil, gold ore, or pine trees. Each has intrinsic economic value; none can be taken by oil, mining, or lumber companies without the landowner’s agreement. Raw human data is material because, as explained earlier, data is as physical as a brick—it’s just quantum small.
Wikipedia says that the saying “possession is nine-tenths of the law” means “ownership is easier to maintain if one has possession of something, or difficult to enforce if one does not.” The legal concept of ownership is predicated on an individual’s practical ability to control a physical thing’s use. Control enables possession, and possession codified in law confers legal ownership.
In law, possession can be actual or constructive. Actual possession means the thing is under your sole control. Constructive possession means a third party is permitted to use the thing, but your legal ownership is maintained, and usage is controlled by means of a contract. A simple example is a home that you own as your primary residence (actual possession) and a house that you own but lease to others (constructive possession). Constructive possession is especially relevant to data because data is usually shared by making a copy and transmitting it, not sending the original. Since a data owner would likely retain the original, it’s more appropriate to see shared data as having been leased, not sold.
Key Point: Controllable data enables constructive possession of data when it’s leased to others, and it enables software to objectively enforce both sides of the lease.
Key Point: If users legally own the raw human data that their own digital activities create, it’s reasonable for policymakers to assert that fiduciary duties apply to those who collect it; they are being entrusted with an asset they don’t own.
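To make the constructive-possession idea concrete, here is a minimal sketch of how software might objectively enforce both sides of a data lease. This is purely illustrative: the names (`DataLease`, `is_use_permitted`) and the lease terms are hypothetical, not drawn from any real system the author describes.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from typing import FrozenSet, Optional

# Hypothetical sketch: a "lease" attached to a piece of controllable data.
# The owner retains legal ownership; the lessee gets constructive possession
# under terms that software can check mechanically.
@dataclass(frozen=True)
class DataLease:
    owner: str                      # the individual whose raw data this is
    lessee: str                     # the collector granted constructive possession
    permitted_uses: FrozenSet[str]  # uses the owner agreed to
    expires: datetime               # lease term; after this, use must stop

    def is_use_permitted(self, requester: str, use: str,
                         now: Optional[datetime] = None) -> bool:
        """Enforce both sides of the lease: right party, right use, within term."""
        now = now or datetime.now(timezone.utc)
        return (requester == self.lessee
                and use in self.permitted_uses
                and now < self.expires)

lease = DataLease(
    owner="alice",
    lessee="ExampleCo",
    permitted_uses=frozenset({"order_fulfillment"}),
    expires=datetime.now(timezone.utc) + timedelta(days=30),
)
print(lease.is_use_permitted("ExampleCo", "order_fulfillment"))  # True
print(lease.is_use_permitted("ExampleCo", "ad_targeting"))       # False
```

The design point is that the lease terms are data the software itself can evaluate, so permitted use is checked objectively at the moment of use rather than relying on the collector’s good behavior after the fact.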
Next: The Easy Button
Here are the first ten segments in the series:
The true cause of cybersecurity failure and how to fix it Hint: The cause and fix are not what you think. David A. Kruger, a member of the Forbes Technology Council, says it’s getting worse: We’re in a hole so stop digging! Get back to root cause analysis.
What’s wrong with cybersecurity technology? Know your enemy: The target isn’t networks, computers, or users; they are pathways to the target: gaining control of data. The challenge: If a cyberdefender scores 1,000,000 and a cyberattacker scores 1, the cyberattacker wins, David Kruger points out.
Ingredients that cybersecurity needs to actually work Software makers continue to produce open data as if we were still living in the 50s and the Internet had never been invented. Forbes Council’s David Kruger says the goal should be safety (preventing harm) rather than, as so often now, security (reacting to hacks with new defenses).
Cybersecurity: Put a lid on the risks. We already own the lid. Security specialist David Kruger says data must be contained when it is in storage and transit and controlled when it is in use. Cyberattackers are not the problem; sloppy methods are. We must solve the problem we created one piece of data or software at a time.
The sweet science of agile software development Effective security, as opposed to partial security, increases costs in the short run but decreases them in the long run. Software veteran: Getting makers to shift their priorities to making products safer rather than shipping the next cool new feature will by no means be easy.
Computer safety expert: Start helping ruin cybercriminals’ lives. Okay, their businesses. Unfortunately, part of the problem is the design of programs, written with the best of intentions… First, we must confront the fact that software makers are not often held responsible for the built-in flaws of their systems.
The cybercriminal isn’t necessarily who you think… Chances are, the “human data collector” is just someone who works for a company that makes money collecting data about you. Did you know that his bosses have paid gazillions in fines for what he and his fellows do? Let’s learn more about what they are up to.
Sometimes, money really is the explanation. Today’s internet is a concentration of power, in terms of information, never before seen in history. The HDCs (human data collectors) treat us as guinea pigs in a thoroughly unethical experiment designed to learn how to manipulate the user most effectively.
How search engine results can be distorted Search providers such as Google are able to increase their ad revenues by distorting the search results delivered to users. Human data collectors (HDCs) have been able to evade responsibility for the preventable harms they cause by blame shifting and transferring risk to users.
How online human data collectors get free from responsibility Cybersecurity expert David A. Kruger talks about the Brave Old World in which you have much less power than Big Tech does. For Big Tech, government fines and other censures are merely a cost of doing business, which makes reform difficult at best.