Modern economics has little room for parasites. In the vast majority of models, there are only buyers and sellers — there’s no one who just comes up and steals your money. In the real world, of course, there are parasites galore — thieves, con artists, fraudsters, extortionists and more. In the long term, the amount of parasitism in any system should depend on the cost of policing — if it’s easy for thieves to steal, there will be more theft.
On the internet, parasites are rampant. Email spam, identity theft and cyberespionage are some of the best-known examples. And, of course, there was the October 21 denial-of-service attack that made many prominent websites inaccessible.
Every year, billions of dollars are spent on cleaning the system of these bloodsuckers. That spending might add to gross domestic product, but in economic terms it’s social waste — in an ideal world, we wouldn’t have to use resources to stop parasites.
The most recent threat is ransomware. Basically, if you open the wrong file, a program locks the contents of your hard drive behind a wall of encryption. Only if you cough up the ransom will the thieves give you the key to get your data back.
In 2011, I was reading about ransomware in science fiction novels. This year, it might cost consumers as much as $1 billion (Dh3.67 billion). And the amount of money spent on protecting against this menace will probably be much higher than the amount actually stolen, since enough needs to be spent to keep the danger rare and the internet usable.
So far, the people who protect us from cybercrime have managed to stay one step ahead of the bad guys. But this could change with the advent of machine learning.
Most cybersecurity failures happen at the human level. A person clicks on the wrong email attachment or the wrong browser download, and malware bypasses the best cybersecurity software. That's because, despite all the advances in protective software, we still give human users the final say over what they do with their technology.
But machine learning — in particular, a recently popular technique called deep learning — changes that calculus. Just as deep learning managed to beat the best Go player in the world, artificial intelligence (AI) software available to thieves could learn to trick us out of our money — pretending to be a relative over the phone, say, to gain access to our online accounts.
When machines can outsmart humans, the only defence will be better machines.
That may require us to give up control over our own internet activities, because the more we’re allowed to make our own decisions, the more likely we are to be subverted by hyperintelligent software.
That could decrease the value of the internet as a thing that humans want to use for pleasure and work. It’s as if criminals suddenly got the technology to teleport past locked doors and enter any home they want.
A home would become a less enjoyable place to live, either because we’d have all our stuff stolen or because we’d have to spend enormous amounts of time and effort guarding it all the time.
Improvements in cybercrime technology could thus decrease the incredible consumer surplus that we get from the internet. Currently, consumers derive enormous free benefits from the online world — internet access is very cheap, even though most people would probably pay a lot of money to avoid getting kicked off their smartphones, social networks, web browsers and email accounts. Essentially, we're getting a free lunch.
In standard economic models, the free lunch of consumer surplus can last forever. But in the real world, parasites enter the equation. It might be that networks with very large consumer surpluses — such as the internet — tend to get parasitised until most of the surplus goes away.
That’s a possibility modern economics ignores, but it could be real. So far, we’ve been lucky. Spam filters and security measures have kept the internet almost free, while letting us use it easily and without much danger.
Machine learning in the wrong hands could change that equation. We might see soaring cybersecurity costs, decreased freedom of use or rampant danger from artificially intelligent criminals. Technology started this party, but if we're unlucky, technology might end it.
— Washington Post
Noah Smith is an assistant professor of finance at Stony Brook University and a freelance writer for finance and business publications.