Princeton academic puts forward model for greater accountability.
Princeton academic Edward Felten has urged the information security and wider technology community to engage with intelligence agencies and governments in a constructive way to drive better privacy outcomes, even if it requires active participation in intelligence activities.
Felten, director of the Center for Information Technology Policy at Princeton University, used his keynote address at the AusCERT conference this morning to offer attendees a five-step plan for restoring trust in computing in the wake of Edward Snowden’s exposé of the NSA’s online espionage activities.
The professor argued that NSA overreach – driven by the idea that the US and its allies are safer if the agency “collects as much info as legally and technically possible”, even if that means building exploits and backdoors into systems and standards – had created an environment of “pervasive insecurity”.
He agreed with fellow speaker Felix Lindner (head of Recurity Labs) that a drying up of public information about vulnerabilities, again in the name of ‘national security’, was a dangerous trend.
The current response from the information security community – to simply throw more kit at protecting its perimeter – was no longer effective.
Felten gave five key strategies to restoring trust in security:
Strategy #1 – Take trust seriously
Felten used HTTPS as an example of where ‘trust’ is misplaced on the internet. The cascading chain of certificates in which browsers place their trust rests on the idea that “somebody made a list of 100 entities trusted by everyone in the world”.
“There is no entity, certainly not 100, that everyone in the world truly trusts,” Felten argued. “But that’s the basis on which this system is built.”
The number of forged SSL certificates Carnegie Mellon researchers found in the wild [pdf] showed that the trust placed in this model needs to be revised, he said.
The technology that underpins HTTPS – crypto – is effective, he said, but the system suffers from “horrible institution design”.
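Felten’s “100 entities” point is easy to see on any machine: Python’s `ssl` module can list the root authorities a default HTTPS client trusts unconditionally. A minimal sketch (the exact roots and their count vary by platform and trust store):

```python
import ssl

# Load the platform's default trust store, as an HTTPS client would.
ctx = ssl.create_default_context()
roots = ctx.get_ca_certs()

# Every connection made with this context is ultimately anchored in this
# one flat list of root authorities, each trusted without qualification.
print(f"{len(roots)} root certificates trusted unconditionally")
for cert in roots[:3]:
    # Each entry is a dict of certificate fields; show who the root claims to be.
    subject = dict(pair[0] for pair in cert["subject"])
    print(subject.get("organizationName"))
```

Any certificate chaining up to any one of those roots is accepted, which is exactly the institutional weakness Felten describes.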
Strategy #2 – Force adversary to target
Felten argued that – assuming users consider the NSA or indeed Australia’s intelligence community to be “adversaries” – the IT security community needs to make it harder for these agencies to hoover up communications en masse.
“We need to exploit the scale problem an all-seeing adversary has,” he said.
He advocated the use of simple puzzle encryption when sending messages. This would encrypt a message such that it can only be unlocked by the recipient after solving a hash puzzle.
A deliberate actor need only devote half a CPU-second to solve the puzzle, but the more communications are secured in this way, the more computationally expensive it becomes for intelligence agencies to hoover up all traffic, and thus the more attractive it becomes to restrict surveillance to legitimate targets.
“It’s not as good as end-to-end encryption of our messages, of course, but in many real world scenarios it is either impractical or we simply can’t encrypt communications in this way.”
Felten urged IT administrators not to make any promises to users when employing such simple techniques, but to simply offer it “as a matter of course, silently, as a defence against non-targeted surveillance”.
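The scheme Felten described can be sketched in a few lines. This is an illustrative toy, not his actual construction: the sender derives the encryption key from a short random puzzle value and then discards it, so the recipient must brute-force the puzzle once per message. The 16-bit difficulty and the SHA-256 XOR keystream are assumptions for illustration, not a vetted cipher.

```python
import hashlib
import os

PUZZLE_BITS = 16  # hypothetical difficulty: up to 2**16 key guesses per message

def _keystream(key: bytes, length: int) -> bytes:
    # Stretch the key into a keystream by hashing key || counter (toy cipher).
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def seal(message: bytes):
    # Sender picks a short random puzzle value, derives a key from it,
    # encrypts, and throws the puzzle value away.
    puzzle = int.from_bytes(os.urandom(2), "big") % (1 << PUZZLE_BITS)
    key = hashlib.sha256(puzzle.to_bytes(4, "big")).digest()
    check = hashlib.sha256(key).digest()[:8]  # lets a solver recognise the right key
    cipher = bytes(a ^ b for a, b in zip(message, _keystream(key, len(message))))
    return check, cipher

def solve(check: bytes, cipher: bytes) -> bytes:
    # Recipient brute-forces the puzzle: cheap for one message,
    # expensive for an adversary trying to decrypt everything.
    for guess in range(1 << PUZZLE_BITS):
        key = hashlib.sha256(guess.to_bytes(4, "big")).digest()
        if hashlib.sha256(key).digest()[:8] == check:
            return bytes(a ^ b for a, b in zip(cipher, _keystream(key, len(cipher))))
    raise ValueError("puzzle not solved")

check, cipher = seal(b"meet at noon")
print(solve(check, cipher))  # b'meet at noon'
```

Decrypting one message costs a fraction of a second; decrypting millions per day multiplies that cost accordingly, which is the scale problem Felten wants to exploit.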
Strategy #3 – Improve agencies’ systems
Felten’s most controversial advice was for the industry to engage with governments and actively participate in intelligence gathering activities, in order for civil liberties and privacy to be better considered.
He advocated a recommendation, made by the US President’s review of signals intelligence in the wake of the NSA scandal and endorsed in principle in President Obama’s response, that co-opts telecommunications companies into the intelligence community’s data retention programs.
“We recommend that legislation should be enacted that terminates the storage of bulk telephony metadata by the government under section 215, and transitions as soon as reasonably possible to a system in which such metadata is held instead either by private providers or by a private third party. Access to such data should be permitted only with a section 215 order from the Foreign Intelligence Surveillance Court that meets the requirements set forth in Recommendation 1.”
– Recommendation No. 5, ‘Liberty and Security in a Changing World’ [pdf]
While it is “certainly not the ideal solution”, Felten said there were pros to having telcos and hosts act as a network of data custodians, with that data available via a query interface to intelligence analysts under strict conditions.
“One of the advantages of retaining data in the telecom providers is that it offers better visibility to the public, who can better influence providers on how long they keep the data.”
From a computer science perspective, an intelligence system should be designed to optimise performance, cost and reliability, but Felten added a fourth optimisation currently missing in the intelligence community’s systems today – oversight.
The system should ideally avoid the replication and aggregation of data and be designed with accountability in mind, he said.
A distributed network of telcos would best meet these design principles, he said.
Opponents of telco data retention – including the other keynote speaker at AusCERT’s first day, Recurity Labs’ Lindner – argued that so long as data retention is a “cost centre” for telcos, it will be the least protected data they hold.
“If data retention becomes a cost centre, maybe [the telcos] will in the very least join the discussion as to whether these intelligence activities happen at all,” Felten retorted.
Strategy #4 – Changing the debate
Felten said the technology community needs to shift the debate – which today talks of a trade-off between security and privacy in the name of national security – to a discussion about accountability and transparency.
Technologies need to be developed to provide the same accountability in the online world that a simple warrant provides in the physical world, he said.
Felten’s students and Microsoft Research have both been working on ways to implement court-ordered access to an individual’s information in a secure way.
“You want to make sure that information is released only if there is a valid warrant or court order and yet at the same time there is the need to keep that court order secret so the targets are not tipped off,” Felten told SC in the lead-up to his keynote.
“We have researched how to use cryptography to get all of the desired properties so that there is no access to information without a valid court order, and yet the court order is cryptographically sealed so that the target cannot see it.”
By using encryption in this way, he said, computer science can be used to combine goals that would otherwise be impossible to achieve at the same time.
The Microsoft Research model argues that telecommunication providers should encrypt every customer or transaction record with a random key, and send the encrypted database to the intelligence community.
It is only when a court publishes a request for specific data that the parties engage in a secure multi-party computation, under which the system verifies that the warrant matches the account ID the intelligence agency is asking for, and provides the agency with a decryption key for that account.
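The data flow of the Microsoft Research model can be sketched as follows. This is a toy illustration, not the published protocol: in particular, the real design replaces the plain `court_release` function below with a secure multi-party computation so no single party holds all the keys, and the XOR “cipher” stands in for real encryption.

```python
import hashlib
import os

def _stream(key: bytes, n: int) -> bytes:
    # Stretch a key into a keystream by hashing key || counter (toy cipher).
    out = b""
    i = 0
    while len(out) < n:
        out += hashlib.sha256(key + i.to_bytes(4, "big")).digest()
        i += 1
    return out[:n]

def xor(key: bytes, data: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, _stream(key, len(data))))

# Telco side: encrypt every record under its own random key and
# ship only the ciphertexts to the intelligence community.
records = {"alice": b"call log A", "bob": b"call log B"}
keys = {acct: os.urandom(32) for acct in records}
encrypted_db = {acct: xor(keys[acct], rec) for acct, rec in records.items()}

# Court side: release a key only when the warrant matches the request.
def court_release(warrant_acct: str, requested_acct: str) -> bytes:
    if warrant_acct != requested_acct:
        raise PermissionError("warrant does not cover this account")
    return keys[requested_acct]

# Agency side: holds the whole encrypted database but can decrypt
# only the single warranted account.
k = court_release("alice", "alice")
print(xor(k, encrypted_db["alice"]))  # b'call log A'
```

Because every record carries its own key, possession of the database grants the agency nothing; each decryption requires a fresh, court-checked key release, which is what makes oversight auditable.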
Felten’s team is looking to improve on this model in the name of accountability, narrowing the range of communications covered and suggesting more robust encryption that would be harder for intelligence agencies to break when an agent cannot obtain a warrant.
Strategy #5 – More public engagement
Felten said these ideas need to be taken to governments – even if that means IT security staff “donning their suits” and travelling to the capital to meet with lawmakers, agencies and regulators to build political support for more accountability.
“That opportunity is starting to open,” he said.
He was buoyed by US director of national intelligence James Clapper’s reflection that offering more transparency about the NSA’s programs from the beginning would have reduced the impact of the Snowden leaks.
“This is the window opening for us as security professionals and citizens to help governments recognise the public interest we are advocating for.”