
USA: Debate likely to continue into 2016 on companies providing info to law enforcement

Negotiation Street Signs

By Ed Silverstein, From Legaltech News

As members of Congress look to 2016, it is likely legislators will continue to debate proposals that would require tech and social media companies to provide information to law enforcement and intelligence officials – if suspected terrorism is involved.
One bill, introduced earlier this month by Sen. Dianne Feinstein (D-California) and Sen. Richard Burr (R-North Carolina), would require tech companies to quickly report content related to the planning of an attack, recruitment efforts or the release of terrorist-related documents.
“We’re in a new age where terrorist groups like ISIL are using social media to reinvent how they recruit and plot attacks,” Feinstein said in a statement. “That information can be the key to identifying and stopping terrorist recruitment or a terrorist attack, but we need help from technology companies. This bill doesn’t require companies to take any additional actions to discover terrorist activity, it merely requires them to report such activity to law enforcement when they come across it.”
“Social media is one part of a large puzzle that law enforcement and intelligence officials must piece together to prevent future attacks,” Burr added. “It’s critical that Congress works together to ensure that law enforcement and intelligence officials have the tools available to keep Americans safe.”
But the proposal is not uniformly supported.
For instance, Vivek Krishnamurthy, a faculty member at the cyberlaw clinic at Harvard Law School, called the bill “hopelessly vague.”
After all, he asked, what is the definition of “terrorist activity”?
“It’s not going to be effective,” he said about the proposal. According to Krishnamurthy, the legislation may be an effort by Congress to be “seen as doing something.”
“Neither the reporting process nor the scope of ‘apparent terrorist activity’ is defined in the bill, which could lead Internet companies to massively over-report their users’ personal information and communications to the government, just to ensure the company stays on the right side of the law,” said Emma Llanso, director of the Free Expression Project at the Center for Democracy and Technology. “This is an obvious threat to individual privacy and would have a significant chilling effect on people’s willingness to discuss important and contentious political, religious and ideological matters.”
“The bill also fails to provide any safeguards for the information that would be shared with the government under this proposal, and there is no apparent recourse for individuals who are erroneously reported to the government as possibly associated with terrorist activity,” she added.
Another issue is how much social media content is actually reviewed by a person at a large social media company such as Twitter. It is “minuscule” compared to all of the content being generated, Krishnamurthy said.
He explained that some kind of machine learning could, in theory, review all social media content by searching for certain terms. The risk, however, is that it would flag a great deal of content that merely mentions suspect words but in reality has no link to terrorism.
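To illustrate the point, here is a minimal, hypothetical Python sketch of that kind of keyword matching and why it over-flags; the watch list, sample posts and function name are invented for illustration, not drawn from any actual system.

# Hypothetical sketch of naive keyword-based flagging and its false positives.
WATCH_TERMS = {"attack", "bomb", "recruit"}

def flag_post(text):
    # Flag a post if any watch-list term appears, ignoring case and punctuation.
    words = {w.strip(".,!?").lower() for w in text.split()}
    return bool(words & WATCH_TERMS)

posts = [
    "That movie was the bomb!",                    # harmless slang
    "Our startup plans to recruit 50 engineers.",  # ordinary hiring news
    "Know the warning signs of a heart attack.",   # health advice
]

for post in posts:
    print(flag_post(post), "-", post)

# All three posts are flagged (True), even though none has anything to do
# with terrorism, which is exactly the over-reporting risk critics cite.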
Sen. Ron Wyden (D-Oregon) said in a statement, “Social media companies must continue to do everything they can to quickly remove terrorist content and report it to law enforcement.”
He also said he is opposed to the Feinstein-Burr bill. “I believe it will undermine … collaboration and lead to less reporting of terrorist activity, not more. It would create a perverse incentive for companies to avoid looking for terrorist content on their own networks, because if they saw something and failed to report it they would be breaking the law, but if they stuck their heads in the sand and avoided looking for terrorist content they would be absolved of responsibility.”
“If law enforcement agencies decide that terrorist content is not being identified quickly enough, then the solution should be to give those agencies more resources and personnel so they know where to look for terrorist content online and who to watch, and can ensure terrorist activity is quickly reported and acted upon,” Wyden added.
Recently, there were reports that terrorists involved in the Paris attacks may have used encrypted apps in the planning of the attacks.
Those reports have led to renewed efforts to give law enforcement officers access to encrypted communications when terrorism is involved.
But Apple CEO Tim Cook recently said in a 60 Minutes interview that users of a device like the iPhone may have health information, financial information, intimate conversations and business secrets on their phones.
“You should have the ability to protect it,” Cook said. He explained that if there is a way in, somebody will find it; “back doors” mean both the good guys and the bad guys can get in. Instead, if the government provides Apple a proper warrant, the company will legally provide investigators with what they want. Cook said that framing the issue as a conflict between privacy and national security is “overly simplistic” and that Americans “should have both.”
Krishnamurthy said that “encryption is fundamental to the way the Internet works. Communications are going to be encrypted. That’s a good thing …. You can never have perfect surveillance.”
IMAGE: iQoncept
For more on this story go to: http://www.legaltechnews.com/id=1202745445207/Debate-Likely-to-Continue-Into-2016-on-Companies-Providing-Info-to-Law-Enforcement#ixzz3v3yIXttI
