Steve Wozniak, co-founder of Apple, says that he’s “with Apple on this one.” The longtime coder, activist, thinker and philanthropist weighed in on the contentious court case during an interview on Conan, and was firm in his support of Apple’s position.
The FBI, he says, picked the worst possible case on which to push the tech industry: one where the likelihood of finding information is low, but the likelihood of creating long-term security risks for Apple customers is high.
“The two phones owned by the people that aren’t even convicted terrorists didn’t have one link to a terrorist organization (…) so they wanna take this other phone, that they, the two didn’t destroy, that was a work phone…that’s so lame and worthless to expect something’s on it.”
Apple’s concern, which Tim Cook ably outlined in an open letter, is that creating a backdoor for the FBI in this case would a) set a dangerous legal precedent, and b) weaken Apple security overall. Importantly, Cook points out that smartphones contain a tremendous amount of data on users’ lives: not just what they search for on the internet, but who they talk to and for how long, what their sleeping, eating and exercise patterns are, and where they travel. Apps passively and actively collect data on all aspects of users’ lives. Your photos contain metadata showing where and when they were taken. Your health apps hold information about your body that no one but you and your doctor needs to know. Cook says,
“The government would have us remove security features and add new capabilities to the operating system, allowing a passcode to be input electronically. This would make it easier to unlock an iPhone by “brute force,” trying thousands or millions of combinations with the speed of a modern computer.
“The implications of the government’s demands are chilling. If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.”
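To see why electronic passcode entry matters so much, consider a rough back-of-envelope calculation. The figures below are illustrative assumptions, not Apple’s actual numbers: a 4-digit passcode has only 10,000 possible combinations, so the real protection comes from iOS throttling repeated guesses, not from the passcode itself.

```python
# Illustrative sketch: why removing retry throttling and allowing
# electronic passcode input makes brute force feasible.
# All attempt rates below are assumptions for the sake of the estimate.

def time_to_try_all(digits: int, attempts_per_second: float) -> float:
    """Seconds needed to exhaust every numeric passcode of the given length."""
    return 10 ** digits / attempts_per_second

# Human typing with escalating lockout delays (assume ~1 attempt per 5 s):
throttled = time_to_try_all(4, 1 / 5)    # 50,000 s, roughly 14 hours
# Electronic input with no delays (assume ~1,000 attempts per second):
electronic = time_to_try_all(4, 1000)    # 10 s

print(f"4-digit passcode, throttled:  {throttled / 3600:.0f} hours")
print(f"4-digit passcode, electronic: {electronic:.0f} seconds")
print(f"6-digit passcode, electronic: {time_to_try_all(6, 1000):.0f} seconds")
```

Even a 6-digit passcode falls in minutes once the delays are gone, which is exactly the capability Cook describes the government asking Apple to build.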
Although public and political support for the FBI’s position has been strong, both Wozniak and Cook point out what is clearly precedent-setting here: a government agency asking a tech company to deliberately build insecurity into its customers’ data. It’s this aspect of the issue that is so important for tech and business leaders to understand. Like last year’s Ashley Madison and US Office of Personnel Management hacks, this case raises important questions about a) how secure our data really is, b) what incentives exist for tech companies, HR departments and governments to ensure data stays secure, and c) how we can balance personal security and privacy with the needs of criminal investigators.
HR departments sit on a wealth of employee data, both personal and financial, that has value for criminals and law enforcement agencies alike. Like smartphones, HR databases and your work computer contain more information than you might initially think – not just your social security number and banking details, but little details about your working and private life that, trust me, you want to keep private. And although absolute data security is impossible, it is incumbent on both the tech industry and HR to ensure the best possible data security under current conditions. As an industry, HR can’t work towards data security if the most fundamental tools of the workplace are insecure.
Watch Wozniak’s full comments on Conan here: