Entries Tagged 'Security'

Fraud Wasn't Even the Worst Part

By Timothy R Butler | Posted at 9:45 PM

People know I love Apple, but not everything is wonderful dwelling in the realms created by Cupertino. Here are my adventures in dealing with the aftermath of fraud on the Apple Card:

By now most of us know the unpleasant drill. The credit card company calls or texts you and says there appears to be an unauthorized purchase. Somehow, that happened (as near as I can tell, merely coincidentally) on three of my accounts within a week in July. One has turned into a continuing pain months later: Apple Card. Some of the fault lies with the card itself, but the greater fault lies in a weak bit of design in the Apple platforms I otherwise love.

52 Verses, 52 Books, 52 Weeks (Week 40: 2 Chronicles)

By Timothy R Butler | Posted at 7:30 PM

This week, I turn to 2 Chronicles to think about what we learn from Kings David and Solomon on where we should put our confidence.

Zoom is Past Three Strikes...

By Timothy R Butler | Posted at 4:14 PM

Here’s more motivation to consider Microsoft Teams, Skype, Apple FaceTime, Facebook Messenger, etc., in lieu of Zoom. Days after the company was caught, for the second time within a year, using the same tactics as malware to install its software on computers, and days after it turned out to be leaking recorded calls online, Zoom has also admitted to routing calls “accidentally” and insecurely through China. Facebook isn’t the epitome of privacy and security, but Facebook Messenger is end-to-end encrypted; Zoom is not.

Zack Whittaker for TechCrunch:

Hours after security researchers at Citizen Lab reported that some Zoom calls were routed through China, the video conferencing platform has offered an apology and a partial explanation.

To recap, Zoom has faced a barrage of headlines this week over its security policies and privacy practices, as hundreds of millions forced to work from home during the coronavirus pandemic still need to communicate with each other.

Limiting the All Writs Act

By Timothy R Butler | Posted at 5:31 AM

A very encouraging ruling today in New York concerning the All Writs Act and the government's desire to force Apple to sabotage its security model:

“Apple is not doing anything to keep law enforcement agents from conducting their investigation. Apple has not conspired with [the defendant] to make the data on his device inaccessible,” the judge wrote. “The government's complaint is precisely that Apple is doing nothing at all.”

The judge also offered an opinion, which I believe is correct, on why the government would try to accomplish this through the courts rather than through new legislation:

“It is also clear that the government has made the considered decision that it is better off securing such crypto-legislative authority from the courts…rather than taking the chance that open legislative debate might produce a result less to its liking,” he wrote.

I fear such legislation could easily pass in our current political climate, which values security more than liberty, but at least open legislation would be more challenging for the government than quietly moving this through the courts, further away from the spotlight.

Why Anyone Who Loves Freedom Needs to Support Apple

By Timothy R Butler | Posted at 6:52 PM

I tweeted this article about the severe implications of the government's request for a backdoor in Apple's products and included the comment “Imagine the 1st missionary killed in a hostile land, found via an FBI mandated backdoor. This is why Apple is right.” A friend of mine asked me on Facebook why it is so crucial Apple not be forced to create a system that would allow the unlocking of the San Bernardino terror suspect's phone. I want to answer my friend's question by exploring two different parts of the problem.

This all starts with Apple creating an encryption system that it did not have the key to unlock. After the revelations about the NSA that Edward Snowden released, Apple created such a system for a very simple reason: it became clear that the government intended to vastly exceed its constitutional surveillance powers, and the only way a company like Apple could avoid becoming a collaborator was to remove itself from the key equation so that it genuinely could not access customer data. If a company has the key, the government can demand that key, not only to see what a terrorist has on his or her phone, but also for other, less desirable searches, like the warrantless, broad data collection the NSA has conducted over the last decade. Worse, when the government utilizes these unconstitutional powers, it imposes gag orders on the companies it interacts with, so they cannot even say anything about what is happening.
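
To make the “key equation” point concrete, here is a minimal sketch in Python (using the `cryptography` library) of the general approach that paragraph describes. This is not Apple's actual implementation, which involves dedicated hardware and is far more involved; the passcode, salt handling, and data below are all illustrative assumptions. The idea is simply this: when the encryption key is derived on the device from the user's passcode, the provider never holds a key it could be ordered to surrender.

```python
import base64
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives.hashes import SHA256
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC


def derive_key(passcode: bytes, salt: bytes) -> bytes:
    """Stretch the user's passcode into a symmetric key; this runs only on the device."""
    kdf = PBKDF2HMAC(algorithm=SHA256(), length=32, salt=salt, iterations=480_000)
    return base64.urlsafe_b64encode(kdf.derive(passcode))


# On the device: encrypt before anything leaves the user's hands.
salt = os.urandom(16)                             # stored alongside the data; not secret
key = derive_key(b"hypothetical-passcode", salt)  # exists only in device memory
ciphertext = Fernet(key).encrypt(b"private customer data")

# The provider only ever sees `salt` and `ciphertext`. Without the passcode,
# it has no key to hand over, no matter what a court demands.
assert Fernet(key).decrypt(ciphertext) == b"private customer data"
```

Under a design like this, the only way for the provider to open the data is to change the software itself, which is precisely what the government's request to Apple would force.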

It bears repeating: while there is broad support for breaking into a terrorist's phone, the only way Apple can legally avoid being made a tool for the government against all of us, not just terrorists, is to create a product that does not have a backdoor. So Apple did the logical thing: it created a product without one. Apple is now being asked not just to “unlock” the phone, but to create a new version of its software with an intentionally broken security system. Once that version exists, even if it is installed on only this one phone, we will be only a few secret FISA orders away from it being installed on thousands or millions of phones. If an iOS variant that creates a vulnerability exists, the NSA could simply contact Apple six months from now and order that same backdoor included in every iOS device the next time a software update goes out. And it could gag Apple so that the company could not warn anyone.