A legal fight between Apple and the FBI is highlighting some critically important issues in the debate between privacy and security.

After the San Bernardino attacks, an iPhone 5C belonging to one of the two shooters, Syed Rizwan Farook, was among the evidence gathered by the FBI. The FBI obtained a warrant to search the contents of the iPhone, but Farook’s iPhone, like most people’s, is protected by a passcode that encrypts the data on the phone and prevents anyone without the code from accessing it.

Apple’s security systems are designed to thwart hacking attempts like “brute-force attacks” by imposing escalating delays after wrong passcode guesses and by offering an auto-delete function that activates after ten incorrect attempts. The FBI went to court to obtain an order compelling Apple to help it get around these security features and access the data on Farook’s phone.
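The two mitigations described above can be illustrated with a short sketch. This is a hypothetical model, not Apple’s actual implementation; the class name, the specific delay schedule, and the wipe behavior are all assumptions made for illustration.

```python
class PasscodeGuard:
    """Illustrative model of passcode rate-limiting and auto-wipe.

    Hypothetical schedule: the first four wrong guesses are free,
    then each further attempt must wait an escalating delay (seconds).
    """

    DELAYS = {5: 60, 6: 300, 7: 900, 8: 3600, 9: 3600}  # assumed values
    WIPE_THRESHOLD = 10  # auto-delete after ten consecutive failures

    def __init__(self, passcode: str):
        self._passcode = passcode
        self._failures = 0
        self._wiped = False

    def delay_before_next_attempt(self) -> int:
        """Seconds the user must wait before the next guess."""
        return self.DELAYS.get(self._failures + 1, 0)

    def try_passcode(self, guess: str) -> bool:
        if self._wiped:
            raise RuntimeError("device wiped: data is unrecoverable")
        if guess == self._passcode:
            self._failures = 0  # a correct guess resets the counter
            return True
        self._failures += 1
        if self._failures >= self.WIPE_THRESHOLD:
            self._wiped = True  # encryption keys destroyed; data lost
        return False
```

The order at issue asked Apple to neutralize exactly these two checks: remove the wipe threshold and the enforced delays, so that every possible passcode could be tried rapidly by machine.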

Court orders Apple to help the FBI; Apple refuses to comply

On Tuesday, Magistrate Judge Sheri Pym of the United States District Court for the Central District of California issued an order requiring Apple to provide “reasonable technical assistance” to help the FBI unlock Farook’s iPhone, including specifically instructing Apple to allow them to “bypass or disable the auto-erase function” and the required delay between passcode attempts.

Judge Pym’s order included instructions for Apple to object within five business days if it believed that “compliance with this Order would be unreasonably burdensome,” and that’s exactly what Apple did.

Apple CEO Tim Cook posted a statement on Apple’s website stating the company’s objection to what he described as the government’s “unprecedented step which threatens the security of our customers.”

Calling smartphones like the Apple iPhone an “essential part of our lives” that “store an incredible amount of personal information,” Cook wrote that Apple was “deeply committed” to safeguarding customer data:

Compromising the security of our personal information can ultimately put our personal safety at risk. That is why encryption has become so important to all of us.

For many years, we have used encryption to protect our customers’ personal data because we believe it’s the only way to keep their information safe. We have even put that data out of our own reach, because we believe the contents of your iPhone are none of our business.

Describing what the FBI wants as a “backdoor” to the iPhone, Cook argued that Apple intentionally built the iPhone with safeguards to protect customers’ data, and that such a backdoor would be “too dangerous to create.”

Should we really trust the government with this power?

That’s the question posed by National Review’s Kevin Williamson. “You know what would be better than prosecuting those who helped the San Bernardino jihadists?” wrote Williamson. The painfully obvious answer: “Stopping them.”

An arranged marriage to a Pakistani woman who spent years doing . . . something . . . in Saudi Arabia? Those two murderous misfits had more red flags on them than Bernie Sanders’s front yard on May Day, and the best minds in American law enforcement and intelligence did precisely squat to stop their rampage. Having failed to do its job, the federal government now seeks even more power — the power to compel Apple to write code rendering the security measures in its products useless — as a reward for its failure…

From the IRS to the ATF to the DEA to Hillary Rodham Clinton’s super-secret toilet e-mail server, the federal government has shown, time and again, that it cannot be trusted with any combination of power and sensitive information. Its usual range of official motion traces an arc from indifference through incompetence to malice.

Where the federal government imagines that it gets the power to order a private firm to write software to do its incompetent minions’ jobs for them is anybody’s guess. Tim Cook and Apple are right to raise the corporate middle finger to this nonsense.

What do you think? Should Apple fight the court order, or should it help the FBI access the data on Farook’s iPhone? In this fight between privacy and national security, where should the limits be?

Follow Sarah Rumpf on Twitter @rumpfshaker.