> How things actually are is that you're dreaming if you think there is privacy on the internet. There hasn't been for a LONG time.

May I point you to the Chewbacca defense. No one has mentioned privacy on the internet; it is completely unrelated. Why not bring up privacy while walking down the street too? That's just as irrelevant. This is a discussion of a personal device recording things it doesn't need to, and then making them available counter to the whole point of the built-in encryption, which is supposed to protect the owner. It is about the fact that the tracked data is accessible outside those protections for no good reason. It's about the lack of encryption when the phone is locked. Basically, I point you back at my first post and the individual points I made.
Your arguments on this topic are very, very weak. I could trot out the thousands of vulnerabilities in closed source software as a counter, but it would be pointless.
Let's put it this way. Attackers find and use vulnerabilities whether they're documented or not; that's well known. If we document those vulnerabilities, we can protect against them. Documentation aids security. It doesn't reduce it.
> Your arguments on this topic are very, very weak. I could trot out the thousands of vulnerabilities in closed source software as a counter, but it would be pointless.

And there are about as many in open source. Let me improve my argument with a bit of empirical evidence showing that closed and open source are roughly equally vulnerable.
The worst thing, though, is to have good documentation and an open code base with no one reviewing and updating that code regularly. That is a recipe for disaster, as the documentation is just a gift to attackers (see Heartbleed), and quite a few open source projects suffer from this sort of neglect.
Last edited by seawolf; 23rd July 2014 at 12:55 PM.
> The worst thing, though, is to have good documentation and an open code base with no one reviewing and updating that code regularly. That is a recipe for disaster, as the documentation is just a gift to attackers (see Heartbleed), and quite a few open source projects suffer from this sort of neglect.

And that is not an argument against open source, or against avoiding security through obscurity. It is an argument for better project management and better code auditing. Plenty of closed source software suffers from similar neglect. That doesn't make security through obscurity a good thing.
My later comment, about obscurity combined with great security frameworks being the ideal, means that if you could take the community review of open source and combine it with the obscurity (and the money for developers and testers) of closed source, you'd have the ideal security. Disagree? It's impossible, yes, but it would be the ideal.
Anyway, I'm done with it. Edugeek is a minefield of bias. Bye.
So, how did that out-in-the-open thing work out for OpenSSL? Heartbleed, anyone? Obscurity plus good security frameworks is the ideal.
You can't have obscurity and open source; it's simply not possible, by definition. Therefore, the only other option is closed source. It's a simple piece of logic based on your own words.
> All I asked was if you preferred Apple to document how to hack the iPhone. As in, document all of the back doors and intentional vulnerabilities in iOS so that anyone and their brother could use them. That's what I said. Check it. Then see who jumped to conclusions here.

I want Apple to document all the parts of the phone and what they are doing, including the parts that track what you're doing and record data onto your device unencrypted. You've consistently ignored the actual points raised in my first post.
> My later comment, about obscurity combined with great security frameworks being the ideal, means that if you could take the community review of open source and combine it with the obscurity (and the money for developers and testers) of closed source, you'd have the ideal security. Disagree? It's impossible, yes, but it would be the ideal.

Your later comment is an extension of the comment I quoted above: if you are pro-obscurity then, by simple logic, you have to be anti-open source. You can't have an obscured open source project.
> Anyway, I'm done with it. Edugeek is a minefield of bias. Bye.

What is with people who simply can't present good arguments saying this lately? It's happened in three or so threads: people fail to state their case properly and then get upset when called on it.
Your posts are consistently pro-Apple, even in the face of industry standards, evidence and simple facts. It isn't bias to question Apple. It is bias to skip over the points raised and reach for arguments unrelated to the actual discussion in order to distract from the original issue.
So does this mean China knew something we are only just finding out? Seawolf says that both Apple and MS are closed source, and it's both of these companies that have been banned from government use in China. Makes me wonder if the reason they gave was different from the real reason.
Also, with so many people using and editing the Android code base, and Google offering the code online, wouldn't it be much harder to hide something like this?
When you have to have physical access to the device, and it has to be unlocked, it's not really a bombshell, is it?
Security expert rejects Apple, NSA, iOS backdoor claims « ComputerWorld
The Apple backdoor that wasn't « ZDNet

Security researcher/hacker Jonathan Zdziarski (aka "NerveGas") made the claims at the HOPE/X hacker conference, saying these "undocumented" services could be used by law enforcement. Typically, his story quickly became a cause célèbre among those who seek to damage Apple's robust reputation for security.
Apple swiftly rejected Zdziarski's accusations, pointing out that end users are in complete control of the claimed hacking process -- the person owning the device must have unlocked it and "agreed to trust another computer before the computer is able" to access the diagnostic data the claimed NerveGas attack focuses on.
In other words the NerveGas attack is a non-story. It's hot air.
Last weekend, a hacker who's been campaigning to make a point about Apple security by playing fast and loose with the now widely-accepted definition of "backdoor" struck gold when journalists didn't do their homework and erroneously reported a diagnostic mechanism as a nefarious, malfeasant, secret opening to their private data.
Speaking at the Hackers On Planet Earth conference in New York, Jonathan Zdziarski said that Apple’s iOS contains intentionally created access that could be used by governments to spy on iPhone and iPad users to access a user's address book, photos, voicemail and any accounts configured on the device.
As he has been doing since the Snowden documents started making headlines last year, Mr. Zdziarski re-cast Apple's developer diagnostics kit in a new narrative, turning a tool that could probably gain from better user security implementation into a sinister "backdoor."
The "Apple installed backdoors on millions of devices" story is still making headlines, despite the fact that respected security researchers started debunking researcher Jonathan Zdziarski's claims the minute people started tweeting about his HopeX talk on Sunday.

Regardless of the problems with Mr. Zdziarski's sermon, the (incorrect) assertion that Apple installed backdoors for law enforcement access was breathlessly reported this week by The Guardian, Forbes, Times of India, The Register, Ars Technica, MacRumors, Cult of Mac, Apple Insider, InformationWeek, Read Write Web, Daily Mail and many more (including ZDNet).
People were told to essentially freak out over iPhones allowing people who know the passcode and pairing information to use the device.
Nothing to hide, nothing to fear.
This is gonna happen forever, even if they say it's not happening.
If you want complete privacy, throw your laptop and phone into the river, delete as many online accounts as you can, and then go and live in the Scottish Highlands under a new name.
Hand written letters are permitted.
More the cry of common sense.
I am accepting that monitoring will occur, and that if you are not doing anything wrong under UK law, you have nothing to fear.
The cry of the stupid comes when something happens and those against snooping complain that nothing was done to stop an incident.
We work towards a just and fair society, but also a society which is secure.
Any person can yield to corruption; that will always be the case, and that is why more than one person is involved. Absolute power corrupts absolutely, so they say, and absolute power is not commonplace in these sorts of things these days.
Last edited by Alkaline; 28th July 2014 at 10:46 PM.