I was with you until the end, where you start pretending that Apple's guidelines are capable of preventing malware. Take a look at the kind of stuff they reject apps for and tell me this is a process that has anything to do with security.
I mean, remember just a few days ago, when a 15-year-old released a tethering app right under Apple's nose? Remember a year ago when everyone was outraged because an iPhone app covertly uploaded your whole address book to the developer's server? This Android app is arguably less of a violation than that one. They only check the visible behavior of your program and whether you're using private API. As long as you don't throw up a dialog that says "DURRR, I'M STEALIN UR INFO NOW MAN," Apple isn't really looking for you.
It wasn't a true "tethering" app. It was just a flashlight app that also opened a SOCKS proxy within the app. It's quite possible that if the app also had network code for loading ads, say through iAds or AdMob, an API scanner wouldn't notice anything unusual for a reviewer to feel concerned about.
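That's the whole trick: at the API level, a hidden proxy is just ordinary socket traffic. As a hedged sketch (not the actual app's code, and in Python rather than Objective-C for brevity), the entire core of a tethering proxy is a byte-copy loop that is indistinguishable, call-for-call, from an ad library streaming banner images:

```python
import socket
import threading

def relay(src: socket.socket, dst: socket.socket) -> None:
    """Copy bytes from src to dst until src closes. This loop is the
    heart of a tethering proxy, yet it uses only the same recv/send
    primitives any legitimate networking code (ads, analytics) uses."""
    while True:
        data = src.recv(4096)
        if not data:
            break
        dst.sendall(data)

# Demo: shuttle a request through the relay using local socket pairs
# standing in for the phone-facing and internet-facing connections.
phone_in, phone_out = socket.socketpair()
net_in, net_out = socket.socketpair()
t = threading.Thread(target=relay, args=(phone_out, net_in), daemon=True)
t.start()
phone_in.sendall(b"GET / HTTP/1.0\r\n\r\n")
phone_in.close()          # EOF ends the relay loop
t.join()
received = net_out.recv(4096)
print(received)           # the request arrives on the far side, unmodified
```

A reviewer watching syscalls or scanning linked symbols sees nothing here that an AdMob integration wouldn't also produce.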
Apple did find an app that exploited an App Store feature whereby you only need to type your iTunes password once per five-minute window -- letting you download multiple apps with minimal hassle. The app used in-app purchases to bill buyers for $200 worth of content without the user's explicit consent.
Apple may not be looking out for you in general, but they are looking out for their brand, which is tied to the consumer experience.
It's quite possible that if the app also had network code for loading ads, say through iAds or AdMob, an API scanner wouldn't notice anything unusual for a reviewer to feel concerned about.
That's precisely the point. They can't catch underhanded code in the app unless they do a full source code audit - and honestly, I wouldn't be the least bit surprised if Apple changed the SDK license to require developers to upload their .xcodeproj files, which would then be built and released directly from Apple's servers.
I'm not sure even a full code-audit is practical. Some big vendors use libraries under licenses that don't allow them to distribute the source to third parties (I don't know how prevalent this is on iPhone, but I know it happens in general), and Apple has shown an aversion to rules that mess with the big guns this way. It would also require a much higher caliber of app testers, because finding malicious code in a large codebase is nontrivial even if the malware writer isn't being particularly clever.
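To see why even an audit is nontrivial, here's a toy illustration (hypothetical names, Python instead of Objective-C): a grep-style "private API scanner" is defeated the moment the forbidden name is assembled at runtime -- the same evasion that `NSSelectorFromString` makes easy on iOS. Apple's real scanner inspects binaries, not source strings, but the principle carries over.

```python
# Toy "private API scanner": greps source text for forbidden identifiers.
FORBIDDEN = {"upload_address_book"}

def scan(source: str) -> bool:
    """Return True if any forbidden identifier appears verbatim."""
    return any(name in source for name in FORBIDDEN)

class Device:
    def upload_address_book(self):
        return "contacts sent"

# The payload builds the method name at runtime, so the full
# identifier never appears verbatim anywhere in the source.
innocent_looking = 'getattr(dev, "upload_" + "address_book")()'

print(scan(innocent_looking))   # False: the scanner sees nothing
dev = Device()
print(eval(innocent_looking))   # "contacts sent": the call still happens
```

A human auditor can spot this toy version instantly, but in a large codebase the same trick spread across modules and indirections is exactly the "nontrivial even if the malware writer isn't being particularly clever" problem.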
This would basically require Apple to severely choke the App Store pipeline, and given Apple's relatively lax attitude toward security, I don't think they'd make that tradeoff.