Welcome back to This Week in Apps, the weekly TechCrunch series that recaps the latest mobile OS news, mobile applications, and the overall app economy.
The app industry continues to grow, with a record 218 billion downloads and $143 billion in global consumer spending in 2020. Consumers last year also spent 3.5 trillion minutes using apps on Android devices alone. And in the U.S., app usage surged ahead of the time spent watching live TV. Currently, the average American watches 3.7 hours of live TV per day but now spends four hours per day on their mobile devices.
Apps aren’t just a way to pass idle hours — they’re also a big business. In 2019, mobile-first companies had a combined $544 billion valuation, 6.5x higher than those without a mobile focus. In 2020, investors poured $73 billion in capital into mobile companies — a figure that’s up 27% year-over-year.
This Week in Apps offers a way to keep up with this fast-moving industry in one place, with the latest from the world of apps, including news, updates, startup fundings, mergers and acquisitions, and suggestions about new apps and games to try, too.
Do you want This Week in Apps in your inbox every Saturday? Sign up here: techcrunch.com/newsletters
Apple to scan for CSAM imagery.
Apple announced a significant initiative to scan devices for CSAM imagery. On Thursday, the company announced a new set of features, arriving later this year, that will detect child sexual abuse material (CSAM) in its cloud and report it to law enforcement. Companies like Dropbox, Google, and Microsoft already scan for CSAM in their cloud services, but Apple had allowed users to encrypt their data before it reached iCloud. Apple’s new technology, NeuralHash, instead runs on users’ devices, detecting when a user uploads known CSAM imagery without decrypting the images. It can even detect imagery that has been cropped or edited in an attempt to avoid detection.
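NeuralHash itself is a proprietary neural perceptual hash, but the general idea — fingerprinting an image so that near-duplicates produce near-identical hashes — can be sketched with a toy "average hash." The code below is an illustration only, not Apple's algorithm: each bit records whether a pixel is brighter than the image's mean, so a lightly edited copy still lands within a small Hamming distance of the original fingerprint.

```python
# Illustrative sketch only: Apple's NeuralHash is a proprietary neural
# perceptual hash. This toy "average hash" just demonstrates matching
# images by fingerprint rather than by exact bytes.

def average_hash(pixels):
    """Hash a grayscale image (rows of 0-255 ints) into a bit string.

    Each bit records whether a pixel is brighter than the image's mean,
    so small edits (re-encodes, slight brightness changes) tend to flip
    only a few bits, unlike a cryptographic hash.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

def hamming_distance(h1, h2):
    """Count differing bits between two equal-length hashes."""
    return sum(a != b for a, b in zip(h1, h2))

# A known image, a lightly edited copy, and an unrelated image.
known = [[10, 200], [200, 10]]
edited = [[12, 200], [200, 10]]     # one pixel brightened slightly
unrelated = [[200, 10], [10, 200]]

h_known = average_hash(known)

# The edited copy still matches exactly; the unrelated image does not.
print(hamming_distance(h_known, average_hash(edited)))     # → 0 (match)
print(hamming_distance(h_known, average_hash(unrelated)))  # → 4 (no match)
```

In a real deployment, the device would compare hashes against a database of fingerprints of known CSAM (supplied by NCMEC) and flag only uploads whose distance falls under a threshold, which is how matching can happen without anyone viewing or decrypting the photos themselves.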
Meanwhile, on iPhone and iPad, the company will roll out protections in the Messages app to filter images and alert children and parents if sexually explicit photos are sent to or from a child’s account. Children will not be shown the photos but will instead see a grayed-out image. If they try to view the image anyway through the link, they’ll be shown interruptive screens that explain why the material may be harmful and warn that their parents will be notified.
Some privacy advocates pushed back on the idea of such a system, believing it could expand to end-to-end encrypted photos, lead to false positives, or set the stage for more on-device government surveillance in the future. But many cryptography experts believe the system Apple developed strikes a good balance between privacy and utility, and have offered their endorsement of the technology. In addition, Apple said reports are manually reviewed before being sent to the National Center for Missing and Exploited Children (NCMEC).
The changes may also benefit iOS developers who deal in user photos and uploads, as predators may stop storing CSAM imagery on iOS devices in the first place, given the new risk of detection.