Apple is getting set to release its latest iPhone. The iPhone 12's successor will include three major new camera and video recording features. This comes after Apple announced new, controversial protections against explicit images in its Messages app and iCloud photo libraries. Mark Gurman is here with more. So look, Mark, let's start with the new camera and video recording features. What do we know? So, we know there are three main new features coming.
The first is called Cinematic mode. If you're familiar with Portrait mode, which has existed on the iPhone since the 7 Plus five years ago, you can take a picture and the object or subject in the foreground will be sharp while the background is blurred; this is known in the photography world as the bokeh effect. Now they're going to be bringing this to video. It's a common technique used in high-end video shot on high-end DSLRs, and it will be coming to the iPhone for the first time when the new models are announced just a few weeks from now.
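For illustration only, here is a rough sketch of how a bokeh-style effect can be composited once you already have a subject mask: blur the whole frame, then paste the sharp subject back over it. This is not Apple's implementation; the function name, mask input, and blur size are assumptions made for the example.

```python
import cv2
import numpy as np

def fake_bokeh(frame: np.ndarray, subject_mask: np.ndarray, blur_ksize: int = 31) -> np.ndarray:
    """Blur the background and keep the subject sharp (illustrative only).

    frame        -- BGR image, shape (H, W, 3)
    subject_mask -- float mask in [0, 1], shape (H, W); 1 = subject, 0 = background
    blur_ksize   -- Gaussian kernel size (odd); larger = stronger "bokeh"
    """
    # Blur the entire frame; this stands in for the out-of-focus background.
    blurred = cv2.GaussianBlur(frame, (blur_ksize, blur_ksize), 0)

    # Feather the mask so the subject/background transition is soft.
    mask = cv2.GaussianBlur(subject_mask.astype(np.float32), (15, 15), 0)[..., None]

    # Composite: sharp pixels where the mask is 1, blurred pixels where it is 0.
    out = mask * frame.astype(np.float32) + (1.0 - mask) * blurred.astype(np.float32)
    return out.astype(np.uint8)
```

In a video feature like the one described, the subject mask would have to come from depth data and subject tracking and change every frame; the sketch only shows the final compositing step.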
The second feature is recording in what is known as ProRes, a much higher-fidelity, higher-quality format used in the video industry. It makes editing a lot more capable and interesting for video editors. The third feature is a new AI-based photo filter system that can adjust colors, highlights, and shadows on individual elements of a photo, whether that's an object or a person, so it's quite a bit more capable than your standard Instagram-like filter system.
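To give a sense of what "per-element" editing means, here is a purely hypothetical sketch, not Apple's system, of applying a tone adjustment only inside a segmentation mask for one object or person. The mask input, weights, and parameter names are all assumptions for the example.

```python
import numpy as np

def adjust_element(image: np.ndarray, element_mask: np.ndarray,
                   shadow_lift: float = 0.15, highlight_gain: float = 1.0) -> np.ndarray:
    """Adjust shadows/highlights only where element_mask is set (illustrative only).

    image          -- float RGB image in [0, 1], shape (H, W, 3)
    element_mask   -- float mask in [0, 1], shape (H, W); 1 = the object/person to edit
    shadow_lift    -- how much to brighten dark regions inside the mask
    highlight_gain -- multiplier for bright regions inside the mask
    """
    luminance = image.mean(axis=-1, keepdims=True)            # crude brightness estimate
    shadows = np.clip(1.0 - luminance * 2.0, 0.0, 1.0)        # weight ~1 in dark areas
    highlights = np.clip(luminance * 2.0 - 1.0, 0.0, 1.0)     # weight ~1 in bright areas

    adjusted = image + shadow_lift * shadows                  # lift shadows
    adjusted = adjusted * (1.0 + (highlight_gain - 1.0) * highlights)  # scale highlights

    mask = element_mask[..., None]
    # Blend: edited pixels inside the element, untouched pixels everywhere else.
    return np.clip(mask * adjusted + (1.0 - mask) * image, 0.0, 1.0)
```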
So let's talk a little about the controversy surrounding Apple's new software targeted at these explicit images. There seems to be continuing concern about it; a lot of people don't understand it, they're just reading the headlines and don't understand how it's actually going to work. What's your take on the ensuing reaction that we've seen here? Yeah, I think the reaction is somewhat fair. I think there's some misunderstanding out there about how the functionality works. Apple is not actually viewing the images in your iCloud photo library.
What they're doing is scanning, or analyzing, the whole library and assigning what are known as hash keys, using a cryptographic process, to each image, and those hash keys correspond to your individual images. Then there's a database operated by NCMEC, the National Center for Missing & Exploited Children, an organization that fights child exploitation, child abuse, and child pornography. They have a database of videos and photos they've collected from law enforcement agencies over the years, and what Apple is doing is assigning hash keys to the images in that database as well. So they're scanning your library and comparing the hash keys in your library to the hash keys in that database to see if there are matches. If there is a certain number of matches in your photo library, and unfortunately they're not telling us how many matches that is, they will then have human reviewers review those images, and if there is indeed a confirmed match, it will notify NCMEC and then law enforcement. What Apple should be doing is providing a better explanation as to what's in this database, who manages the database, and how images get into that database.
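To make the mechanism described here a little more concrete, below is a highly simplified sketch of threshold-based hash matching. It is not Apple's implementation: the real design uses a perceptual hash (Apple calls its version NeuralHash) and cryptographic private-set-intersection machinery so the comparison happens on-device and nothing is revealed below the threshold. The function names and threshold value are assumptions for illustration.

```python
import hashlib
from pathlib import Path

# Hypothetical threshold: Apple has not disclosed how many matches trigger review.
MATCH_THRESHOLD = 10

def image_hash(path: Path) -> str:
    # Illustrative only: a real system uses a perceptual hash so that visually
    # identical images still match after resizing or recompression; a plain
    # SHA-256 of the file bytes, used here for simplicity, would not.
    return hashlib.sha256(path.read_bytes()).hexdigest()

def count_matches(library_dir: Path, known_hashes: set[str]) -> int:
    """Count how many images in the library have hashes in the known database."""
    return sum(1 for p in library_dir.glob("**/*.jpg") if image_hash(p) in known_hashes)

def should_escalate(library_dir: Path, known_hashes: set[str]) -> bool:
    """True if the library crosses the (undisclosed, here hypothetical) match
    threshold, at which point human reviewers would look at the flagged images."""
    return count_matches(library_dir, known_hashes) >= MATCH_THRESHOLD
```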
There's a lot of concern about that database question. I'm not going to sit here and defend Apple's efforts against child exploitation and those explicit images; I don't think Apple is doing anything wrong in wanting to prevent those images. What I will say is that it strikes some people as slightly hypocritical: Apple refused to break into iPhones belonging to those behind terrorist attacks, such as with San Bernardino, yet it is willing to do this against child pornography in photos.
Obviously both are bad, but some people are arguing that Apple should be working with law enforcement on both subjects or on neither. Certainly much to debate there. Meantime, you've also got a story out about how Google is limiting ad targeting for teenagers. This is something that Facebook's Instagram did a while ago. What can you tell us about this? Yeah, so we're seeing a lot of these companies, Apple, obviously, Facebook with Instagram, like you said, and now Google, limiting ad targeting in some form. Earlier this year, with the launch of App Tracking Transparency, or ATT, Apple limited ad targeting and ad tracking across websites and apps, across the entire iOS ecosystem, regardless of age. Instagram is doing it for users 18 and under in relation to certain demographics and interests. And now Google is doing that as well across all of their services, with a particular focus on YouTube, and there are a number of other new protocols they're putting in place to protect kids. They are stopping autoplay.
So if you have kids watching YouTube, videos won't continue to play in succession and keep kids up all night watching video.
Source: Bloomberg Technology