Apple Responds

By iupdate
Aug 14, 2021

Hey guys, what's up, it's Sam here, back again for another video, and I wanted to do a part two. I didn't think we would be getting more information about Apple's CSAM system so quickly, but I felt it was my responsibility to give you the latest information about it. Let me just start off with the first piece of information: Apple is still fully planning to implement this in iOS 15, watchOS 8, everywhere, later this year. So even after all of the concern, Apple has said they're doing it; they have clearly made this decision. There have been a lot of people who have rallied around them during this time, and there have been a lot of people, somebody like me and some others in the tech space, who have said: yeah, we totally agree with less CSAM, which is child sexual abuse material.

We all fully support less of that. Nobody wants that; it doesn't deserve a place in the world in any way. The method Apple is using and the personal invasiveness of this feature are what's concerning us, and there's a new interview that came out today in TechCrunch that unfortunately confirms some of my worst fears about this.

I also just wanted to, you know, clarify some things, get some things straight, and do the best job I could as somebody covering this to give you the latest information about the system. So the first thing is: yes, this is still happening. Even in the days after Apple's super flubbed announcement, this is still occurring; Apple fully plans to go through with this later in 2021. Rather than breaking down every single part of the TechCrunch interview, because some of it is repetitive of things we already knew, I just read through it, made notes, and wrote down the things that were most relevant to me and, I hope, will be most relevant to you as well.

Matthew Panzarino, who is the editor-in-chief at TechCrunch (sorry, I'm not supposed to be laughing), interviewed Apple's head of privacy. So this is not a low-level exec; this is the lead of the privacy team at Apple globally. His name is Erik Neuenschwander, I believe that's how you pronounce it, and we have a really, really good Q&A. I think Matthew killed it.

The first question he asks is: why now? If Google has been doing this since 2008 and Facebook since around 2013, why is Apple announcing this in the middle of August? Is it in response to things that are happening legally in the EU? Essentially, alluding to: is there some legal pressure on Apple to do this now? Apple says they finally got the technology they needed to balance user privacy, because they're doing the scanning on device and then matching it with this database server side, you know, sort of double encryption, and that they want to protect kids. So they said: that's why now, we figured out the way we wanted to do it on our own terms, and we're very passionate about this. So that's their "why now."
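To make the "scanning on device, matching against a database server side" idea a bit more concrete, here is a heavily simplified, hypothetical sketch of that kind of pipeline. This is not Apple's actual implementation: the real system uses a perceptual hash Apple calls NeuralHash, a blinded database, and cryptographic vouchers, while the hash function, database, and voucher format below are stand-ins I've invented purely for illustration.

```python
import hashlib

# Hypothetical stand-in for the database of known CSAM hashes, which in the
# real system ships with the OS in a form the device itself cannot read.
KNOWN_HASH_DB = {"<hash-of-known-image-1>", "<hash-of-known-image-2>"}  # placeholders

def on_device_hash(photo_bytes: bytes) -> str:
    """Stand-in for the on-device perceptual hash (Apple calls theirs NeuralHash).
    A cryptographic hash is used here only to keep the sketch runnable."""
    return hashlib.sha256(photo_bytes).hexdigest()

def make_safety_voucher(photo_bytes: bytes) -> dict:
    """The device attaches an encrypted 'safety voucher' to each iCloud Photos
    upload; the voucher, not the photo itself, is what gets checked."""
    return {"hash": on_device_hash(photo_bytes), "payload": "<encrypted match data>"}

def server_side_match(voucher: dict) -> bool:
    """Server-side step: does this voucher's hash appear in the known database?"""
    return voucher["hash"] in KNOWN_HASH_DB

if __name__ == "__main__":
    voucher = make_safety_voucher(b"example photo bytes")
    print("matches known database:", server_side_match(voucher))  # False for this example
```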

That actually sounds relatively realistic to me. I believe that's why; maybe there's another reason, but there is no objective evidence of one. The best guess we have is what Apple is telling us straight up as to why: that they had this technology and they felt like it was ready for the world. Next question.

Following that, I'm going to be reading directly from the article, because I want to do an exact read. It says: does creating a framework to allow scanning and matching of on-device content create a framework for outside law enforcement to counter with, "we can give you a list, we don't want to look at all the users' data, but we can give you a list of content we'd like you to match, and if you can match it with this content, you can match it with other content we want to search for"? How does it not undermine Apple's current position of: hey, we can't decrypt the user's device, it's encrypted?

And we don't hold the key. Apple has long said: hey, we can't get into end-to-end encryption. End-to-end encryption is when Apple doesn't know how to break the encryption. There is other data that is simply encrypted and decrypted in a way Apple can get to. So, for example, a device backup of your iPhone synced to Music or iTunes on your computer can be end-to-end encrypted, meaning that no one at all can get into it; there is no backdoor. An iCloud backup has a backdoor: if you are suspected of a crime, or there is substantial evidence that you have done something wrong, the US government can request access to your data, and Apple will hand them the key to your iCloud backup or your iCloud messages.
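To picture the distinction being drawn here, between data only you can unlock and data Apple can also unlock, here is a rough hypothetical sketch. The toy XOR "cipher" and all the names are mine and wildly simplified; real backups use proper encryption. The only point it shows is who holds a usable decryption key in each model.

```python
import os
from dataclasses import dataclass

@dataclass
class Backup:
    ciphertext: bytes
    key_held_by: str  # who is able to decrypt this backup

def toy_encrypt(data: bytes, key: bytes) -> bytes:
    """Toy XOR cipher, for illustration only; real backups use strong encryption."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# End-to-end encrypted local (Finder/iTunes) backup: the key never leaves the user.
user_key = os.urandom(16)
local_backup = Backup(toy_encrypt(b"my messages", user_key), key_held_by="user only")

# iCloud backup: encrypted, but Apple also holds a key it can produce on a lawful request.
escrow_key = os.urandom(16)
icloud_backup = Backup(toy_encrypt(b"my messages", escrow_key), key_held_by="user and Apple")

print(local_backup.key_held_by)   # "user only": no backdoor, nobody else can decrypt
print(icloud_backup.key_held_by)  # "user and Apple": can be handed to law enforcement
```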

If you don't use iCloud at all and you're only using computer syncing, then it is end-to-end encrypted. And just to bring this up before we go into the answer: there was a report that Apple was planning to do end-to-end encryption for iCloud backups and then decided not to after pressure from the Federal Bureau of Investigation. Super disappointing; I hope the report isn't true, but considering it's from Reuters, it pretty much points to being accurate. So Apple was working to make things more secure and, based on this reporting, essentially backed down because a government entity kindly requested that they do so. That has led a lot of people to be skeptical of this move, in which Apple will also be reporting to law enforcement agencies when they find CSAM. So to reiterate, the question was this: if you can get a list of CSAM to match, couldn't the police come to Apple and say, hey, match this certain content? Here's what Apple says: it doesn't change that one iota, the device is still encrypted,

we still don't hold the key (which is partially true), and the system is designed to function on on-device data. What we've designed has a device-side component, and it has a device-side component, by the way, for privacy improvements; the alternative of just processing by going through and trying to evaluate users' data on a server is actually more amenable to changes without user knowledge and less protective of user privacy. This is one of the most concerning lines in the entire article, because it confirms that this is jeopardizing privacy in some way. They're saying, well, yeah, we don't really hold the key, and if we just did it server side... you know, there's been a lot of flak for Apple doing it on device, because that's creepy. I'm paraphrasing here, of course: we're doing it on device because it's more secure, it's more private, and the servers are more amenable to changes; the hashes could change, the database of things to look for

could change. But notice they said "more" amenable, not that their approach is immune; one approach is simply more susceptible to changes than the other. They're saying that this could change: that the hashes Apple is looking for, even though they're loaded into iOS 15.0 and set a certain way right now, could be changed, and your photos could be scanned to look for something else. That is what I am reading here, and while that is not possible right now in the current system (this doesn't do that), what I am hearing, based on my interpretation, is that it is possible a bad actor could change that, or Apple could face pressure in some places to change that. What is Apple's response to that? This is one of the best questions. It says: you know, you have to adhere to laws in places like China.

So what if a government wants you to search for certain hashes in the future? This is exactly what I, and many others, have been alluding to from the beginning. Apple says: well, first, this is launching only for US iCloud accounts, so it's a hypothetical. Apple loves to say this when they're not comfortable answering something: "well, it's just a hypothetical." The hypotheticals seem to bring up generic countries, or other countries that aren't the US, when they speak in that way, and therefore it seems to be the case that people agree US

law doesn't offer these kinds of capabilities to our government. Right now it doesn't, but that could change in the future. This is what people are worried about; this is the central thesis, the heart of this. Right now it's all good, but this is cracking open the door.

This is cracking open the door, and sometimes that is all it takes. They continue; Apple says: and so the hypothetical requires jumping over a lot of hoops, including having Apple change its internal process to refer material that is not illegal, like known CSAM, and we don't believe that there's a basis on which people will be able to make that request in the US. So to be crystal clear, in layman's terms, Apple is saying: we'll just say no to the government. If the government comes and asks us to change the way the system works, we just won't. And Apple has given us absolutely no reason not to believe them right now. But if you look at specific places, if you look at China, Apple has changed the way they operate in that country

because of the government, and that's what has people concerned. They have done it before, when a government has said: hey, we need you to change this thing to work for us, because you need to sell in this market. And while that hasn't happened in the US yet, what scares us is that it is possible, and there's now a way for it to happen more easily than ever before. I just wanted to make this crystal clear, because this is the crux of this video, why I've made multiple videos about it, and why everybody should care. Now moving on to the next part: a very important clarification I want to make crystal clear is that you can opt out.

I remember that was a bit murky in my first video, and I wasn't clear because I didn't know for sure. Apple has now said on the record multiple times that you can completely avoid ever activating the scanning by following one step: not using iCloud Photo Library. That's good for people that want to, you know, take care of their privacy; you won't be connected to Apple's cloud. The confusing part, the mind-boggling part, is that the whole point is to catch predators, and if you can disable it, then won't the people it's meant to catch just turn it off? Do you see what I'm saying? I brought this up on the latest episode of my podcast with Jon Prosser.

We talk all about this; if you guys want to hear it, I'll leave a link down below. While I am happy that you can opt out, number one, that's a huge loophole, and number two, it means you can't use Apple's cloud, and services are becoming a bigger and bigger revenue source for Apple every year. So it's basically like there's no way to opt out, because it's so built into the phone.
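Just to picture the opt-out mechanically, here's a tiny hypothetical sketch of the gate Apple describes: the scan only runs as part of an iCloud Photos upload, so with iCloud Photo Library switched off that code path is never reached. Every name and setting here is invented for illustration; this is not Apple's code.

```python
def hash_and_make_voucher(photo: bytes) -> dict:
    # Stand-in for the on-device hashing step (see the earlier pipeline sketch).
    return {"hash": hash(photo), "payload": "<encrypted match data>"}

def upload_to_icloud_photos(photo: bytes, icloud_photos_enabled: bool) -> bool:
    """Hypothetical upload path: scanning is tied to the upload, not the library."""
    if not icloud_photos_enabled:
        return False  # iCloud Photo Library is off: no upload, so no scan ever runs
    voucher = hash_and_make_voucher(photo)  # the scan happens only on this path
    # ...photo and voucher would be uploaded to iCloud here...
    return True

print(upload_to_icloud_photos(b"photo", icloud_photos_enabled=False))  # False: never scanned
print(upload_to_icloud_photos(b"photo", icloud_photos_enabled=True))   # True: scanned on upload
```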

You know what I'm saying: you get five gigs for free, everybody syncs their photos to the cloud in some way, it's tough. People have criticized Apple, saying, well, the on-device scan is almost a bit weirder than doing it server side. Why don't you just do server-side CSAM detection like Facebook and Google have done for years now, in some cases over 10 years? Apple claims they're raising the bar. I'm going to paraphrase here: that there was an industry standard and they're changing it, that the processing on the server side was putting more data at risk than on device, and that Apple is syncing the hashes to the phone and the phone is doing the scanning.

You know, they claim that the server side is super easy and that this was much more complex and thought through. I don't know; they're doing something very similar, and yes, this is the most private way Apple could do this, but it still is jeopardizing privacy. The final question is about, you know, the threshold, because Apple has mentioned this threshold for detection and that, like, a picture of your naked kid won't be detected, because this only matches things in an existing database.

No new photos are going to be flagged here; it's only if you have known CSAM material. So they've set a threshold. Matthew Panzarino asks, well, isn't one piece of CSAM too much, basically? And Apple says they want to make sure they hit this one-in-one-trillion false positive rate, and they've done that by setting a threshold. They don't say what it is, but it does beg the question of what makes three pieces of CSAM worse than one; it's illegal content, it doesn't deserve a place anywhere. I mean, can you have eight pieces and still not be detected by Apple, even though they're stored on their servers in iCloud? Obviously, Apple doesn't report that number, because then you would know how to game the system. They basically just say: yeah, if it hits the threshold we're going to report you to the National Center for Missing & Exploited Children, and law enforcement will be able to "take it up and effectively investigate, prosecute and convict." That last part was a direct quote.
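To show what a threshold changes compared with flagging on a single match, here is a simplified, hypothetical counter. Apple's real design uses threshold secret sharing, so even Apple cannot see anything about an account until the threshold is crossed; this sketch only captures the "nothing is surfaced until N matches" behavior, and the threshold value below is made up, since Apple hasn't said what the real one is.

```python
ASSUMED_THRESHOLD = 30  # invented for illustration; Apple has not published the real value

def should_report(match_count: int, threshold: int = ASSUMED_THRESHOLD) -> bool:
    """Below the threshold, nothing is surfaced at all; at or above it, the account
    is flagged for human review and then reported to NCMEC."""
    return match_count >= threshold

for count in (1, 8, ASSUMED_THRESHOLD):
    status = "report to NCMEC" if should_report(count) else "nothing visible to Apple"
    print(f"{count} matching images -> {status}")
```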

They say time and time again: we don't want to look through users' photo libraries, we have no interest in doing that, that is not what this is about. This is about matching a pattern of hashes over here against a pattern of hashes over there. A better example is music: there's a fingerprint of a song, and if you use that same fingerprint, even if it's changed a little, YouTube can still detect it because the raw fingerprint is largely the same, and if you use copyrighted music in a video, you can't monetize it. Apple is saying: we don't want to look through your photos, we're not trying to find anything else. A rough sketch of that kind of fingerprint matching is below.
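In miniature, fuzzy fingerprint matching works like this: a perceptual hash is designed so that small changes to the content only flip a few bits, so two hashes are treated as the same item when they are within some small distance of each other. The 64-bit values and the distance cutoff below are invented for illustration; this is not NeuralHash or YouTube's Content ID.

```python
def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two fingerprint values."""
    return bin(a ^ b).count("1")

def is_match(fp_a: int, fp_b: int, max_distance: int = 4) -> bool:
    """Treat slightly different fingerprints as the same underlying content."""
    return hamming_distance(fp_a, fp_b) <= max_distance

original   = 0b1010110011110000101011001111000010101100111100001010110011110000
re_encoded = original ^ 0b101   # same content, lightly altered (2 bits flipped)
unrelated  = 0b0000111100001111000011110000111100001111000011110000111100001111

print(is_match(original, re_encoded))  # True: still recognized as the same item
print(is_match(original, unrelated))   # False: different content entirely
```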

Back to Apple's point: all we're going to do is see if this matches any part of your images, and it's a very specific thing we are going to find. As I would like to reiterate as we close out this second video: that sounds like a great idea, and we all support it, but it is objectively reducing how private your iPhone is. I would love to talk to somebody from Apple about this if they would like to come on my channel or my podcast. So I don't know if you work at Apple, but yeah, that'd be cool. These are just some updates

I had for you guys, so I'll let you know if anything else happens. But right now this is still fully planned, and while Apple has clarified some things, it's broadly a confirmation of things that we feared. Thank you guys for watching, hope you have a good day, and I've got a good, fun video coming tomorrow. See you guys in the next one, peace.


Source: iupdate
