chikere232 3 days ago

> You are Apple. You want to make search work like magic in the Photos app, so the user can find all their “dog” pictures with ease.

What if you're a user who doesn't care about searching for "dog" in your own photos? You might not even use the Photos app, yet Apple still scans all your photos and sends data off device without asking you.

Perhaps this complicated dance works, perhaps they have made no mistakes, perhaps no one hacks or coerces the relay host providers... they could still have just asked for consent the first time you open Photos (if you ever do) before starting the scan.

  • zombot 3 days ago

    Exactly, I don't want my shit sent all across the internet without my explicit prior consent, period. No amount of explanation can erase Apple's fuck-up.

    • GeekyBear 3 days ago

      Apple does photo recognition on your device.

      Google, on the other hand, uploads photos to their server and does the analysis there.

      There is the infamous case of the parents who Google tried to have arrested after they used their Android device to seek medical assistance for their child during lockdown. Their doctor asked them to send images of the problem, and Google called the police and reported the parents for kiddie porn.

      > “I knew that these companies were watching and that privacy is not what we would hope it to be,” Mark said. “But I haven’t done anything wrong.”

      The police agreed. Google did not.

      https://www.nytimes.com/2022/08/21/technology/google-surveil...

      Google refused to return access to his account even after the police cleared him of wrongdoing.

      • jchw 3 days ago

        Google's reputation with privacy advocates is absolutely horrible, but that shouldn't have anything to do with Apple's practices. Comparing Apple and Google will indeed tell you a lot of interesting things, but that's not what this is about.

      • derefr 3 days ago

        Kind of feels like it should be a crime for a private party to write contract terms that impose punishments justified in terms of law, where that justification rests on an interpretation of the law that has already been legally proven, to the author of the contract, to be a misinterpretation of said law.

        It's sort of the crime of "contempt of court", but after the fact: receiving a judge's prescription about how you must interpret a law during a case, but then going right back to using a different interpretation when you leave court.

      • throw10920 3 days ago

        > Google refused to return access to his account even after the police cleared him of wrongdoing.

        This is why I constantly work to help people reduce their dependence on Google. Screw that. If anyone ever tells you that they rely on Google for anything, show them this article.

        • brokenmachine 2 days ago

          I don't want to set up my own email server.

          But I definitely live in fear of Google fucking up and disabling my account.

      • diggan 2 days ago

        > Apple does photo recognition on your device.

        > Google, on the other hand, uploads photos to their server and does the analysis there.

        The comment you're replying to (and the whole sub-thread, in fact) isn't about whether the way Apple is doing it is the best or worst way, but rather that they don't ask for permission before they do it. Regardless of how they technically do it, the fact that they don't ask beforehand is what is being argued about here.

        • x0xrx 7 hours ago

          It seems strange to demand they ask “permission” in this instance but not, e.g., to let you sort your photos by date, or album, or location.

          (I agree this is the point in contention, I just don’t understand it).

      • lern_too_spel 3 days ago

        Google doesn't send your pictures to their servers without your explicit consent. This is exactly what users expect. On Android, you can use your own self-hosted photos server and have it work exactly the same way Google Photos does. Google Photos does not have access to private Google-only APIs like Apple Photos has on iOS.

        • GeekyBear 3 days ago

          > Google doesn't send your pictures to their servers without your explicit consent.

          The parents Google tried to get arrested in the story above do not agree.

          > When Mark’s and Cassio’s photos were automatically uploaded from their phones to Google’s servers, this technology flagged them. Jon Callas of the E.F.F. called the scanning intrusive, saying a family photo album on someone’s personal device should be a “private sphere.” (A Google spokeswoman said the company scans only when an “affirmative action” is taken by a user; that includes when the user’s phone backs up photos to the company’s cloud.)

          Google not only automatically uploaded their images to their server, it analyzed those images and reported the users to the police for kiddie porn based on a single false positive.

          • brokenmachine 2 days ago

            When I first ran Google Photos on my Android phone, it asked me if I want to enable automatic backup to Google. There were definitely some dark patterns there, but it was easy and obvious how to opt out.

            If you care about not sending photos to Google, it's pretty obvious how to not have that happen.

            IMO, Google is not the bad guy here, although when it was explained to them that the photos were legitimate, they should definitely have reenabled the account.

            I'm OK with Google scanning photos that I send to them that will be stored on their servers. Honestly, how can they not?

          • johnisgood 3 days ago

            > user’s phone backs up photos to the company’s cloud.

            I never enable cloud backups, because it means my shit is sent somewhere.

            • GeekyBear 3 days ago

              You don't have to enable it, since Google backs up your photos to their servers by default.

              Then they proceed to claim those automatic backups are an "affirmative action" that justifies them scanning the contents of your images as well.

              • fauigerzigerk 2 days ago

                >You don't have to enable it, since Google backs up your photos to their servers by default.

                When setting up a new phone (and many times thereafter) they prompt you to enable photos backup. It's not on by default if I remember correctly.

                • blincoln 17 hours ago

                  If you don't enable backup, Google Photos randomly reprompts on a regular basis with a "sure would be a shame if something happened to your photos!" modal. It's very easy to accidentally turn it on without noticing when this happens, if one has preemptively clicked where they expect a different UI element to be.

                • johnisgood 2 days ago

                  It is not enabled by default, if that is to be trusted. In fact, I am not even logged in to Google.

              • brokenmachine 2 days ago

                It's not on by default. It asks when you first add your account to Android.

              • lern_too_spel 2 days ago

                > Google backs up your photos to their servers by default.

                You keep saying that, but it remains false. The parents explicitly opted in to sending their photos to Google.

        • colanderman 3 days ago

          Google Photos "consent" is one of the worst dark patterns I regularly encounter.

          About weekly it prompts me with a huge popup whether I want to continue without backup, with "enable backup" selected by default. If I deselect this I'm prompted with another popup asking me to back up specific selected photos. If I misclick either of these (which is easy, since they pop up after briefly showing my photos which I'm actively trying to tap on), then Google will start hoovering up all my photos without confirmation.

          Their "consent" form is user-hostile and it's disingenuous to hold it as an example of Google protecting privacy.

          Pro tip: install Google Gallery which (ironically) is effectively a de-Googled Photos. Unfortunately it's also stripped down in other ways but it suffices for simply viewing photos on your own device.

    • Klonoar 3 days ago

      They are not sending your actual photo, as has been covered at length on numerous threads on this very site.

      • gigel82 3 days ago

        That's irrelevant if the information they do send is sufficient to deduce "Eiffel Tower" or "dog" from it: that's too much information to send.

        • GeekyBear 3 days ago

          They don't have to send anything since they do all the image recognition on the user's own device.

          Sending everything to a server is, however, how Google's service works.

          • gigel82 3 days ago

            No, they don't; the whole reason for homomorphic encryption is that stuff gets sent off your device.

            You don't need any encryption to process locally.

            • GeekyBear 3 days ago

              The erroneous claim was that Apple handles image recognition and image search on servers, the way Google does.

              Apple handles those tasks on the user's own device(s) for privacy reasons.

    • api 3 days ago

      Not wrong, but it’s interesting that Apple gets so much flak for this when Google and Microsoft don’t even try. If anything they try to invade privacy as much as possible.

      Of course maybe that question has its own answer. Apple markets itself as the last personal computing company where you are the customer not the product so they are held to a higher standard.

      What they should do is do the processing locally while the phone is plugged in, and just tell people they need a better phone for that feature if it’s too slow. Or do it on their Mac, if they own one, while that is plugged in.

      • j2kun 3 days ago

        FWIW, I work on homomorphic encryption at Google, and Google has all kinds of other (non-FHE) privacy enhancing tech, such as differential privacy, federated learning, and https://github.com/google/private-join-and-compute which are deployed at scale.

        Perhaps it's not as visible because Google hasn't enabled most of these by default? Or because a lot of it is B2B and Google-internal (e.g., a differential-privacy layer on top of SQL for internal metrics).

        [edit]: this link was a very vague press release that doesn't say exactly how Google uses it: https://security.googleblog.com/2019/06/helping-organization...

        • keeganpoppen 3 days ago

          uhhh yeah it's not visible because it's not used for anything. because it runs contrary to Google's entire raison d'être. if it's not turned on by default, what is even the point of doing it at all other than to pacify engineers who are perfectly happy to miss the forest for the trees? it's kind of like saying that you have the power of invisibility, but it only works if no one is looking at you.

      • BlackFly 3 days ago

        Well, when you are building a feature that can only be appreciated by a subculture of people (privacy advocates), and they complain about the most basic faux pas you could commit in their culture (not asking them before you phone home with data derived from their data)... you have invited these people to criticise you.

        Most people I know wouldn't care about such a feature beyond a breathless sort of "Wow, Apple tech!" So they are building something intended to win over privacy-conscious people; kudos to them, everyone stands to benefit. But the most basic custom in that subculture is consent. So they built something really great and then clumsily failed on the easiest detail, because it is so meaningless to everyone except the target audience. That audience doesn't bother criticising Google or Microsoft (again) because it goes without saying that those companies are terrible; it doesn't need to be said again.

        • ylk 3 days ago

          > a feature that can only be appreciated by a subculture of people (privacy advocates)

          Just because it can’t be “appreciated” by all users doesn’t mean it’s only “for” a small sub-group.

          It seems to me they’re just trying to minimise the data they have access to — similar to private cloud compute — while keeping up with the features competitors provide in a less privacy-respecting way. Them not asking for permission makes it even more obvious to me that it’s not built for any small super privacy-conscious group of people but the vast majority of their customers instead.

        • gigel82 3 days ago

          "not asking them before you phone home with data" is a basic faux pas for privacy advocates? LOL; that's a fundamental breach of trust of the highest degree, not basic by any means.

          • Dylan16807 3 days ago

            Are you under the impression that "basic" and "fundamental" are not synonyms?

      • lapcat 3 days ago

        > just tell people they need a better phone for that feature if it’s too slow. Or do it on their Mac if they own one while that is plugged in.

        The issue isn't slowness. Uploading photo library data/metadata is likely always slower than on-device processing. Apparently the issue in this case is that the world locations database is too large to be stored locally.

        • phkahler 3 days ago

          >> Apparently the issue in this case is that the world locations database is too large to be stored locally.

          What kind of capacity can ROM chips have these days? And at what cost?

      • yard2010 3 days ago

        In other words: don't hate the player, hate the game. But the point still stands.

        • drawkward 3 days ago

          The game, unlike Apple's policy, is opt-in. Hate the player and the game.

      • okamiueru 3 days ago

        Whataboutism isn't all that great, you know. Google and MS also get flak, and they also deserve it.

        But now that we're talking about these differences, I'd say that Apple users are notoriously complacent and defend Apple and their practices. So perhaps, in some part, it is an attempt to compensate for that? I'm still surprised how we've now accepted that Apple receives information pretty much every time we run a process (or rather, if it ran more than 12 hours ago, or has been changed).

      • victorbjorklund 3 days ago

        You can always find someone worse. That does not mean we should not criticise people/organizations.

        You think Trump is bad? Well, Putin is worse. You think Putin is bad? Kim Jong Un is worse.

        • beretguy 3 days ago

          And who's worse than kim?

          • camjw 3 days ago

            Keir Starmer, if you ask Elon

    • butlike 3 days ago

      Doesn't Photos.app on iOS sync with iCloud OOTB?

      • louthy 3 days ago

        Optionally, yes

        • jimt1234 3 days ago

          And it nags the hell outta you if you opt out.

  • prophesi 2 days ago

    A quick shoutout to Ente Photos[0]. FOSS with an opt-in locally-run semantic search of your photos. The first encoding with a ton of photos may take a few minutes in the background, but after that it takes no time with subsequent photo uploads. I'm not sure why Apple is going through the trouble of uploading the photos and incorporating homomorphic encryption for something this simple, particularly with their push for local AI and their Neural Engine[2].

    I also appreciate Ente's Guest View[1] that lets you select the photos you want to show someone in person on your phone to prevent the issue of them scrolling too far.

    [0] https://github.com/ente-io/ente

    [1] https://ente.io/blog/guest-view/

    [2] https://en.wikipedia.org/wiki/Neural_Engine

  • tempworkac 3 days ago

    It doesn't really matter if they ask you or not, ultimately you have to trust them, and if you don't trust Apple, why would you even use an iPhone?

    • lapcat 3 days ago

      Trust is never all or nothing. I trust Apple to an extent, but trust needs to be earned and maintained. I trust my mom, but if she suggested installing video cameras in my home for my "safety", or worse, she secretly installed video cameras in my home, then she would lose my trust.

      Likewise, you need to trust your spouse or significant other, but if there are obvious signs of cheating, then you need to be suspicious.

      An essential part of trust is not overstepping boundaries. In this case, I believe that Apple did overstep. If someone demands that you trust them blindly and unconditionally, that's actually a sign you shouldn't trust them.

      • sbuk 3 days ago

        > If someone demands that you trust them blindly and unconditionally, that's actually a sign you shouldn't trust them.

        That's certainly a take, which you're clearly entitled to take. I don't disagree with the point that you make; this ought to have been opt in.

        What you should do now is acknowledge this in your original post and then explain why they should have been more careful about how they released this feature. Homomorphic encryption of the data reframes what you wrote somewhat. Even though data is being sent back, Apple never knows what the data is.

        • lapcat 3 days ago

          > What you should do now is acknowledge this in your original post and then explain why they should have been more careful about how they released this feature. Homomorphic encryption of the data reframes what you wrote somewhat.

          Do you mean my original blog post? The one that not only mentions homomorphic encryption but also links to Apple's own blog post about it? I don't know how that can "reframe" what I wrote when it already framed it.

          • sbuk 3 days ago

            I apologise, I didn't fully read your original article, as I find that your writing is prone to exaggeration. I've reread it a few times now and I stand by what I said. You mention homomorphic encryption only in a quoted piece of text and a link. You utterly fail to explain what it is. You didn't frame it at all. You hand-waved at it. I don't disagree with you on the point about this being opt-in, but your blog post is a massive overreaction, heavy on prose and opinion, but light on any tangible facts.

            • lapcat 3 days ago

              > I apologise

              Wow, that's some apology. Everything after those words is an insult.

    • razemio 3 days ago

      How can you trust any mainstream "working" iPhone or Android device? You already mentioned open-source Android distros. You mean those where no banking or streaming app works, because you have to use a replacement for gapps and the root / unlocked bootloader prevents any form of DRM? That is not really an option for most people. I would love to have a Linux phone, even with a terrible user experience, as long as I do not lose touch with society. That however seems to be an impossible task.

      • warkdarrior 3 days ago

        You don't trust Apple's and Google's mobile phones. And some bank doesn't trust open source android distros on mobile phones. Those are both fine positions. You are free to move to another bank, just like the bank is free to not accept you as a customer.

      • tempworkac 3 days ago

        I'm curious what functions, other than maybe depositing a check, require a banking app?

        • tredre3 2 days ago

          When I'm in Canada I often transfer money (Interac e-transfers). I always use the website, even on mobile, but the website has some arbitrary limits that the app does not. For example, I can only transfer $1,000 at a time, while the app allows $10,000. There's also a limit on recipients per day.

          My charitable interpretation is that the app allows a greater verification process so the bank trusts it more and it's "to protect me, the user". But then, the website lets me transfer $100,000 using a multitude of other methods if I want (wire, e-check, create carrier check), so... yeah.

        • bitdivision 3 days ago

          Depends where you live. In the US, probably not much, but in other countries where transfers are ubiquitous, being unable to use a banking app could be a real problem.

          • tempworkac 3 days ago

            are there really countries where the bank doesn't have a website you can use to do a transfer, but you could do it through an app?

            • bitdivision 3 days ago

              I don't know, though certainly the experience is a lot simpler without the 15 minute timeout, painful login, and extra security checks I see on web banking.

              Edit: Not to mention that many of the newer banks don't even have web banking. It's app only. Of course, it's your choice to open an account there though.

            • solarkraft 3 days ago

              In Germany, and I think the whole EU, two-factor authentication is mandatory, for which the favored implementation is an app. SMS TAN is out; the alternative is a secondary device you stick your card into.

              • Dylan16807 3 days ago

                Do you need a proprietary app for that? TOTP is fine, you can just pick your own.
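
                For what it's worth, "software TOTP" is just RFC 6238, which any authenticator can implement; a minimal sketch with the pyotp package (the seed here is made up, and in practice the bank would have to be willing to hand you one):

                    import pyotp

                    secret = pyotp.random_base32()   # in practice, issued by the bank/service
                    totp = pyotp.TOTP(secret)        # 6 digits, 30-second time steps by default

                    code = totp.now()                # the current one-time code
                    print(code, totp.verify(code))   # verify() checks it against the current window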

                • TeMPOraL 3 days ago

                  I haven't seen a bank offering software TOTP in Poland. Over a decade ago, before smartphones became ubiquitous, I saw a bank offering a physical TOTP device. These days, as far as I've seen, it's either SMS codes, single-use codes on physical scratch cards (haven't seen one in 5 years, though), or in-app confirmation.

                • solarkraft 3 days ago

                  Yes and they tend to be shoddily programmed security theater. My bank makes me use SecureGo plus, which goes as far as redirecting you to a website telling you screenshots aren’t allowed when you try to document its latest glitch (which may be another misguided “security” feature, who knows).

            • razemio 2 days ago

              In Germany you can use the website, BUT you still need the app for 2FA. SMS is no longer an option for most banks because it is considered insecure. The same goes for TOTP, since it can easily be replicated if you have access to the device generating the codes.

            • sbuk 3 days ago

              No, but there are bank accounts that are app only. Monzo in the UK is a popular example.

        • Eavolution 2 days ago

          Bank transfers and online purchases (most banks require 3DS now and usually won't let you buy things online without the app on a phone). Some banks don't have a web interface, and others that do require you to approve the login from the app.

    • chikere232 3 days ago

      As they didn't ask, I will trust them less

      • tempworkac 3 days ago

        Why use a device by someone you don't trust? I honestly don't get it. I'd use an open-source Android distro.

        • chikere232 3 days ago

          It doesn't have to be binary. I have some trust for apple. They've earned it in various ways by caring for privacy.

          When they start opting me into photo scanning I lose a bit of trust. The homomorphic encryption makes it less bad. The relative quiet around the rollout of the feature makes it worse. Apple's past attempt to start client side scanning makes it worse. Etc...

          The net result is I trust them a bit less. Perhaps not enough to set my apple devices on fire yet, but a bit.

        • drawkward 3 days ago

          I am merely a data scientist, so don't really know a ton about mainline programming beyond a few intro CS courses.

          Why would an open source android distro be more trustworthy?

          • subjectsigma 3 days ago

            Here is my simplified take on it which will likely get me flamed.

            Trust has many meanings but for this discussion we’ll consider privacy and security. As in, I trust my phone to not do something malicious as a result of outside influence, and I trust it to not leak data that I don’t want other people to know.

            Open source software is not inherently more secure nor more private. However it can sometimes be more secure (because more people are helping find bugs, because that specific project prioritizes security, etc.) and is usually more private. Why? Because it (usually) isn’t controlled by a single central entity, which means there is (usually) no incentive to collect user data.

            In reality it’s all kind of a mess and means nothing. There’s tons of bugs in open source software, and projects like Audacity prove they sometimes violate user privacy. HN-type people consider open source software more secure and private because you can view the source code yourself, but I guarantee you they have not personally reviewed the source of all the software they use.

            If you want to use an open-source Android distro I think you would learn a lot. You don’t need to have a CS degree. However unless you made massive lifestyle changes in addition to changing your phone, I’m not confident it would meaningfully make you more secure or private.

            • drawkward 3 days ago

              It was a bit of a strawman question anyway; as someone who could review the source myself but won't (because the pain-to-utility threshold is way too high), I am then required to place my trust in some ad-hoc entity (the open-source community) that doesn't actually have a financial disincentive to make sure things aren't bad.

              I have other reasons, perhaps, to prefer open source stuff, but I am not ready to assume it is inherently more private or secure.

              • subjectsigma 3 days ago

                Sorry, I lost some context in the thread or something because I thought you were asking as someone who legitimately didn’t know what open source was. Which I thought was kind of weird for HN but didn’t put two and two together.

        • internetter 3 days ago

          To your point, you can’t even trust the software if the hardware is untrusted

  • voidUpdate 3 days ago

    Android does this too. I don't really want all my photos indexed like that, I just want a linear timeline of photos, but I can't turn off their "memories" thing or all the analysis they do to them.

    • lucideer 3 days ago

      Android doesn't do this. Everything is opt-in.

      Granted, they require you to opt in for the photos app to be usable & if you go out of your way to avoid opting in they make your photo-browsing life miserable with prompts & notifications. But you do still have to opt in.

      • alex7734 3 days ago

        Google loves doing this.

        If you dare turn off Play Protect for example, you will be asked to turn it on every time you install or update anything. Never mind that you said no the last thousand times it asked.

        • diggan 2 days ago

          > Google loves doing this.

          Tech companies love doing this. Apple does the same, so does Microsoft.

          If you know some choice isn't right for you (now or forever), and the company is feeling extra beautiful today and you're in luck, you'll get a "Do this now, or I'll remind you later" choice. But then sometimes they just decide that "This is how things are now".

          I've had this happen in every environment except Linux, where I get to shoot myself in the foot whenever I want, and sometimes a bit more.

      • Enginerrrd 3 days ago

        It says it's "opt in", but as someone who hasn't opted in, I still get the notifications and I can see a split-second preview of all the stuff they're not supposed to have computed before it asks me to opt in. So there's DEFINITELY shenanigans occurring.

      • nine_k 3 days ago

        A number of good third-party photo-browsing apps make it non-miserable, even if you never open Google Photos or even uninstall it.

        • lucideer 3 days ago

          I've seen a lot of people saying this generally but no specific recommendations.

          I've used Simple Gallery Pro before but it's not very good.

          Currently using Immich but that's not really a general photo app - it's got a narrow use case - so I still use the Google Photos app alongside it quite often.

          Specific alternative recommendations that aren't malware welcome.

          • dvngnt_ 3 days ago

            > I've used Simple Gallery Pro before but it's not very good.

            It's rock solid for me. You can browse folders, move, copy, hide, and make small edits. You can't search 'dog', which is a plus, and it doesn't scan faces.

          • marmight 3 days ago

            It depends which features you need, but interestingly Google has another, lighter weight gallery app called Google Gallery that does not have any cloud features built in.

          • nine_k 3 days ago

            Simple Gallery Pro is what I use, and it seems fine to me. What do you think should be added to it, or altered? Just curious how other people see UX.

          • pertique 3 days ago

            I can't personally vouch for it as I'm still stuck in Google Photos and would prefer to self-host it, but Ente may interest you. Open source, end-to-end encrypted, self-host or cloud.

            • lucideer 3 days ago

              I'm really happy with Immich & not looking for a replacement. Evaluated it vs Ente in the past & went with it instead - as far as I could tell their apps have the same features & limitations (focus on remote backup & display rather than on local on-device photo management & basic markup/editing).

              If (like me) you don't need e2e I can highly recommend Immich for its use-case though.

          • ckae 3 days ago

            Fossify Gallery (on Fdroid or Google store) works quite nicely for me as a nice and simple photo viewer and management app.

        • Ghoelian 3 days ago

          > or even uninstall it

          Unfortunately Google's camera app will only open Google Photos if you click the image preview after taking one. It just doesn't respect the default gallery app setting at all.

    • y04nn 3 days ago

      I don't think Android does that. It's only Google Photos, and only if you upload them to the cloud; if you don't sync/upload them, you can't search them with specific terms.

    • AshamedCaptain 3 days ago

      Samsung at least does this "dog" cataloguing & search entirely on-device, as can be trivially checked by disabling all network connectivity and taking a picture. It may ping home for several other reasons, though.

      • llm_nerd 3 days ago

        Apple also does the vast majority of photo categorization on device, and has for years over multiple major releases. Foods, drinks, many types of animals including specific breeds, OCRing all text on the image even when massively distorted, etc.

        This feature is some new "landmark" detection, and it feels like a trial balloon or something, as it simply makes zero sense unless what they are categorizing as landmarks is enormous. The example is always the Eiffel Tower, but the data to identify most of the world's major landmarks is small relative to what the device can already detect. Not to mention that such lookups don't even need photo identification: they could instead (and actually already do, and long have) use simple location data and nearby POIs for such metadata tagging.

        The landmarks thing is the beginning, but I feel like they want it to be much more detailed. Like every piece of art, model of car, etc, including as they change with new releases, etc.

      • TeMPOraL 3 days ago

        Does or doesn't. You can't really tell if and when it does any cataloguing; the best I've managed to observe is that you can increase the chances of it happening by keeping your phone plugged in to a charger for extended periods of time.

        That's the problem with all these implementations: no feedback of any kind. No list of recognized tags. No information about what is or is to be processed. No nothing. Just magic that doesn't work.

        • reaperman 3 days ago

          With embeddings, there might not be tags to display. Instead of labeling the photo with a tag of “dog”, it might just check whether the embedding of each photo is within some vector distance of the embedding of your search text.
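
          A minimal sketch of that idea, using plain NumPy with made-up vectors standing in for whatever embedding model the gallery actually uses (the dimension and threshold below are illustrative only):

              import numpy as np

              def cosine_distance(a, b):
                  # 1 - cosine similarity; smaller means "closer in meaning"
                  return 1.0 - np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

              rng = np.random.default_rng(0)
              photo_embeddings = {f"IMG_{i:04d}.jpg": rng.normal(size=512) for i in range(1000)}
              query_embedding = rng.normal(size=512)  # pretend this encodes the text "dog"

              THRESHOLD = 0.9  # arbitrary cutoff for this toy example
              matches = [name for name, emb in photo_embeddings.items()
                         if cosine_distance(query_embedding, emb) < THRESHOLD]
              print(f"{len(matches)} photos fall within the distance threshold")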

          • TeMPOraL 3 days ago

            Yes and no. Embeddings can be used in both directions - if you can find images closest to some entries in a search text, you can also identify tokens or phrases closest in space to any image or cluster of images, and output that. It's a problem long solved in many different ways, including but not limited to e.g.:

            https://github.com/pythongosssss/ComfyUI-WD14-Tagger

            which uses specific models to generate proper booru tags out of any image you pass to it.

            More importantly, I know for sure they have this capability in practice, because if you tap the right way in the right app, when the Moon is in just the right phase, both Samsung Gallery and OneDrive Photos do (or, in the case of OneDrive, used to):

            - Provide occasional completions and suggestions for predefined categories, like "sunset" or "outwear" or "people", etc.;

            - Auto-tag photos with some subset of those (OneDrive, which also sometimes records it in metadata), or if you use "edit tag" options, suggest best fitting tags (Samsung);

            - Have a semi-random list of "Things" to choose from to categorize your photos, such as "Sunsets", "City", "Outdoors", "Room", etc. Google Photos does that one too.

            This shows they do maintain a list of correct and recommended classifications. They just choose to keep it hidden.

            With regards to face recognition, it's even worse. There are zero controls and zero information, other than the occasionally matched (and often mismatched) face under photo properties, which you can sometimes delete.

    • buran77 3 days ago

      The "memories" part can be trivially done locally and probably is, it's really just reading the picture's "date taken", so it's conceptually as easy as a "sort by date". My old Android with whatever Photos app came with it (not Google's) shows this despite being disconnected for so long.

      There's nothing stopping either Apple or Google from giving users an option to just disable these connected features, globally or per-app. Just allow a "no cloud services" toggle switch in the Photos app, get the warning that $FEATURES will stop working, and be done.

      I know why Google isn't doing this: they're definitely monetizing every bit of that analyzed content. Not really sure about Apple though; it might be that they consider their setup with HE to be on par with no cloud connectivity, privacy-wise.

      • voidUpdate 3 days ago

        "memories" constantly given me notifications about "similar shots" at random, so I'm assuming it is trying to analyse the content of the photos. I managed to disable the notifications, but not the actual analysis

      • Someone 3 days ago

        > The "memories" part can be trivially done locally and probably is, it's really just reading the picture's "date taken", so it's conceptually as easy as a "sort by date".

        It’s more than that. It can also create memories like “trip to New York in 2020”, “Cityscapes in New York over the years”, or “Peter over the years” (with Peter being a person added to Photos).

    • Aachen 3 days ago

      No Android phone I've ever owned automatically uploaded your photos without asking. What exactly do you mean that it does too?

    • ranguna 3 days ago

      Uninstall Google Photos and install a dumb photos app. I think most Android phones don't even come with Google Photos preinstalled.

      • TheSpiceIsLife 3 days ago

        Dumb Photo App by Nefarious DataExfiltration Co & Son

        • tcfhgj 3 days ago

          Fossify gallery

        • ThePowerOfFuet 3 days ago

          This is what the "Allow Network permission" checkbox in the app installation dialog on GrapheneOS is for.

    • numpad0 3 days ago

      Uninstall (or disable) the stock Google Photos app and install `gallery2.apk`. You can download one from sketchy GitHub repos, or I think you can alternatively extract it from an emulator image.

      • nine_k 3 days ago

        Why, install a non-sketchy open-source gallery app from F-Droid.

  • TeMPOraL 3 days ago

    What if you're a user and you're fed up with all the "magic"? What if you want a device that works reliably, consistently, and in ways you can understand from empirical observation if you pay attention?

    Apple, Google, Microsoft and Samsung all seem to be tripping over each other in an effort to make this whole thing as ass-backwards as possible. Here is how, IMHO, it should work:

    1) It scans stuff, detects faces and features. Locally or in the cloud or not at all, as governed by an explicit opt-in setting.

    2) Fuck search. Search is not discoverable. I want to browse stuff. I want a list of objects/tags/concepts it recognized. I want a list of faces it recognized and the ability to manually retag them, and manually mark any that they missed. And not just a list of 10 categories the vendor thinks are most interesting. All of them. Alphabetically.

    3) If you insist on search, make it work. I type in a word, I want all photos tagged with it. I click on a face, I want all photos that have matching face on it. Simple as that. Not "eventual consistency", not "keep refreshing, every 5th refresh I may show you a result", or other such breakage that's a staple experience of OneDrive Photos in particular.

    Don't know about Apple, but Google, Microsoft and Samsung all refuse #2, and spectacularly fail at #3, and the way it works, I'm convinced it's intentional, as I can't even conceptualize a design that would exhibit such failures naturally.

    EDIT:

    4) (A cherry on a cake of making a sane product that works) Recognition data is stored in photo metadata, whether directly or in a sidecar file, in any of a bunch of formats sane people use, and is both exported along with the photos, and adhered to when importing new photos.
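
    For point 4, a rough sketch of what such a sidecar could look like; this writes keyword tags in the XMP dc:subject style that most photo tools understand (the filename and tag list are made up):

        def write_sidecar(photo_path, tags):
            # One <rdf:li> per recognized tag, in the standard XMP "keywords" bag.
            items = "".join(f"<rdf:li>{t}</rdf:li>" for t in tags)
            xmp = (
                '<x:xmpmeta xmlns:x="adobe:ns:meta/">'
                '<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#">'
                '<rdf:Description xmlns:dc="http://purl.org/dc/elements/1.1/">'
                f'<dc:subject><rdf:Bag>{items}</rdf:Bag></dc:subject>'
                '</rdf:Description></rdf:RDF></x:xmpmeta>'
            )
            with open(photo_path + ".xmp", "w", encoding="utf-8") as f:
                f.write(xmp)

        write_sidecar("IMG_0001.jpg", ["dog", "beach", "sunset"])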

    • warkdarrior 3 days ago

      > What if you're a user and you're fed up with all the "magic"?

      This is a completely hypothetical scenario. If users with such requirements actually existed, PinePhones and similar devices would be significantly more popular.

      • TeMPOraL 3 days ago

        It's not hypothetical. Plenty of open source software tries to address it. For example, DigiKam does everything I listed 100% right. Problem is, it's desktop-only and geared for local photos. An equivalent solution could exist for phones and handle cloud albums, but the mobile and cloud vendors don't want to do it, and make it hard on purpose for any third party to try.

      • layer8 3 days ago

        It’s absolutely not hypothetical.

  • plandis 3 days ago

    You can vote with your wallet and get a Pine Phone or something similar, I guess.

  • oulipo 3 days ago

    Well, not vouching for automated scanning or whatever, but the advantage of homomorphic encryption is that, besides the power usage for the computation and the bandwidth to transmit the data, Apple doesn't learn anything about what's in your photos; only you can. So even if you don't use the feature, the impact on you is minimal.

  • abtinf 3 days ago

    So don’t use the photos app. Just get an alternative camera app and you bypass all of this.

    • thisislife2 3 days ago

      It's enabled by default, so you can't "bypass" it unless you are aware that you can turn it off. If you don't turn it off, it will continue to scan your photos and upload the data to Apple, whether you use the Photos app or not. (And, by the way, if the option to "learn from this app" is enabled (which is, again, on by default), iPadOS / iOS will also be intrusively collecting data on how you use that alternative camera app too...)

hoppp 3 days ago

It's using Concrete from Zama.

I didn't like their license because it's BSD-3-Clause-Clear but then they state:

"Zama’s libraries are free to use under the BSD 3-Clause Clear license only for development, research, prototyping, and experimentation purposes. However, for any commercial use of Zama's open source code, companies must purchase Zama’s commercial patent license"

So it's not free; you need to pay for a patent license, and they don't disclose how much.

I recommend OpenFHE as an alternative free open-source solution. I know it's C++ and not Rust, but there's no patent license, and it can do the same thing the blog post wants to do. It even has more features, like proxy re-encryption, which I think Concrete can't do.

  • nine_k 3 days ago

    How is this "BSD-licensed but only for research" not self-contradictory?

    It's like saying: "FREE* candy! (Free to look at, eating is $6.95 / pound)"

    • gus_massa 3 days ago

      They use the patent loophole. From https://www.zama.ai/post/open-source

      > If a company open sources their code under BSD3-clear, they can sell additional licenses to use the patents included in the open source code. In this case, it still isn’t the software that is sold, but rather the usage rights of the patented intellectual property it contains.

      Every day I like the Apache licence more.

    • nabla9 3 days ago

      BSD+"additional clause" is not BSD.

      Just like 3+1 is not 3.

      • eli 3 days ago

        Wouldn't the patent still be a problem with the standard BSD license? BSD would grant you license to redistribute the software but not necessarily the patent rights to use it.

        • kragen 2 days ago

          The BSD license permits redistribution and use, with or without modification. This permission is conditional on a copyright-related thing (among other things), but there's nothing suggesting that the license is limited to copyright-related rights. If someone grants you a BSD license to their software and then sues you for patent infringement for redistributing or using it, it seems like they'd be likely to lose the lawsuit because they granted you a license to do what you're doing.

          On the other hand, there's nothing explicitly stating that the permission is intended to extend to practicing the patents embodied in the software. That's just an inference any reasonable person would draw from the language of the license. It may be better to state it explicitly, as the Apache license does.

          But it may be worse, because longer licenses contain more to argue over, and once you start listing the particular causes of action you're promising not to sue for, you may miss some. Suppose your program is for chemistry and includes a list of solubility parameters for different compounds. If someone copies that, that's a potential cause of action under national laws implementing the sui generis database rights in the EU Database Protection directive: https://www.eumonitor.eu/9353000/1/j4nvk6yhcbpeywk_j9vvik7m1... which postdates the authorship of the BSD license and isn't mentioned in the Apache License 2.0 either. Plausibly the explicit listing of copyright rights and patent rights in the license will incline courts to decide that no license to database rights was intended.

          Some future legislation will presumably create new intellectual property restrictions we currently can't imagine, even if it also removes some of the ones we currently suffer from.

          (A separate issue is that the patent holder may not be the person who granted the license or indeed have contributed in any way to the creation of the BSD-licensed software, which is just as much of a problem with the Apache license.)

          Issues like these require thoughtful deliberation, and unfortunately the Reddit format adopted by HN makes that impossible—in fact, the editing and replying deadlines added for HN make it a medium even less amenable to such discussions.

          • eli 2 days ago

            Good point. I remembered the 3 clauses correctly but forgot the first sentence mentions "use" along with redistribution. IANAL but it seems common sense that "use is permitted" implies some sort of patent license. Seems tricky though -- do the people I redistribute my modified version to also get a license? Could I avoid needing a patent license for my unrelated project if I embed Concrete?

            Concrete's lawyers must believe that BSD doesn't grant patent rights. The Concrete license.txt is straight BSD, but the Readme says it only applies in certain situations. So is it BSD licensed or not? If that statement about patents in the Readme is load-bearing then what's stopping me from forking the project and removing it?

            • kragen 2 days ago

              The BSD license is available to the general public, so everyone you could redistribute to already has a license.

              In the linked post, they say, "the original BSD3 license did not mention patents at all, creating an ambiguity that the BSD3-clear version resolves", which has an additional clause beginning, "NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS LICENSE." Presumably if Metacarta's lawyers really did believe that BSD doesn't grant patent rights as they claim, they wouldn't have gone to the trouble to edit it to remove that implicit grant. And if Concrete's lawyers really did believe it, they probably would have gone with the actual open-source license everyone recognizes.

  • Ar-Curunir 3 days ago

    Concrete from Zama is a completely different algorithm to the one used in this product; the former uses the CKKS algorithm, while the latter is the BFV algorithm.

    • hoppp 2 days ago

      OpenFHE supports all major FHE schemes, including the BGV, BFV, CKKS, DM (FHEW), and CGGI (TFHE) schemes.

      Copied from openfhe.org

      Or maybe that's not what you meant...

  • commandersaki 3 days ago

    Source? I'm unconvinced. They have been posting stuff about implementing HE primitives in Swift as of last year.

    • j2kun 3 days ago

      Zama has already hit competitors with (French) patent charges. Apple's HE implementation is in Swift and uses BFV, which is very different HE from anything Zama does and doesn't use their source.

      • commandersaki 3 days ago

        Yeah that was what I thought. I've seen their engineers also push for an employment drive for more engineers in the HE space, so I assume they're going to expand its use where applicable, building from the ground up.

j16sdiz 4 days ago

> The two hops, the two companies, are already acting in partnership, so what is there technically in the relay setup to stop the two companies from getting together—either voluntarily or at the secret command of some government—to compare notes, as it were, and connect the dots?

The OHTTP scheme does not _technically_ prevent this. It increases the number of parties that need to cooperate to extract this information, in the hope that such collusion would be caught somewhere in the pipeline.
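
A toy illustration of that split of knowledge, using symmetric Fernet keys from the `cryptography` package as a stand-in for OHTTP's actual HPKE encapsulation: the relay learns who is asking but not what, the target learns what is asked but not by whom, and only joining both logs links the two.

    from cryptography.fernet import Fernet

    target_key = Fernet.generate_key()   # stands in for the target's published key config
    to_target = Fernet(target_key)

    # Client: encrypt the query so only the target can read it, hand it to the relay.
    client_ip = "203.0.113.7"
    encapsulated = to_target.encrypt(b"lookup: photo cluster #4711")

    # Relay: sees the client's address and an opaque blob, and just forwards the blob.
    relay_log = {"from": client_ip, "payload": encapsulated}      # no plaintext here

    # Target: sees the plaintext query, but only knows it came from the relay.
    target_log = {"from": "relay.example.net",
                  "query": to_target.decrypt(relay_log["payload"])}

    print(relay_log["from"], target_log["query"])
    # Only by combining relay_log and target_log can anyone tie client_ip to the
    # query, which is exactly the collusion being discussed above.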

  • chikere232 3 days ago

    And if a government has, say, already forced ISPs to collect metadata about who connects to whom and when, I imagine they don't even need to bother getting data from the relay hosts.

ted537 4 days ago

It's cool how neural networks, even convolutional ones, are one of the few applications that you can compute through homomorphic encryption without hitting a mountain of noise/bootstrapping costs. Minimal depth hurrah!

  • j2kun 3 days ago

    I don't think Apple is doing this. They compute the embedding on device in the clear, and then just do the nearest-neighbor part in HE (which does lots of dot products but no neural network).

    There are people doing NNs in HE, but most implementations do indeed require bootstrapping, and for that reason they usually use CKKS.

sillysaurusx 4 days ago

I was going to make my usual comment of FHE being nice in theory but too slow in practice, and then the article points out that there’s now SHE (somewhat homomorphic encryption). I wasn’t aware that the security guarantees of FHE could be relaxed without sacrificing them. That’s pretty cool.

Is there any concrete info about noise budgets? It seems like that’s the critical concern, and I’d like to understand at what point precisely the security breaks down if you have too little (or too much?) noise.

  • 3s 4 days ago

    SHE vs FHE has nothing to do with security. Instead, it’s about how many operations (e.g. homomorphic multiplications and additions) can be performed before the correctness of the scheme fails due to too much noise accumulating in the ciphertext. Indeed, all FHE schemes are also SHE schemes.

    What typically makes FHE expensive computationally is a “bootstrapping” step for removing the noise that has accumulated after X operations and threatens correctness. After bootstrapping you can do another X operations. Rinse and repeat until you finish the computation you want to perform.
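
    A toy version of that noise story, loosely modeled on the symmetric integer ("DGHV-style") scheme from the bootstrapping literature, with nothing like real parameters: each ciphertext hides a bit inside some noise, additions add the noise, multiplications multiply it, and once the noise outgrows the secret key, decryption silently goes wrong. Bootstrapping exists to reset the noise before that happens.

        import random

        p = random.getrandbits(64) | 1           # secret key: a large odd integer

        def encrypt(bit, noise_bits=8):
            q = random.getrandbits(256)
            r = random.getrandbits(noise_bits)   # small noise
            return p * q + 2 * r + bit

        def decrypt(c):
            return (c % p) % 2                   # correct only while the noise stays below p

        def noise(c):
            return c % p                         # how much of the "budget" is used up

        a, b = encrypt(1), encrypt(1)
        print(decrypt(a + b), decrypt(a * b))    # homomorphic XOR and AND: prints 0 1

        # Multiplication roughly squares the noise, so the depth is limited ("somewhat" HE):
        c = encrypt(1)
        for _ in range(8):
            c = c * c
            print(noise(c).bit_length(), decrypt(c))
        # Once the noise passes p, decrypt(c) is garbage; an FHE scheme would
        # "bootstrap" at that point to shrink the noise and keep going.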

  • bawolff 4 days ago

    I'm not an expert on this, but my understanding is that "noise" is less a security breakdown and more that the entire system breaks down. That is where the "somewhat" comes in: unlike "full", where the system can do (expensive) things to get rid of noise, in "somewhat" the noise just accumulates until the system stops working. (Definitely talking out of my hat here.)

    • sillysaurusx 4 days ago

      Interesting! Are there any security concerns with SHE? If not, it sounds like all of the benefits of FHE with none of the downsides, other than the noise overwhelming the system. If that’s true, and SHE can run at least somewhat inexpensively, then this could be big. I was once super hyped about FHE till I realized it couldn’t be practical, and this has my old excitement stirring again.

      • Ar-Curunir 4 days ago

        Most FHE schemes are constructed out of SHE schemes. Also, there’s nothing preventing FHE from being practical; it’s just that existing constructions are not as fast as we would like them to be.

      • bawolff 4 days ago

        My impression is that SHE schemes are still relatively expensive, not as crazy as FHE, but still slow enough to preclude many use cases, and the noise breakdown can happen relatively quickly, making them not work for most algorithms people want to use FHE for.

      • j2kun 3 days ago

        Wait until you see all the ASICs getting taped out by various companies right now.

  • ruined 4 days ago

    It's incredibly algorithm-dependent. If you look into the thesis that originated the 'bootstrapping' technique to transform SHE algorithms into FHE, they determine the noise limit of their specific algorithm in section 7.3 and then investigate expanding the noise limit in 8 and 10.

    (written in 2009) http://crypto.stanford.edu/craig/craig-thesis.pdf

    Some newer FHE schemes don't encounter a noise limit or don't use the bootstrapping technique.

    • Ar-Curunir 4 days ago

      All known FHE schemes use bootstrapping

      • ruined 4 days ago

        I expected that, but a search turned up several things claiming to implement FHE without bootstrapping. I didn't investigate and I can't say I'm familiar with them, so maybe they're bogus.

  • Ar-Curunir 4 days ago

    SHE doesn’t relax security guarantees, it relaxes the class of supported computations

  • twodave 3 days ago

    Not sure whether they use FHE or not (the literature I’m looking at simply says “homomorphic”), but we use ElectionGuard at our company to help verify elections for our customers. So there are definitely some practical uses.

timsneath 4 days ago
  • GeekyBear 4 days ago

    > One example of how we’re using this implementation in iOS 18, is the new Live Caller ID Lookup feature, which provides caller ID and spam blocking services. Live Caller ID Lookup uses homomorphic encryption to send an encrypted query to a server that can provide information about a phone number without the server knowing the specific phone number in the request.

    Privacy by design is always nice to see.
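
    A toy of the idea behind that lookup (Apple's real implementation uses BFV and a lot of engineering on top; this sketch just uses the additively homomorphic python-paillier `phe` package and a four-entry database): the client encrypts a one-hot selector over the database index, the server computes a dot product with its plaintext values, and only the client can read the result.

        from phe import paillier

        pubkey, privkey = paillier.generate_paillier_keypair(n_length=2048)

        # Server's database: phone number -> spam score (higher = more reports).
        numbers = ["5550-0001", "5550-0002", "5550-0003", "5550-0004"]
        spam_score = [2, 97, 5, 88]

        # Client wants the entry for 5550-0002 without revealing which entry that is,
        # so it sends an encrypted one-hot selector (all ciphertexts look alike).
        wanted = numbers.index("5550-0002")
        selector = [pubkey.encrypt(1 if i == wanted else 0) for i in range(len(numbers))]

        # Server: dot product of the encrypted selector with its plaintext scores.
        # It only ever handles ciphertexts, so it never learns `wanted`.
        enc_answer = selector[0] * spam_score[0]
        for s, v in zip(selector[1:], spam_score[1:]):
            enc_answer = enc_answer + s * v

        print("spam score:", privkey.decrypt(enc_answer))   # 97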

    • gus_massa 3 days ago

      I don't understand how it can work. I assume the spam list is shared by all users, otherwise it would not be useful at all:

      Let's suppose Apple is evil (or they receive an order from a judge) and they want to know who is calling 5555-1234:

      1) Add a new empty "spam" numbers encrypted database to the server (so there are now two encrypted databases in the system)

      2) Add the encrypted version of 5555-1234 to it.

      3) When someone checks, reply with the correct answer from the real database, and also check in the second one and send the result to the police.

      • GeekyBear 3 days ago

        > they recive an order from a judge

        You can't be forced to hand over customer data after you have designed a system so that your servers don't ever have that information stored in the first place, court order or no.

      • lilyball 3 days ago

        The reply is encrypted. Apple doesn't know what it says. Neither would the police.

      • jakelazaroff 3 days ago

        How would they decrypt the answer from the database?

AshamedCaptain 3 days ago

There is another reason I dislike this, which is that Apple now has a reason for "encrypted" data to be sent randomly, or at least every time you take a picture. If in the future they silently change the Photos app (a real risk that I have really emphasized in the past), they can silently pass along a hash of the photo and no one would be the wiser.

If an iPhone was not sending any traffic whatsoever to the mothership, at least it would ring alarm bells if it suddenly started doing so.

  • commandersaki 3 days ago

    Isn't this the same argument as saying they can change any part of the underlying OS and compromise security by exfiltrating secret data? Why is this specific to the Photos feature?

    • cryptonector 3 days ago

      No. GP means that if the app was not already phoning home then seeing it phone home would ring alarm bells, but if the app is always phoning home if you use it at all then you can't see "phoning home" as an alarm -- you either accept it or abandon it.

      Whereas if the app never phoned home and then upon upgrade it started to then you could decide to kill it and stop using the app / phone.

      Of course, realistically <.00001% of users would even check for unexpected phone home, or abandon the platform over any of this. So in a way you're right.

      • commandersaki 3 days ago

        The post also said that, now that phoning home isn’t an alarm, Apple could subvert the Photos app by passing along a hash of the photo (presumably sensitive data). My contention is that Apple could do that for virtually any app that talks to the mothership; it is not unique to Photos.

        • AshamedCaptain 2 days ago

          Which is why I point out the dangers of accepting this behavior as normal. I'm assuming you mean they could siphon the hashes of my photos through any other channel (e.g. even when calling the mothership to check for updates), but this is not entirely true. For example, were I to take a million photos, such traffic would suspiciously increase proportionally.

          If you accept that every photo captured will send traffic to the mothership, like the story here, then that is no longer something you can check, either.

          In any case, as others have mentioned, no one cares. In fact, I could argue that the scenario I'm forecasting is exactly what has already happened: the photos app suddenly started sending opaque blobs for every photo captured. A paranoid guy noticed this traffic and asked Apple about it. Apple replied with a flimsy justification, but users then go to ridiculous extremes to justify that this is not Apple spying on them, but a new super-secret-magic-sauce that cannot possibly be used to exfiltrate their data, despite the fact that Apple has provided exactly 0 verifiable assurances about it (and in fact has no way to do so). And the paranoid guy will no longer be able to notice extra per-photo traffic in the future.

  • doublerabbit 3 days ago

    And they do silently change the applications. Maps has been updated for me via A/B testing. Messaging too.

    • twodave 3 days ago

      Any app can do this really, just can’t update the entitlements and a few other things. I would think it unlawful for Apple’s own apps to have access to functionality/apis that others don’t…

williamtrask 3 days ago

I love homomorphic encryption, but why can't they just do a neural search locally?

- iOS Photos -> Vectors

- Search Query "Dog photos" -> Vectors

- Result (Cosine Similarity): Look, some dog photos!

iPhones have plenty of local storage and compute power for doing this kind of thing when the phone is idle. And cosine similarity can work quickly at runtime.
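
That pipeline is small enough to sketch with NumPy alone; the random vectors below stand in for whatever on-device image/text encoder actually produces the embeddings (a CLIP-style model would be the usual choice, but that part is an assumption here):

    import numpy as np

    rng = np.random.default_rng(42)
    DIM = 512

    # Offline / while idle: embed every photo once and keep a normalized index.
    photo_names = [f"IMG_{i:04d}.jpg" for i in range(10_000)]
    index = rng.normal(size=(len(photo_names), DIM))
    index /= np.linalg.norm(index, axis=1, keepdims=True)

    def search(query_embedding, top_k=5):
        # Cosine similarity is just a matrix-vector product on normalized vectors.
        q = query_embedding / np.linalg.norm(query_embedding)
        scores = index @ q
        best = np.argsort(scores)[::-1][:top_k]
        return [(photo_names[i], float(scores[i])) for i in best]

    query = rng.normal(size=DIM)   # stand-in for the embedding of "dog photos"
    for name, score in search(query):
        print(f"{score:.3f}  {name}")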

  • dialup_sounds 3 days ago

    Apparently they only do the cloud HE song and dance for landmarks, which is too big of a data set to realistically keep on-device.

  • internetter 3 days ago

    I discuss this in the post:

    > This seems like a lot of data the client is getting anyway. I don’t blame you for questioning if the server is actually needed. The thing is, the stored vectors that are compared against are by far the biggest storage user. Each vector can easily be multiple kilobytes. The paper discusses a database of 35 million entries divided across 8500 clusters.
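
    (For scale, a rough back-of-envelope using the figures quoted above; the ~2 KiB per vector is an assumption for illustration, not a number from the paper:)

        # Storage estimate from the figures quoted above; the ~2 KiB per
        # vector is an illustrative assumption, not a number from the paper.
        entries = 35_000_000            # database entries
        clusters = 8_500                # clusters
        bytes_per_vector = 2 * 1024     # assumed embedding size

        total_gib = entries * bytes_per_vector / 1024**3
        cluster_mib = (entries / clusters) * bytes_per_vector / 1024**2
        print(f"whole database: ~{total_gib:.0f} GiB")   # ~67 GiB
        print(f"one cluster:    ~{cluster_mib:.0f} MiB") # ~8 MiB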

  • jerf 3 days ago

    Because the blog post needs some sort of concrete example to explain, but all concrete examples of fully homomorphic encryption are, at the moment, generally done better locally due to the extreme costs of FHE.

  • orf 3 days ago

    That’s what they do

rkagerer 3 days ago

This would be even more exciting if there were some way to guarantee your phone, the servers, etc. are running untampered implementations, and that the proxies aren't colluding with Apple.

  • avianlyric 3 days ago

    If someone or something can tamper with your phone, then nobody needs to collude with proxies or Apple. They can just ask your phone to send them exactly what they want, without all the homomorphic encryption dance.

    The idea that Apple is going to use this feature to spy on you, completely misses the fact that they own the entire OS on your phone, and are quite capable of directly spying on you via your phone if they wanted to.

  • cryptonector 3 days ago

    Upgrades have to be possible. What you probably want is attestation that you're running a generally available version that other users run too, as opposed to one specially made for you. But since a special version could be made for everyone subject to surveillance, even that wouldn't be enough.

    I'm not sure there's a way out of this that doesn't involve open source and reproducible builds (and watch out for Reflections on Trusting Trust).

t43562 3 days ago

I never even knew images could be searched this way on a phone, and the iPhone users in my family don't know either.

A huge privacy-bruising feature for nothing in our case.

  • OkGoDoIt 3 days ago

    I’ve been using it a lot recently. Multiple times even today while I’ve been trying to find just the right photos of my theater for a brochure I’m putting together. I have over 100,000 photos in Apple photos so even if I vaguely remember when I took a photo it’s still difficult to find it manually.

    As a concrete example, someone on my team today asked me “can you send me that photo from the comedy festival a couple years ago that had the nice projections on the proscenium?”. I searched apple photos (on my phone, while hiking through a park) for “sketchfest theater projection”. It used the OCR to find Sketchfest and presumably the vector embeddings of theater and projection. The one photo she was referring to was the top result. It’s pretty impressive.

    It can’t always find the exact photo I’m thinking of the first time, but I can generally find any vaguely-remembered photo from years ago without too much effort. It is pretty magical. You should get in the habit of trying it out, you’ll probably be pleasantly surprised.

  • vrtx0 3 days ago

    Do you mean the ability to search in Apple Photos is “privacy-bruising”, or are you referring to landmark identification?

    If the latter, please note that this feature doesn’t actually send a query to a server for a specific landmark — your device does the actual identification work. It’s a rather clever feature in that sense…

vrtx0 4 days ago

Is the Apple Photos feature mentioned actually implemented using Wally, or is that just speculation?

From a cursory glance, the computation of centroids done on the client device seems to obviate the need for sending embedded vectors of potentially sensitive photo details — is that incorrect?

I’d be curious to read a report of how on-device-only search (using latest hardware and software) is impacted by disabling the feature and/or network access…

  • aeontech 3 days ago

    According to this post on Apple's Machine Learning blog, yes, Wally is the method used for this feature.

    https://machinelearning.apple.com/research/homomorphic-encry...

    • vrtx0 3 days ago

      Thank you! This is exactly the information the OP seems to have missed. It seems to confirm my suspicion that the author’s concerns about server-side privacy are unfounded — I think:

      > The client decrypts the reply to its PNNS query, which may contain multiple candidate landmarks. A specialized, lightweight on-device reranking model then predicts the best candidate…

      [please correct me if I missed anything — this used to be my field, but I’ve been disabled for 10 years now, so grain of salt]
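
      (For illustration, a rough sketch of the round trip that quote describes. Every function here is a stand-in rather than Apple's API, and no real encryption happens; it only shows the shape: encrypt the query on device, let the server compute blindly, then decrypt and rerank locally.)

          from dataclasses import dataclass

          @dataclass
          class Candidate:
              landmark: str
              score: float

          def he_encrypt(vec):
              # stand-in: pretend this homomorphically encrypts the embedding
              return ("ciphertext", vec)

          def server_pnns(ct):
              # the server only ever sees ciphertext; here it returns dummy
              # "encrypted" candidates without looking at the query
              return ("encrypted_reply", [Candidate("Eiffel Tower", 0.71),
                                          Candidate("Tokyo Tower", 0.64)])

          def he_decrypt(reply):
              return reply[1]

          def rerank_on_device(candidates, embedding):
              # the "lightweight on-device reranking model" from the quote,
              # reduced here to picking the highest score
              return max(candidates, key=lambda c: c.score)

          embedding = [0.1, 0.2, 0.3]                 # computed on device
          reply = server_pnns(he_encrypt(embedding))  # only ciphertext leaves
          print(rerank_on_device(he_decrypt(reply), embedding).landmark)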

      • chikere232 3 days ago

        The devil is in the proprietary details though.

        • vrtx0 3 days ago

          Sorry, what do you mean by “proprietary details”?

          • sbuk 3 days ago

            They are alluding to the fact that the implementation is closed source, and therefore "untrustworthy". It's a trite point, of course, but not without some merit.

            • vrtx0 3 days ago

              I don’t see any merit, honestly. That would assume one is able to audit every bit of code they run, including updates, and control the build system.

              I mean, the Wally paper contains enough information to effectively implement homomorphic encryption for similar purposes. The field was almost entirely academic ~12 years ago…

              I miss talking shop on HN. Comments like that are why we can’t have nice things.

              • sbuk 3 days ago

                I do agree that everything is politicized. I'd have liked to see an explanation for laypeople and perhaps the option being opt-in. To me, there is some merit in that stance. It is a side note. It is a shame that we can't talk about these things openly without people getting offended.

  • chikere232 3 days ago

    You have to be quick if you want to disable the feature, as the scan starts on OS install, and disabling requires you to actively open the Photos app and turn it off.

antman 3 days ago

Is there any library for e.g. FHE hash table lookup or similar? I have seen papers but have not seen a consensus on what a useful implementation is.

_verandaguy 3 days ago

I'm not an expert in homomorphic encryption by any stretch (and I'm arguably the target audience for this blog post — a curious novice), but there's one thing I don't quite get from this post.

In the "appeal to cryptographers" section (which I really look forward to being fulfilled by someone, hopefully soon!), HE is equated to post-quantum cryptography. As far as I know, most current post-quantum encryption focuses on the elimination of Diffie-Hellman schemes (both over finite fields and over elliptic curves) since those are vulnerable to Shor's algorithm.

However, it's clear from the code samples later in the post (and not explained in the text, afaict) that a public key gets used to re-encrypt the resultant value of a homomorphic add or multiply.

Is this a case of false equivalence (in the sense that HE != post-quantum), or is it more the case that there's some new asymmetric cryptography scheme that's not vulnerable to Shor's?

  • j2kun 3 days ago

    All modern HE schemes rely on post-quantum crypto. For example, the ring-LWE problem used by BFV is the same as Kyber (ML-KEM) but with different parameters.

    The twist in FHE is that the server also has an encryption of the user's secret key, which adds an assumption called "circular security", and that's needed to do some homomorphic operations like key switching.
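
    (For a concrete feel of the LWE structure, here is a toy, deliberately insecure, non-ring sketch, nothing like production BFV and with no evaluation keys or key switching, showing the noisy inner-product encryption and why ciphertext addition comes almost for free:)

        import numpy as np

        # Toy, insecure, non-ring LWE encryption of single bits. Not BFV:
        # no evaluation keys, no key switching, tiny parameters chosen only
        # so that the demo decrypts correctly.
        N, Q = 64, 3329                      # dimension and modulus
        rng = np.random.default_rng(1)
        s = rng.integers(0, Q, N)            # secret key

        def encrypt(bit):
            a = rng.integers(0, Q, N)                   # fresh randomness
            e = int(rng.integers(-2, 3))                # small noise
            b = (int(a @ s) + e + bit * (Q // 2)) % Q   # b = <a,s> + e + m*q/2
            return a, b

        def decrypt(ct):
            a, b = ct
            centered = (b - int(a @ s)) % Q             # recovers e + m*q/2
            return 1 if Q // 4 < centered < 3 * Q // 4 else 0

        def add(ct1, ct2):
            # homomorphic addition is just coordinate-wise addition;
            # the noise terms add up, which is why noise growth matters
            (a1, b1), (a2, b2) = ct1, ct2
            return (a1 + a2) % Q, (b1 + b2) % Q

        print(decrypt(add(encrypt(1), encrypt(0))))     # -> 1, i.e. (1 + 0) mod 2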

    • _verandaguy 3 days ago

      Right on, thanks for the explanation!

      So what gets called the "public key" in the blog post is just the (self?-)encrypted secret key from the user?

      I'll read up on your other points after work -- appreciate the search ledes :)

      • j2kun 3 days ago

        The public key is also just like a normal public key; the encrypted secret key is a separate thing, often called an evaluation key or a key-switching key, among other names. (It's also public in the security sense.)

ge96 3 days ago

This is a neat topic I want to get into more myself

Searching encrypted data is something I've wondered about; in the past I had to decrypt everything before I could use a standard SQL LIKE search.

Funny post today about cosine similarity

JayShower 3 days ago

This is so cool! I first learned about homomorphic encryption in the context of an election cybersecurity class, and it seemed so pie-in-the-sky, something unlikely to ever be used for general practical purposes and only ever in very niche areas. Seeing a big tech company apply it in a core product like this really does feel like a step in the right direction towards taking back some privacy.

k__ 3 days ago

Don't they have TEE?

  • j2kun 3 days ago

    Doing it on device would require too much data, and TEEs have side-channel vulnerabilities that make them difficult to deploy securely in prod.

renecito 3 days ago

oh, if Apple had a very simple setting to toggle off sending photos to iCloud, oh, wait!!!

sneak 4 days ago

> This should be fine: vectorization is a lossy operation. But then you would know that Amy takes lots of pictures of golden retrievers, and that is a political disaster.

This downplays the issue. Knowing that Alice takes lots of screenshots of Winnie the Pooh memes means that Alice’s family gets put into Xinjiang concentration camps, not just a political disaster.

(This is a contrived example: iCloud Photos is already NOT e2ee and this is already possible now; but the point stands, as this would apply to people who have iCloud turned off, too.)

  • troad 3 days ago

    Agreed. And for a less contrived example, people may have photos of political protests that they attended (and the faces of others present), screenshots that include sensitive messages, subversive acts, etc.

    It's worth noting though that it's now possible to opt in to iCloud Photo e2ee with "Advanced Data Protection". [0]

    [0] https://support.apple.com/en-us/102651

    • sneak 3 days ago

      iCloud Photo e2ee still shares hashes of the plaintext with Apple, which means they can see the full list of everyone who has certain files, even if they all have e2ee enabled. They can see who had it first, and who got it later, and which sets of iCloud users have which unique files. It effectively leaks a social graph.

      It’s also not available in China.

      • troad 3 days ago

        Interesting, good to know.

  • saagarjha 3 days ago

    That's the joke/implication.

RicoElectrico 3 days ago

[flagged]

  • barnabee 3 days ago

    This is about additional new photo search capabilities that are enabled by default and powered by sending (encrypted) data derived from your photos to the cloud, not locally powered AI search.

    • solsane 3 days ago

      To be fair, it sounds like it's both: local, AI-powered 'neural hashing' + SHE.

    • sbuk 3 days ago

      *Homomorphically encrypted. Be precise. Deliberately handwaving it away as "mere" encryption paints a worse picture than the reality. Yes, it possibly should have been opt-in, and a more human-friendly explanation of how it works would help. However, it's not nearly as bad as HN would have you believe. Very much a case of pointing out that the emperor is naked when in fact he's only missing a few shirt buttons...

eviks 4 days ago

> There is no trust me bro element.
> Barring some issue being found in the math or Apple’s implementation of it

Yes, if you bar the "trust me bro" element in your definition, you'll by definition have no such element.

Reality, though, doesn't care about your definition, so in reality this is exactly the "trust me bro" element that exists.

> But we’re already living in a world where all our data is up there, not in our hands.

If that's your real view, then why do you care about all this fancy encryption at all? It doesn't help if everything is already lost.

  • rpearl 4 days ago

    I mean if you'd like, you could reimplement the necessary client on an airgapped computer, produce an encrypted value, take that value to a networked computer, send it to the server, obtain an encrypted result that the server could not possibly know how to decrypt, and see if it has done the computation in question. No trust is required.

    You could also observe all bits leaving the device from the moment you initialize it and determine that only encrypted bits leave and that no private keys do. That leaves only the gap of some side channel at the factory, but you could perform the calculation to verify that the bits are encrypted with the key you expect them to be encrypted with.
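
    (As a self-contained illustration of that "no trust required for the computation" point, using textbook Paillier, a much simpler additively homomorphic scheme than the lattice-based one Apple uses: the "server" below adds two numbers it cannot read, and the client checks the decrypted result.)

        import math, random

        # Textbook Paillier with tiny demo primes (NOT secure), to show a
        # server computing on ciphertexts it has no way to decrypt, with the
        # client checking the decrypted result afterwards.
        p, q = 61, 53                    # toy primes; real keys use huge primes
        n, n2 = p * q, (p * q) ** 2
        lam = math.lcm(p - 1, q - 1)
        g = n + 1                        # standard simplified generator
        mu = pow(lam, -1, n)             # modular inverse of lambda mod n

        def encrypt(m):
            while True:
                r = random.randrange(1, n)
                if math.gcd(r, n) == 1:
                    break
            return (pow(g, m, n2) * pow(r, n, n2)) % n2

        def decrypt(c):
            return ((pow(c, lam, n2) - 1) // n * mu) % n

        def server_add(c1, c2):
            # the server only ever sees ciphertexts; multiplying them mod n^2
            # adds the underlying plaintexts
            return (c1 * c2) % n2

        m1, m2 = 12, 30
        result = decrypt(server_add(encrypt(m1), encrypt(m2)))
        assert result == m1 + m2         # client-side verification: 42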

    • Krasnol 4 days ago

      How is this comment useful to the OP's valid arguments?