• tal@lemmy.today
    5 months ago

    Kids “easily traceable” from photos used to train AI models, advocates warn.

    I mean, that’s true, and it could be a perfectly legitimate privacy issue, but it seems like an issue independent of training AI models. Like, facial recognition and such isn’t really new.

    Stable Diffusion or similar generative image AI stuff is pretty much the last concern I’d have over a photo of me. I’d be concerned about things like:

    • Automated inference of me being associated with other people based on facial or other recognition of us together in photos.

    • Automated tracking using recognition in video. I could totally see someone like Facebook or Google, with a huge image library, offering store owners a service to automatically flag potential shoplifters, if they’re allowed to run recognition on in-store camera footage. Once you start connecting cameras and doing recognition, you could do mass surveillance of a whole society.
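    The association-inference worry above is mechanically simple, which is part of why it’s concerning. A minimal sketch (all names and photo data here are made up, and a real recognizer would output match IDs rather than names): once a recognizer has tagged who appears in each photo, inferring who knows whom is just pair counting.

    ```python
    from collections import Counter
    from itertools import combinations

    # Hypothetical per-photo recognizer output: the set of person IDs
    # matched in each image (entirely invented example data).
    photos = [
        {"alice", "bob"},
        {"alice", "bob", "carol"},
        {"bob", "carol"},
        {"alice", "dave"},
    ]

    def cooccurrence_counts(photos):
        """Count how often each pair of people appears in the same photo."""
        counts = Counter()
        for people in photos:
            for pair in combinations(sorted(people), 2):
                counts[pair] += 1
        return counts

    counts = cooccurrence_counts(photos)
    # The most frequent pairs are the strongest inferred associations.
    print(counts.most_common(2))
    ```

    Scale that over billions of photos and the pair counts become a social graph, built without anyone ever declaring a connection.
    
    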

    • I’m not really super-enthusiastic about use of fingerprint data for biometrics, since I’ve got no idea how far that data travels. Not the end of the world, probably, but if you’ve been using, say, Google’s or Apple’s automated fingerprint unlock, I don’t know whether they have enough data to forge a thumbprint and authenticate as you somewhere else. A fingerprint is a non-revocable credential.