There’s a website displaying the output of two machine learning programs facing off against each other: one a photo generator, the other a photo discriminator. Each algorithm improves using the output of the other.
The website itself displays a new image on each reload, and the image is completely computer generated.
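That generator-vs-discriminator setup is a generative adversarial network (GAN). A minimal sketch of the adversarial loop, on a toy 1-D problem rather than photos (the linear generator, logistic discriminator, learning rate, and target distribution here are all illustrative assumptions, not the site's actual models):

```python
import numpy as np

# Toy 1-D GAN sketch: generator G(z) = a*z + b, discriminator D(x) = sigmoid(w*x + c).
# "Real" data comes from N(4, 1); the generator learns to mimic it.

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

a, b = 1.0, 0.0   # generator parameters
w, c = 0.1, 0.0   # discriminator parameters
lr = 0.05

for step in range(2000):
    real = rng.normal(4.0, 1.0, 64)   # samples from the real distribution
    z = rng.normal(0.0, 1.0, 64)      # noise input to the generator
    fake = a * z + b                  # generator output

    # Discriminator update: push D(real) -> 1 and D(fake) -> 0.
    d_real, d_fake = sigmoid(w * real + c), sigmoid(w * fake + c)
    grad_w = np.mean((d_real - 1) * real) + np.mean(d_fake * fake)
    grad_c = np.mean(d_real - 1) + np.mean(d_fake)
    w -= lr * grad_w
    c -= lr * grad_c

    # Generator update (non-saturating loss): push D(fake) -> 1,
    # i.e. fool the discriminator, using its gradient as the training signal.
    d_fake = sigmoid(w * fake + c)
    grad_a = np.mean((d_fake - 1) * w * z)
    grad_b = np.mean((d_fake - 1) * w)
    a -= lr * grad_a
    b -= lr * grad_b

# After training, b should drift toward the real mean of 4.
print("learned generator mean:", round(b, 2))
```

Each side only ever sees the other's output, which is the "facing off" part: the discriminator's mistakes are exactly what the generator trains on.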
Well, I wasn’t sure until this lass popped up…
Worked in IT for years. AI is immensely interesting. In some ways it’s actually quite crap. Alexa, for example, is not much more than a natural language processor. All the clever stuff is still done by manual programming etc.
But there is some properly interesting stuff out there. Especially when big cloud providers are giving access to the tech at a very low price.
I’ve been messing about with AWS and IoT on a Raspberry Pi. 30 quid on hardware and a bit of programming and you can build a camera that can recognise faces, and when it can’t name them it can still tell you their sex, age range, and even their mood. Costs fuck all to process each picture.
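For the curious, the AWS side of that is Rekognition's `detect_faces` call. A sketch, not a drop-in script: the capture filename is hypothetical, the sample response below is trimmed and invented, and the commented call needs boto3 plus AWS credentials configured:

```python
# The call itself would look roughly like this (requires boto3 + AWS credentials):
#
#   import boto3
#   rekognition = boto3.client("rekognition")
#   with open("capture.jpg", "rb") as f:
#       response = rekognition.detect_faces(Image={"Bytes": f.read()},
#                                           Attributes=["ALL"])
#   faces = response["FaceDetails"]

def summarise_face(face):
    """Pull the age range, gender, and top emotion out of one FaceDetails entry."""
    age = face["AgeRange"]
    top_emotion = max(face["Emotions"], key=lambda e: e["Confidence"])
    return {
        "age_range": (age["Low"], age["High"]),
        "gender": face["Gender"]["Value"],
        "mood": top_emotion["Type"],
    }

# Invented example entry, trimmed to just the fields used above:
sample = {
    "AgeRange": {"Low": 26, "High": 38},
    "Gender": {"Value": "Female", "Confidence": 99.1},
    "Emotions": [
        {"Type": "HAPPY", "Confidence": 92.5},
        {"Type": "CALM", "Confidence": 5.1},
    ],
}

print(summarise_face(sample))
```

The per-image pricing is why it "costs fuck all": you pay per API call rather than running any model on the Pi itself.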
As long as the people are neurotypical white folk with gender conforming hair-styles, no tattoos, birth marks, piercings etc.
Dunno, it seems to handle a range of races, genders etc.
Its generation of children/teens is, well… a little creepy.
Yes, it’s not looking at long hair to decide if someone is female, it’s much more advanced than that - it uses the distances between certain points on a face.
I read a piece a while back, though, about how some tech had ingrained racism in it. A Kodak camera had a processor that would ask whether someone had their eyes closed in that picture and offer a retake - which it did every time someone took a picture of an Asian person. The sensors on some hand dryers don’t work with dark skin because they were calibrated on fair skin.
I think that the inherent bias in software coding is very real and an interesting challenge that needs addressing.
I’ve spoken to my wife (a doctor) about the racism ingrained in medical training. For example, doctors are trained to recognise skin conditions on fair skin and have much more limited information on the diseases more prevalent in minority communities.
Wanna buy some fake people?