Glaze anti-AI image theft tool sees rush of artists following Meta’s plans

Glaze, a free tool designed to protect artists from having their style copied by AI image generators, is experiencing a dramatic surge in demand. The tool adds imperceptible noise to images that prevents AI systems from imitating an artist's style. According to Ben Zhao, the developer of Glaze, a large backlog of access requests for WebGlaze, the web-based version, has built up since Meta announced plans to use user data for AI training. Artists sometimes have to wait weeks or months for access, because the Glaze Project manually reviews each application to verify that applicants are real people and that the tools are not being abused. At the same time, security researchers have found a way to bypass Glaze's protection. Although Zhao and his team have made changes to make the attack more difficult, it still calls the effectiveness of Glaze into question, especially since the team behind the attack has criticized the changes as insufficient.

Max is managing editor at THE DECODER. As a trained philosopher, he deals with consciousness, AI, and the question of whether machines can really think or just pretend to.
