
Image credit: geralt. Pixabay license.

By Frank Gallagher, NANPA Marketing, Communications & Blog Coordinator

Artificial intelligence (AI) applications are cropping up everywhere. Adobe has Sensei, the AI and machine learning technology that operates across Adobe products. Topaz Labs has a suite of photo processing products that use AI. But did you know that AI models are trained by examining millions of images scraped from the Internet and other sources? Or that your images could be among them? Or that parts or all of your image might be used in an image created by AI?

As we noted back in September, there is a lot of haziness around copyright and intellectual property rights in AI applications. Apps like Midjourney, DALL-E 2 and Stability AI’s Stable Diffusion are trained on enormous volumes of images found on the web, and those images may include copyright-protected work that happens to appear online. Until now, the AI companies have made no effort to determine whether the art they were scraping from the Internet was copyrighted. According to Digital Camera World, the CEO of Midjourney admitted that they don’t check the images they use for copyright. “We weren’t picky,” he said. So not only could your images be used to train AI models, and AI-created images devalue your photos, but elements of your images used in training might also end up incorporated into an AI-created work.

According to an article on Vice, one popular dataset (LAION-5B) frequently used for AI training contains more than five billion images, from photoshopped celebrities to pornography, and from medical imagery to nature and portrait photos. There is no good way to opt out of being included in large datasets like these.

So, what’s a photographer to do?

The Copyright Alliance’s January 2 newsletter links to a story on VentureBeat reporting that “Stable Diffusion will honor artists’ requests to opt out of the training of Stable Diffusion 3.” Photographers and artists can go to haveibeentrained.com, a website that makes it easy to see whether an image has been used to train AI apps. Ars Technica explored how this works.
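For the technically curious, the search behind a site like haveibeentrained.com is essentially a similarity lookup against the LAION index. Below is a rough sketch of that kind of query using the open-source clip-retrieval client; the endpoint URL and index name are assumptions and may have changed, so treat this as an illustration of the approach, not the site’s actual implementation.

```python
# Rough sketch: querying the public LAION index with the open-source
# clip-retrieval client (pip install clip-retrieval).
# The endpoint URL and index name below are assumptions and may have changed.
from clip_retrieval.clip_client import ClipClient, Modality

client = ClipClient(
    url="https://knn.laion.ai/knn-service",  # assumed public KNN endpoint
    indice_name="laion5B-L-14",              # assumed LAION-5B index name
    modality=Modality.IMAGE,                 # search against image embeddings
    num_images=20,                           # number of nearest matches to return
)

# Search by a local photo (a text query works too) and list the closest
# matches the index contains, with their source URLs and captions.
results = client.query(image="my_photo.jpg")
for r in results:
    print(r.get("similarity"), r.get("url"), r.get("caption"))
```

If one of your own photos comes back as a near-exact match, that is a strong hint it was swept into the dataset.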

Stability AI states that it’s not offering opt-outs for legal or ethical reasons but, rather, because there’s “no reason to particularly not to do it.” The company also believes that most people would eventually prefer to opt in rather than out. We’ll see.

It’s possible that new models for managing the big datasets used to train AI apps will emerge, but there’s no guarantee. There will be more to come as companies and individuals grapple with the issues AI creates.


Thanks to Sean Fitzgerald, co-chair of NANPA’s Advocacy Committee, who passed along this information from the Copyright Alliance.

NANPA’s advocacy

NANPA fights for the intellectual property rights of nature photographers through its Advocacy Committee and in collaboration with organizations like the Copyright Alliance and the Coalition of Visual Artists. Learn more about NANPA’s advocacy work here.

Frank Gallagher is a landscape and nature photographer based in the Washington, DC, area who specializes in providing a wide range of photographic services to nonprofit organizations. He serves as NANPA’s Interim Marketing and Communications Coordinator and manages NANPA’s blog. He can be found online at frankgallagherphotography.com or on Instagram @frankgallagherfoto.