Adobe responds to controversy over terms of use, saying it doesn’t spy on users

Adobe has released a new blog post explaining the changes to its terms of use, when Adobe applications can access a user’s content, and whether a user’s content will be used to train and improve its artificial intelligence (AI) models and services.

The need for clarification arose after numerous users, including some established creative professionals, received pop-up notifications in Adobe applications stating, among other things, that Adobe could access user content through automated and manual methods. The resulting anger among the creative community is easy to understand.

The pop-up, which required acceptance before a person could continue using Adobe software, did not explain exactly what had been updated in the terms of use or how Adobe can gain access to an individual’s content. Adobe’s opacity left the door open to speculation, confusion, and fear.

One of the most prominent concerns was that Adobe would essentially spy on someone’s work. That is worrying in general, but especially for those who use Adobe software to create work on projects under NDA. Given Adobe’s prominent position in the creative software segment, this fear affects many people.

There were also concerns about Adobe claiming ownership of someone’s work, a murkier issue given the way some Adobe Creative Cloud services operate. No, Adobe does not ultimately own work that someone creates in Creative Cloud apps or uploads to Adobe platforms, but Adobe must have some form of license to provide specific services. Fairly standard legal language tucked away in terms of use and end-user license agreements, even though it is common to many services with asset-uploading and sharing tools, can seem daunting to those who read it.

It wouldn’t be an Adobe controversy without people wondering if Adobe trains its Firefly AI using customer content, so those understandable concerns cropped up again.

“We recently updated our terms of use with the aim of providing more clarity in a few specific areas and have pushed out a routine re-acceptance of those terms to Adobe Creative Cloud and Document Cloud customers. We have received a number of questions regarding this update and want to provide some clarity,” Adobe writes in its new blog post. “We remain committed to transparency, protecting the rights of creators and enabling our customers to do their best work.”

As mentioned in the pop-up that caused the outrage this week, Adobe has updated the language in sections two and four of the terms of use. The precise changes Adobe made are detailed in the blog post, but the most significant revisions concern sections stipulating that Adobe may access, view, or listen to user content in “limited ways and only as permitted by law.” Reasons for this include responding to customer feedback and support requests, identifying and preventing legal and technical issues, and enforcing content terms, such as those prohibiting the use of Adobe software to create child sexual abuse material (CSAM).

Adobe provides more details about its content moderation policies on a separate section of its website dedicated to transparency.


“For the avoidance of doubt, Adobe requires a limited license to access Content solely for the purpose of operating or improving the Services and Software and to enforce our terms and comply with law, such as protection against unlawful content,” Adobe continues.

The company outlines three instances in which Adobe applications and services can access user content. These include when access is required to provide essential services and functions, such as when opening and editing files for the user or creating thumbnails or previews for sharing.

Access is also required to provide certain cloud-based features, including Photoshop’s Neural Filters, Liquid Mode, and Remove Background. People can learn more about how content is viewed and analyzed in these cases in Adobe’s Content Analysis FAQ. For those working on sensitive, confidential material, it may be worth considering the limited situations in which Adobe is able to view that content, including by real people.

Finally, Adobe may have access to content that is processed or stored on Adobe servers. In these cases, Adobe may, automatically or with the help of humans, screen for certain types of illegal content (such as CSAM).

Adobe reaffirms that it “does not train Firefly Gen AI models on customer content.” Firefly is trained on licensed content, such as Adobe Stock media, and on public domain content.

Further, Adobe says it will “never take ownership of a customer’s work.”

“Adobe hosts content so that customers can use our applications and services,” the technology giant explains. “Customers own their content and Adobe does not assume any ownership of customer work.”

“We appreciate our customers who have contacted us to ask these questions, which has given us the opportunity to clarify our terms and our obligations. We will be clarifying the terms of use acceptance that customers see when opening applications,” Adobe concludes.

Hopefully these changes reach customers sooner rather than later, because it is easy to understand how this situation escalated so quickly. Without the context that Adobe failed to include in its pop-up message, some standard terms of use seemed anything but.


Image credits: Adobe
