This article explores key differences to consider when negotiating for the use of generative AI software platforms as compared with more traditional software offerings. For the sake of providing clear examples, this article compares the negotiation of (i) generative AI software platforms (whether SaaS-based or install-based) and (ii) SaaS and install-based commercial off-the-shelf (COTS) platforms utilizing no AI (or only limited, non-generative AI models).
Software-as-a-service (SaaS) and other software platforms powered by artificial intelligence (AI) are increasingly indispensable tools for businesses seeking to enhance and streamline their operations and services. Basic machine learning, natural language processing, and other AI algorithms (such as those powering text editors, autocorrect, customer service chatbots, search engine optimization, and inventory management systems) are commonplace in modern software. As a result, businesses often rely on traditional software licensing principles to negotiate AI software license agreements, without giving much thought to AI-specific issues. While this approach may still work for some software platforms on the market, software platforms leveraging generative AI (that is, AI capable of generating new content like text, images, audio, or other media) require unique considerations.
Traditional SaaS and COTS Platforms: SaaS and COTS platforms are often standardized products with limited customization options. Unless a customer has a specific business need (and enough leverage) to obtain its own instance of a SaaS platform or a customized version of an otherwise standard COTS platform, all updates made by the provider will be incorporated into the version of the platform accessed by every user of the platform. Some SaaS and COTS platforms include functionality allowing users to configure the software within predefined parameters, but substantial customization is usually not possible. Providers of SaaS and COTS platforms also tend to offer standardized services, support, and uptime commitments, with minimal room for material alterations for any mature SaaS or COTS platform.
Generative AI: Unlike most SaaS platforms, customization is a staple of many generative AI products. AI providers that offer customization can also work closely with users to adapt their AI models, algorithms, and data processing techniques to address specific business challenges or use cases, or to develop unique AI software for a particular organization. As such, customization can be a key issue when negotiating AI software contracts, as the contract needs to align with the unique requirements and objectives of the user. Where appropriate, this may include defining the inputs that can be utilized by an AI model, with many users limiting the software’s ingestion of data and other materials to the user’s own data and materials.
Traditional SaaS and COTS: IP ownership provisions in SaaS and COTS contracts are typically limited to the software itself. In a standard SaaS or COTS contract, the SaaS or COTS provider retains ownership of the underlying software and the user is granted a limited right to access and use the software, subject to standard use restrictions and other specific terms and conditions. The user typically retains ownership of its data, including any data it uploads or submits to the platform (although some SaaS or COTS providers seek ownership of or perpetual rights in user data, usually in an aggregated and/or deidentified format, for purposes of providing services to the user or improving products and services). IP rights related to any data generated or processed by SaaS or COTS platforms may vary and are often explicitly addressed in the contract.
Generative AI: IP and data ownership in generative AI software contracts can be more complex. In addition to ownership of the underlying software, contracts covering the use of generative AI software should include provisions allocating ownership of any AI-generated content (including copyrightable material like text, images, and software). Depending on the business deal, it may also be appropriate for the agreement to define ownership of algorithms, custom developments, and the data used to train AI models. For example, if generative AI software has been fed or trained using private data (such as a proprietary provider or user database), the agreement should explicitly define ownership of such data. Negotiating appropriate IP and data ownership terms is critical in generative AI software contracts because, among other things, these terms (or the lack thereof) can impact the user's ability to use and further develop AI-generated assets. However, IP ownership of AI-generated content under United States law is still in flux. Even if a user is able to negotiate ownership of AI-generated content, such ownership may be effective only as between the contracting parties, and enforcement of such ownership against a third party may not be permitted under current case law.
Traditional SaaS and COTS: SaaS and COTS platforms may include, use, or make available third-party IP or technology. In a standard SaaS or COTS contract, the provider is generally responsible for ensuring its SaaS or COTS platforms are non-infringing, compliant with law (including data security and privacy law), and free from viruses and other harmful code. However, the extent of the provider’s warranties and obligations around these issues can vary based on the user’s negotiating leverage.
Generative AI: Generative AI software may incorporate third-party IP, models, or datasets and produce content (including copyrightable material, such as text, images, designs, and software) based on those third-party materials. As such, generative AI software contracts must address third-party IP rights and the potential impact on usage, including licensing terms and usage rights, as well as appropriate warranties and indemnities. The generation of content based on third-party materials, in particular, presents a heightened risk of using generative AI, as the law regarding whether utilizing third-party works to train AI models constitutes “fair use” under U.S. copyright law is not yet settled. There can be significant risk that AI-generated content will infringe third-party IP, for example, if proper limitations are not established on training data sets. Furthermore, users may wish to establish quality controls for the data inputs into AI software, as poor-quality inputs will result in unreliable or incomplete AI models.
Traditional SaaS and COTS: SaaS and COTS contracts typically do not involve the same ethical considerations as AI software contracts, given the limitations built into the software architecture of any particular SaaS or COTS platform. SaaS and COTS platforms tend to rely solely on limited data sets input by a user, rather than on third-party data or IP that could introduce inherent biases into the inputs.
Generative AI: Ethical and responsible AI development and usage are pressing concerns, particularly as AI technologies evolve. Responsible AI usage will depend on the use case for the AI and on the related inputs that the AI models ingest and utilize to generate outputs, as the data sets utilized can contain inherent stereotypes or biases that would then be reflected in the resulting output. For example, if AI software is used to assist a human resources department in making hiring decisions, the human resources department must be aware of potential biases in the training data that could influence its hiring decisions and potentially result in violations of applicable employment laws. AI software contracts should establish guidelines to ensure compliance with evolving ethical standards and regulatory requirements.
Authored by David Toy, John Fraczek, and Thomas Petrie