- January 8, 2024
- Posted by: Larry Walsh
- Category: Blogs
Vendors and partners want the benefits of using artificial intelligence in customer support, but how that information gets sourced is a murky issue.
By Larry Walsh
The world marveled at the rapid advances in artificial intelligence in 2023. The release of ChatGPT and the proliferation of AI-driven applications and features fueled imaginations about how this technology could transform numerous industries.
AI is useful, but it’s not independently intelligent. It requires base material to train its algorithms and develop a body of knowledge (such as it is) to create its responses. Where that material comes from is now an open question.
Last year, authors John Grisham, George R.R. Martin, and Jonathan Franzen, along with comedian Sarah Silverman, sued OpenAI (the maker of ChatGPT) and other technology vendors for violating their copyrights by using their material to train large language models (LLMs).
Over the holidays, The New York Times sued OpenAI and Microsoft for using the newspaper’s content to train chatbots. The lawsuit, which seeks to stop the practice and potentially earn compensation for alleged past violations, asserts that the AI tools offered by the companies are competing with the producers of the material they were trained on.
In the channel, vendors and partners are adopting AI tools to augment their customer service and technical support capabilities. Rather than having an army of expensive, highly trained technicians, companies are using chatbots as the front line of customer interaction. Through these online tools, customers can easily gain answers to most technical questions. For higher levels of support, the same tools are making even novice technicians effective at solving problems.
A study by Stanford University’s Digital Economy Lab and the Massachusetts Institute of Technology found that customer support agents using generative AI are 14% more productive, on average, than those working without it. A report by Zendesk, a provider of support ticketing applications, found that customers using its AI applications see 30% faster problem resolution times.
Vendors make copious product and technical materials available to partners through their channel programs. Much of this material is publicly available on the Internet; more advanced and specific material sits behind gated partner portals. Vendors publish guidelines and limitations on the use of their brands and marketing materials through such vehicles as terms of service and partnership agreements. However, the actual allowable uses aren’t always clear, particularly to AI developers and trainers several steps removed from partner programs.
The high-profile copyright lawsuit of Oracle v. SAP illustrates the potential perils when third parties access proprietary content without clear permission. In 2007, Oracle sued SAP, alleging that its Texas-based subsidiary, TomorrowNow, used customer credentials to systematically download volumes of technical manuals and support documents from Oracle’s password-protected support portal.
Oracle claimed TomorrowNow lacked licenses for some of the applications it downloaded. Though the subsidiary had some legitimate access, SAP ultimately admitted to violations. After seven years of litigation, SAP paid Oracle $358 million in damages in 2014.
This case highlights the liabilities that can arise when IP rights between vendors and partners are murky, and it underscores the need for clear guidelines as technologies like AI push partners to pool vendor-sourced data. Vendors reasonably want to protect their intellectual property, safeguard their competitive advantages, and maintain compliance. Partners, meanwhile, need assurances that they can leverage vendor information without inviting lawsuits, along with a clear understanding of what usage crosses the line. A lack of clarity can lead to legal conflicts or, out of an abundance of caution, to partners avoiding AI applications that would otherwise benefit vendors.
The path forward lies not in assumptions or accusations but in transparent discussion of limitations that protect intellectual property while continuing to encourage partner innovation through AI. Vendors should review their terms and conditions on the use of their copyrighted materials and provide partners with the guidelines and limitations needed to protect everyone’s interests. At the same time, they need to enable the use of AI to improve customer support. The last thing the industry needs is a proliferation of lawsuits like the one The New York Times has filed.
Larry Walsh is the CEO, chief analyst, and founder of Channelnomics. He’s an expert on the development and execution of channel programs, disruptive sales models, and growth strategies for companies worldwide. Follow him on Twitter at @lmwalsh_CN.