In June, we introduced the topic of chatbots and highlighted some key risks and concerns associated with this growing area of technology. One business in particular, DoNotPay, recently made headlines by announcing that it would let anyone build legal chatbots for free.

The claim? In a July 14, 2017, post on the online publishing platform Medium, Joshua Browder, founder of UK-based DoNotPay, writes, “Starting today, any lawyer, activist, student or charity can create a bot with no technical knowledge in minutes. It is completely free.” Sound too good to be true? To be sure, DoNotPay is not the first company to develop law-related chatbots; these bots are already popping up all over the world. But because the technology is still fairly new, chatbots that attempt to automate services previously performed by licensed attorneys will almost certainly attract scrutiny.

While platforms such as DoNotPay and ChatFuel seek to lower barriers to entry by providing free tools to get a chatbot up and running, corporations and attorneys who want to bring these chatbots into the realm of legal services could face costly legal exposure. Before submitting your idea for automating legal services, stop and consider a few threshold questions:

  1. Who owns “your” bot? Property rights in the digital realm can be murky, especially when a user’s original content is mixed with a digital platform’s intellectual property. Previously on the blog we have discussed US IP ownership concerns as they relate to APIs, Instagram, and Facebook “likes.” Generally, the platform’s terms and conditions will address ownership rights and define the relationship between platform and user. Currently, DoNotPay does not appear to require that a user agree to any particular terms and conditions when submitting an idea for a chatbot. In the July 14 Medium post, Browder asks and answers the ownership question this way: “Who owns the rights to the document? You do. However, you give us permission to make a bot for you and send the link.” Would this simple statement be sufficient under US law to allow you to claim a copyright interest in the software code that DoNotPay generates? The answer is far from certain. What if a DoNotPay employee took your submission, developed it into a commercial product, and then monetized that product without your consent or involvement? Would you have any recourse? Again, there is very little certainty.

  2. Once the bot is up and running, who will own the input from users? Legal chatbots may need to collect sensitive personal data from their intended users. If so, who owns (and who is legally responsible for protecting) the personal information the bot collects? Privacy compliance should be a serious concern, as this is a heavily regulated area fraught with liability. In today’s digital economy, “big data” and the “internet of things” are valuable commodities. Will the platforms claim any right to profit from the information these bots collect? Are those arrangements being properly disclosed to the intended users so that consumers understand how their personal information is being used? These issues should be carefully addressed with the platform host up front. (For a concrete picture of how such a bot handles user data, see the first sketch after this list.)

  3. Does this chatbot put you at risk for the unauthorized practice of law? In the legal realm, chatbots are sometimes praised for bringing efficiency to routine legal documents assembled from a limited number of variables supplied by the client. But this kind of legal document production could violate a jurisdiction’s unauthorized practice of law (“UPL”) rules. Most, if not all, jurisdictions require legal service providers to be licensed, but what constitutes UPL varies greatly from jurisdiction to jurisdiction. For instance, a jurisdiction could provide that a deed conveying an interest in real property may be prepared only by an attorney licensed in that jurisdiction. In the U.S., the American Bar Association reported in 2012 that 23 states were actively enforcing UPL rules. Courts have also expressed skepticism toward attempts to automate the generation of legal documents. See, e.g., Janson v. LegalZoom.com, Inc., 802 F. Supp. 2d 1053, 1065 (W.D. Mo. 2011) (“A computer sitting at a desk in California cannot prepare a legal document without a human programming it to fill in the document using legal principles derived from Missouri law that are selected for the customer based on the information provided by the customer.”). Broadly speaking, the more interactive the service in question, the higher the risk that the service may run afoul of UPL rules. And that sort of personalized interaction is precisely what chatbot technology aims to deliver; the second sketch below illustrates the fill-in-the-blank document assembly at issue.
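
To make the data-ownership concern in question 2 concrete, here is a minimal, hypothetical sketch (in Python) of a chatbot intake flow. The questions, field names, and log file are illustrative assumptions of ours, not any real platform’s design; the point is simply that every answer a user types ends up stored somewhere, and someone must own, secure, and properly disclose the use of that data.

```python
# Hypothetical chatbot intake flow. Field names, prompts, and the storage
# step are invented for illustration, not taken from any real platform.
import json
from datetime import datetime, timezone

INTAKE_QUESTIONS = {
    "full_name": "What is your full legal name?",
    "street_address": "What is your current street address?",
    "dispute_details": "Briefly describe your dispute.",
}

def run_intake() -> dict:
    """Ask each intake question in turn and collect the user's answers."""
    record = {"collected_at": datetime.now(timezone.utc).isoformat()}
    for field, prompt in INTAKE_QUESTIONS.items():
        record[field] = input(prompt + " ")
    return record

def store_response(record: dict, path: str = "intake_log.jsonl") -> None:
    """Append the answers to a log file: sensitive personal data is now
    at rest on someone's server, raising the ownership and protection
    questions discussed above."""
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")

if __name__ == "__main__":
    store_response(run_intake())
```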

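To illustrate the kind of “fill in the document” automation the Janson court described in question 3, here is a similarly hypothetical sketch of template-driven document assembly. The form language and field names are invented for illustration, and real conveyancing requirements vary by jurisdiction, which is exactly why UPL rules come into play.

```python
# Hypothetical fill-in-the-blank document assembly. The form text below is
# invented; actual deed requirements differ from jurisdiction to jurisdiction.
from string import Template

DEED_TEMPLATE = Template(
    "QUITCLAIM DEED\n\n"
    "${grantor}, Grantor, hereby quitclaims to ${grantee}, Grantee,\n"
    "all interest in the real property located at ${property_address},\n"
    "for consideration of ${consideration}."
)

def generate_deed(answers: dict) -> str:
    """Fill the client's answers into the form text: the program, not a
    licensed attorney, is completing the legal document."""
    return DEED_TEMPLATE.substitute(answers)

if __name__ == "__main__":
    print(generate_deed({
        "grantor": "Jane Doe",
        "grantee": "John Roe",
        "property_address": "123 Main Street, Columbia, Missouri",
        "consideration": "ten dollars ($10.00)",
    }))
```
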
The questions raised in this post are just the tip of the iceberg. Chatbots certainly have the potential to be a major technological disrupter in the legal industry, but there remains a host of risks to consider before integrating this technology into practice.