A Concrete Example (Destiny)

by EffortlessFury @, Wednesday, February 08, 2023, 09:32 (436 days ago) @ EffortlessFury
edited by EffortlessFury, Wednesday, February 08, 2023, 09:35

GitHub Copilot is an AI assistant that suggests code as you write, attempting to infer and complete whatever you appear to be working toward. Source code is distributed under a license (and the absence of a license, i.e. the default, is not permissive; it means all rights reserved). Different source code repositories can therefore be reused to varying degrees, depending on the conditions their licenses impose.
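
To make the interaction concrete, here is a hypothetical sketch of the workflow: the developer types only the comment and the function signature, and Copilot proposes the body. Everything below is illustrative, not a captured suggestion.

```c
#include <stddef.h>
#include <stdio.h>

/* Return the index of `target` in the sorted array `a`, or -1 if absent. */
int binary_search(const int *a, size_t n, int target)
{
    size_t lo = 0, hi = n;
    while (lo < hi) {
        size_t mid = lo + (hi - lo) / 2;  /* midpoint without overflowing lo + hi */
        if (a[mid] == target)
            return (int)mid;
        else if (a[mid] < target)
            lo = mid + 1;                 /* search the upper half */
        else
            hi = mid;                     /* search the lower half */
    }
    return -1;                            /* not found */
}

int main(void)
{
    int primes[] = {2, 3, 5, 7, 11, 13};
    printf("%d\n", binary_search(primes, 6, 7)); /* prints 3 */
    return 0;
}
```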

The major problem is that GitHub Copilot was trained on repos under a wide range of licenses, each granting different reuse permissions. Yet Copilot can be used to develop software under licenses that are less permissive than, or simply incompatible with, those of the repos it was trained on.

Now, let's step back for a moment to the human factor. I, a developer, can look at the source code of a project and observe how it solves a problem. Assuming a restrictive license, I cannot reuse that code verbatim, but I can attempt to reimplement the same general approach in my own way. What's the distinction? That is, by and large, considered a grey area. Because the number of ways to solve a problem, especially efficiently, is limited, there's not much room for reinventing the wheel; multiple independent implementations of a solution to the same problem will likely look similar (see the sketch below). This transformative reuse will therefore happen, and is ultimately an unavoidable byproduct of open-sourcing code.
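
As a hypothetical illustration of that convergence, imagine two developers who have never seen each other's code, each asked to clamp a value into a range. The names are invented; the point is how little room the problem leaves for the solutions to differ.

```c
/* Developer A's version. */
int clamp_a(int value, int lo, int hi)
{
    if (value < lo) return lo;
    if (value > hi) return hi;
    return value;
}

/* Developer B's version, written independently. */
int clamp_b(int x, int min, int max)
{
    if (x < min) return min;
    if (x > max) return max;
    return x;
}
```

Neither copied the other, yet it would be hard to distinguish this from genuine copying after the fact.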

Bringing this back to Copilot: Copilot does not cite the sources of its suggestions. How could it, after all? It is an amalgamation of everything it has been trained on; it no longer has the context of where a particular solution came from. Yet Copilot has been shown to regurgitate entire sections of code from specific repos, so it is not always creating something original. Were a developer to take one of these larger suggestions and reproduce it by manually copying from the repo, that would be copyright infringement. However, because Copilot scrubs the context from its suggestions, the same act becomes much harder to prove and enforce.
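
The most widely reported demonstration of this involved the fast inverse square root from the GPL-2.0-licensed Quake III Arena source, which Copilot was shown suggesting nearly verbatim, distinctive comments and magic constant included. It is reproduced below (with the original's profanity softened) as an example of recognizable, license-encumbered code resurfacing in a suggestion:

```c
float Q_rsqrt(float number)
{
    long i;
    float x2, y;
    const float threehalfs = 1.5F;

    x2 = number * 0.5F;
    y  = number;
    i  = *(long *)&y;                       /* evil floating point bit level hacking */
    i  = 0x5f3759df - (i >> 1);             /* what the...? */
    y  = *(float *)&i;
    y  = y * (threehalfs - (x2 * y * y));   /* 1st iteration */

    return y;
}
```

Nothing about the suggestion tells the developer that this code carries GPL obligations.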

At the end of the day, Copilot is essentially money laundering for code. It takes what would be illegal reuse and makes it "clean." The entire reason the aforementioned transformative reuse is generally acceptable is that a human assesses the license of the source code, understands the conditions under which it can be used, and properly reuses and/or transforms it based upon those conditions. A human must consider the legal and ethical ramifications of that reuse. A trained AI does not.

That is why there is a pending lawsuit against Copilot. While the potential products of AI generators vary in how transformative they are, depending on the type of product and the training set, it all exists in this grey area because the AI is not consciously aware of the ethics of what it is doing. There is no human element making a judgment call on how far to take the transformative work, and no one to hold accountable if the product would be considered unethical had a human done it themselves.

