If you pay for Copilot Business/Enterprise, they actually offer IP indemnification and support in court, if needed, which is more accountability than you would get from human contributors.
https://resources.github.com/learn/pathways/copilot/essentia...
I think the fact that they felt the need to offer such a service says everything: it's basically an admission that LLMs plagiarize and violate licenses.
Nine lines of code came close to costing Google $8.8 billion. How much use do you think these indemnification clauses will be if training ends up being ruled not to be fair use?
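For anyone who hasn't seen them: that's a reference to Oracle v. Google, and the nine lines at issue were the rangeCheck helper from java.util.Arrays. Reproduced here from memory, roughly as it appeared in the case, not verbatim from the court record:

    private static void rangeCheck(int arrayLen, int fromIndex, int toIndex) {
        if (fromIndex > toIndex)
            throw new IllegalArgumentException("fromIndex(" + fromIndex +
                    ") > toIndex(" + toIndex + ")");
        if (fromIndex < 0)
            throw new ArrayIndexOutOfBoundsException(fromIndex);
        if (toIndex > arrayLen)
            throw new ArrayIndexOutOfBoundsException(toIndex);
    }

A trivial bounds check, which is exactly the point: code this generic is what billions in claimed damages hinged on.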
Are you concerned that this will bankrupt Microsoft?
I think they're afraid they will have to sue Microsoft to get them to abide by the promise to come to their defense in another suit.
It'd be nice, wouldn't it? Poetic justice for a company founded on the idea of not stealing software.
Does that cover any random contribution that claims to be AI-generated?
Their docs say:
> If any suggestion made by GitHub Copilot is challenged as infringing on third-party intellectual property (IP) rights, our contractual terms are designed to shield you.
I'm not actually aware of a situation where this was needed, but I assume MS would have some tools to check whether a given suggestion was, or is likely to have been, generated by Copilot rather than by some other AI.