[–] aes_256@lemmy.world 2 points 1 year ago (1 children)

Copilot was trained on copylefted code while itself being closed. What was brought to attention by @ralC@lemmy.fmhy.ml isn't efficacy, but Microsoft's lack of ethics and social responsibility when it comes to their bottom line.

[–] tool@r.rosettast0ned.com 0 points 1 year ago (1 children)

> Copilot was trained on copylefted code while itself being closed. What was brought to attention by @ralC@lemmy.fmhy.ml isn't efficacy, but Microsoft's lack of ethics and social responsibility when it comes to their bottom line.

I honestly don't have a problem with that. Everything it was trained on is publicly available, open-source code, and I'm not aware of any license that requires you to release your modifications if you don't distribute the modified binaries in the first place, not even the GPL. And even then, you're only required to provide the corresponding source for what you actually distribute, not for unrelated code. I don't think that situation even applies here, since nothing was modified or redistributed; the code was just ingested as training data. Copilot read a book; it didn't steal a book from the library and sell it with its name pasted over the original author's.

This really isn't any different from a closed-source Android app using OpenSSL or libcurl or whatever. Just because those open-source libraries were used to build the app doesn't mean the developer has to release the app's source, and it doesn't make them a bad person for trying to make money selling it. Even Stallman is on board with selling software.

And even if you take all of that off the table, you're free to do the exact same thing and build a competitor. Microsoft didn't even make their own language model; they're using a commercially available model developed by OpenAI. There's literally nothing stopping anyone else from doing the same thing, calling it "Programming Pal", and releasing the code as open source. In fact, it's already been done with FauxPilot and CodeGeex and the like.

So yeah, I really don't have a problem with it. This ended up a lot longer than I had originally thought it would, sorry for the novel.

[–] aes_256@lemmy.world 3 points 1 year ago* (last edited 1 year ago) (1 children)

I'm not going to reinvent the wheel here when people far more invested in the topic than I am, including the Software Freedom Conservancy, have written detailed papers laying out different perspectives on the legal and moral implications of Copilot and its business model. There's also currently a class-action lawsuit against GitHub over the service.

[–] tool@r.rosettast0ned.com 2 points 1 year ago

Yep. I'm not making a proclamation, just stating an opinion. I don't have a problem with what they're doing, and if other people do, that's fine. Some people like their cucumbers pickled; let them have their pickle.

I actually wouldn't be surprised to see it go open source in the future; Microsoft has been doing that a lot recently, with VS Code, the whole of .NET, and friends like PowerShell. Pretty much the only worthwhile things from Microsoft are already open source, except Copilot.