

It looks like this is already in the plans: https://www.cbc.ca/news/canada/british-columbia/doctors-recruitment-1.7480911
I’m not European, but I understand there’s an old European (German?) saying that basically goes: “If I had wheels, I’d be a trolley.” My understanding is that it’s been pretty well established that AI coding tools routinely underperform humans in terms of “better” and “safer”, which would indirectly lead to them failing at “cheaper” too.
On top of that, there is another major issue with using AI for open-source code: copyright. First, you don’t know whether the code you’re adding through AI is copying license-incompatible code verbatim. Because everyone has access to open-source code, it would be trivial for anyone to search for copyright-infringing code and use it to attack a project. Second, the code that AI produces is also not copyrightable, which is another line of attack that this leaves open-source projects vulnerable to. These could be used together as a one-two punch to knock out an open-source project.
I think that using AI-generated code in open-source projects is a uniquely ill-advised idea.