ChatGPT3 is a chat engine that fulfils the promise of AI-based conversation: users ask a question (a prompt) and it answers in a reasonable manner. Its coding-related skills are especially impressive: informal testing shows that it is difficult to find simple programming questions that ChatGPT3 cannot answer properly. Some students are certainly already using it to answer programming assignments. This article studies whether it is safe for students to use ChatGPT3 to answer coding assignments ("safe" meaning that they will not be caught for plagiarism). The main result is that it is generally not safe. We evaluated the safety of code generated with ChatGPT3 in two ways: by running Codequiry, a plagiarism detection tool, and by searching for the generated code on Google (considering only the first page of results). In 38% of the cases, Codequiry finds a piece of existing code that the ChatGPT3 answer partially copies. In 96% of the cases, the Google search finds a piece of code very similar to the generated code. Overall, it is not safe for students to use ChatGPT3 in 96% of the cases.
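The core comparison step, judging whether a ChatGPT3 answer is "very similar" to a snippet found online, can be sketched as a textual similarity check. This is a minimal illustration, not the study's actual tooling (the study relied on Codequiry and manual inspection of Google results); the `similarity` function and the 0.8 threshold are assumptions for demonstration purposes.

```python
import difflib

def similarity(generated: str, candidate: str) -> float:
    """Return a similarity ratio in [0, 1] between two code snippets.

    Whitespace is normalized first so that pure formatting
    differences do not dominate the comparison.
    """
    a = " ".join(generated.split())
    b = " ".join(candidate.split())
    return difflib.SequenceMatcher(None, a, b).ratio()

# Hypothetical example: a generated answer vs. a snippet found online.
generated = "def fib(n):\n    return n if n < 2 else fib(n-1) + fib(n-2)"
candidate = "def fib(n):\n    return n if n < 2 else fib(n - 1) + fib(n - 2)"

# An assumed threshold above which a plagiarism checker would flag the pair.
SUSPICIOUS = 0.8
print(similarity(generated, candidate) > SUSPICIOUS)
```

In practice, dedicated tools such as Codequiry go beyond raw text similarity (e.g., token-level and structural matching), so this sketch only conveys the general idea of the evaluation.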