(T/n: pic not included. Backstory here)
Because they are using it for s*xual harassment, yeah
I bet South Korea is just a ticking time bomb before our celebrities get s*xually harassed with AI edits too. People are using AI so recklessly
1. Just the deepfake edits that people in our country are making in the dark are such a sight
2. There are also edits of Biden… to dissuade people from voting for him…
3. Korea has always had a severe problem with deepfakes…ㅠ
4. Is there an AI detection system… As science advances, there should be technology that can counter it
5. I feel like this issue is already so severe within Korea… just looking up on Twitter makes me wanna vomit
6. Legal regulations on AI-related copyrights aren't properly established yet, so if voice distortion and editing software become culturally normalized before proper laws are in place, it will eventually take a major incident to prompt legislationㅠ This is seriously a big problem. There are already societal issues arising because of deepfakes, and at the rate technology is evolving, the government just cannot catch up…
7. I think we need regulations fast, but the members of the National Assembly are all so old that I bet they don't even know what AI is
8. Even K-pop has so many deepfakes already…