this post was submitted on 03 Sep 2024
1570 points (97.7% liked)

Technology

[–] SankaraStone@lemmy.world 1 points 2 months ago (2 children)

Isn't copyright about the right to make and distribute or sell copies, or the lack thereof? As long as they can prevent jailbreaking the AI, reading copyrighted material and learning from it to produce something else is not a copyright violation.

[–] prototype_g2@lemmy.ml 3 points 2 months ago (1 children)

I don't think you understand exactly how these machines work. The machine does not "learn"; it does not extract meaning from the tokens it receives. Here is one way to look at it:

Suppose you have a sequence of symbols: ¹§ŋ¹§ŋ¹§ŋ¹§ŋ. You are then given a fragment of the sequence and asked to guess the most likely symbol to follow it: ¹§. Think you could do it? I'm sure you would have no trouble solving this example. But could you make a machine that could reliably accomplish this task, regardless of the sequence of symbols and regardless of the fragment given? Let's imagine you did manage to create such a marvellous machine.
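Such a machine can actually be sketched in a few lines. Here is a toy version (a hypothetical illustration for this comment, not how any real system is built) that simply counts which symbol most often follows each symbol:

```python
from collections import Counter, defaultdict

def train(sequence):
    # For each symbol, count how often each other symbol follows it.
    followers = defaultdict(Counter)
    for current, following in zip(sequence, sequence[1:]):
        followers[current][following] += 1
    return followers

def predict(followers, fragment):
    # Guess the most frequent follower of the fragment's last symbol.
    counts = followers.get(fragment[-1])
    return counts.most_common(1)[0][0] if counts else None

model = train("¹§ŋ¹§ŋ¹§ŋ¹§ŋ")
print(predict(model, "¹§"))  # prints "ŋ"
```

Given the fragment ¹§, it correctly guesses ŋ, because ŋ followed § every time in the training sequence.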

If given a large sequence of symbols spanning multiple books' worth of text, would you say this pattern-recognition machine is able to create anything original? No... because it is simply trying to copy its original sequence as closely as possible.

Another question: would this machine ever derive meaning from these symbols? No... How could it?

But what if I told you that these symbols weren't just symbols? Unbeknownst to the machine, each one of these symbols actually represents a word. Behold: ChatGPT.
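To make the reveal concrete: the exact same counting machine, fed words instead of symbols, already "writes" text (again a toy sketch; real language models use learned neural networks over enormous corpora, not literal bigram counts):

```python
from collections import Counter, defaultdict

def train(tokens):
    # Count, for each word, how often each other word follows it.
    followers = defaultdict(Counter)
    for current, following in zip(tokens, tokens[1:]):
        followers[current][following] += 1
    return followers

def generate(followers, start, steps=8):
    # Greedily extend the text with the most frequent follower each time.
    out = [start]
    while len(out) < steps:
        counts = followers.get(out[-1])
        if not counts:
            break
        out.append(counts.most_common(1)[0][0])
    return " ".join(out)

model = train("the cat sat on the mat and the cat ate".split())
print(generate(model, "the"))
```

The machine still has no idea what a "cat" is; it is just continuing the most familiar pattern of tokens.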

This is basically the general idea behind generative AI as far as I'm aware. Please correct me if I'm wrong. This is obviously oversimplified.

[–] SankaraStone@lemmy.world 1 points 2 months ago

Yeah, all training ends up being pattern learning in some form or fashion. But acceptable patterns end up matching logic. So, for example, if you ask ChatGPT a question, it will use the patterns it has learned to provide its estimate of the correct output. The pattern it has learned encompasses/matches logical processing of the user input and the output it's been trained to see as acceptable. So with enough training, it should and does go from simple memorization of individual examples to learning broad acceptable rules, like logic (or a pattern that matches logical rules and "understanding of language"), so that it can provide acceptable responses to situations it hasn't seen in training.

And because of this pattern-learning and prediction nature of how it works, it often "hallucinates" information like citations: creating a novel citation matching the patterns it's seen, instead of the exact citation you want, in cases where you actually want memorized information as a source for what it's telling you.
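That hallucinated-citation effect drops right out of the same toy bigram counter (a hypothetical illustration; the authors and journals below are made up): trained on a few plausible citations, the greedy pattern-follower stitches together a citation that was never in its training data.

```python
from collections import Counter, defaultdict

training_citations = [
    "Smith (2019) Journal of AI",
    "Jones (2021) Journal of ML",
    "Brown (2021) Journal of ML",
]

# Count, over all training citations, which token follows which.
followers = defaultdict(Counter)
for citation in training_citations:
    tokens = citation.split()
    for current, following in zip(tokens, tokens[1:]):
        followers[current][following] += 1

# Greedily follow the most frequent pattern, starting from "Smith".
out = ["Smith"]
while followers.get(out[-1]):
    out.append(followers[out[-1]].most_common(1)[0][0])
fabricated = " ".join(out)

print(fabricated)                        # prints "Smith (2019) Journal of ML"
print(fabricated in training_citations)  # False: a novel, wrong citation
```

Smith never published in the (fictional) Journal of ML, but "Journal of ML" is the more frequent continuation in the training data, so the pattern-follower confidently asserts it.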

[–] SankaraStone@lemmy.world 0 points 2 months ago

I'm less worried about a system that learns from information and then incorporates it when it has to provide an answer (e.g. learning facts) than I am about something that steals someone's likeness, something we've clearly established people have a right to (e.g. voice acting, action figures, and sports video games). And by that extension/logic, I am concerned whether AI that is trained to produce something in the style of someone else, especially in digital/visual art, also violates the likeness principle, and maybe even comes close to violating copyright law.

But at the same time, I'm a skeptic of software patents and API/UX copyrights. So I don't know. Shit gets complicated.

I still think AI should get rid of mundane, repetitive, boring tasks. But it shouldn't be eliminating creative, fun tasks. It should improve productivity without replacing or reducing the value of the labor of the scientist/artist/physician. If AI replaced scribes and transcriptionists in order to make doctors more productive and able to spend more time with patients instead of documenting everything, that would be the ideal use of this stuff.