I know it’s not even close to being there yet. It can tell you to kill yourself or to kill a president. But what about when I finish school in like 7 years? Who would pay for a therapist or a psychologist when you can just ask a floating head on your computer for help?

You might think this is a stupid and irrational question. “There is no way AI will do psychology well, ever.” But I think in today’s day and age it’s pretty fair to ask when you are deciding about your future.

  • Evilschnuff@feddit.de · 1 year ago

    The term embodiment is kinda loose. The way I use it, it means an AI learning about the world through a body, with all the capabilities and social implications that come with that. What you are saying is simply not possible yet. We don’t have stable lifelong learning. We don’t even have stable humanoid walking, even if Boston Dynamics looks advanced. Maybe in the next 20 years, but my point stands. Humans are very good at detecting minuscule differences in others, and robots won’t get the benefit of “growing up” in society as one of us. That means an advanced AI won’t be able to connect on the same level, since it doesn’t share the same experiences. Even human therapists don’t match every patient; people usually search for a fitting therapist. An AI will be a worse fit.

    • intensely_human@lemm.ee · 1 year ago

      “We don’t have stable lifelong learning yet”

      I covered that with the long-term memory structure bolted onto an LLM; a rough sketch of what I mean is below.

      The only problem we’d have is a delay in response on the part of the robot during conversations.
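
      By that I mean something like retrieval: store past exchanges outside the model and paste the relevant ones back into the prompt on each turn. A rough, purely illustrative sketch (all class and function names are made up, and the word-overlap scoring is just a stand-in for a real embedding search):

      ```python
      # Hypothetical external "long-term memory" layer for an LLM: the model's
      # weights never change; remembered text is simply retrieved and prepended
      # to the next prompt.
      from collections import Counter

      class ConversationMemory:
          def __init__(self):
              self.entries = []                      # stored past exchanges

          def remember(self, text: str) -> None:
              self.entries.append(text)

          def recall(self, query: str, k: int = 2) -> list[str]:
              # crude relevance score: word overlap between query and entry
              q = Counter(query.lower().split())
              scored = [(sum((Counter(e.lower().split()) & q).values()), e)
                        for e in self.entries]
              scored.sort(reverse=True)
              return [e for score, e in scored[:k] if score > 0]

      memory = ConversationMemory()
      memory.remember("Patient mentioned trouble sleeping since March.")
      memory.remember("Patient's goal: reduce anxiety before exams.")

      # Retrieved snippets get prepended to the prompt; the model itself is untouched.
      context = "\n".join(memory.recall("How is my sleep plan working out?"))
      prompt = context + "\nUser: How is my sleep plan working out?"
      ```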

      • Evilschnuff@feddit.de · 1 year ago

        LLMs don’t have live long-term learning. Their weights are frozen and can only be fine-tuned manually, offline. Everything else is input and feedback tokens in the context window; those are processed by the frozen weights, so nothing is learned long-term. That is short-term memory only.
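
        To make the distinction concrete, here is a minimal toy sketch (PyTorch, a tiny stand-in model rather than a real LLM): running the model on input tokens changes no parameters at all; only an explicit fine-tuning step does.

        ```python
        # Toy illustration of "frozen weights": inference writes nothing back into
        # the model, so any memory lives only in the tokens you feed it.
        import torch
        import torch.nn as nn

        model = nn.Linear(8, 8)          # stand-in for an LLM's frozen weights
        tokens = torch.randn(1, 8)       # stand-in for prompt / context tokens

        before = model.weight.clone()

        # "Short-term memory": running the model on input changes no parameters.
        with torch.no_grad():
            _ = model(tokens)
        assert torch.equal(before, model.weight)      # weights untouched by inference

        # "Long-term learning" only happens if someone deliberately fine-tunes:
        optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
        loss = model(tokens).pow(2).mean()
        loss.backward()
        optimizer.step()
        assert not torch.equal(before, model.weight)  # weights changed only now
        ```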