Thread: Mods have problems with humans. What about an EI robot chatbox?


    Quote Originally Posted by SteveB View Post
    The mods here think they have problems here with us humans
    Forgot to tell you, Steve: we love humans, with all their shortcomings and little rants and arguments, because that is what people are.
    Adorable in their imperfection, unpredictable in the ways they react. It is the most rewarding challenge to solve our little verbal disputes.

    I would never replace any of you with bots.

    Going back to the article you linked, some of the bits made me giggle.

    I quote:

    "But in doing so made it clear Tay's views were a result of nurture, not nature. Tay confirmed what we already knew: people on the internet can be cruel.
    Tay, aimed at 18-24-year-olds on social media, was targeted by a "coordinated attack by a subset of people" after being launched earlier this week.
    Within 24 hours Tay had been deactivated so the team could make "adjustments".
    But on Friday, Microsoft's head of research said the company was "deeply sorry for the unintended offensive and hurtful tweets" and has taken Tay off Twitter for the foreseeable future.
    Peter Lee added: "Tay is now offline and we'll look to bring Tay back only when we are confident we can better anticipate malicious intent that conflicts with our principles and values."
    Tay was designed to learn from interactions it had with real people in Twitter. Seizing an opportunity, some users decided to feed it racist, offensive information."

    What did they expect? Doh.
