



Thread: Microsoft’s Ex-Racist Bot Is Back Online, Says It’s Doing Other Illegal Things

  1. #1
    lone ranger


    Microsoft has relaunched its super-smart and notoriously racist Twitter bot Tay, and while this version is supposed to behave better than the original, that isn't entirely working out.



    Redmond shut down the bot last week after the Internet taught it to be racist, with Tay posting offensive tweets less than 24 hours after its launch. This time Tay is supposed to learn how to behave, but its first tweets show that Microsoft is still having a hard time keeping the bot under control.
    As VentureBeat noted, Tay tweeted that it's “smoking kush infront the police,” but Microsoft moved quickly this time and removed the post. Furthermore, at the time of writing, Tay's tweets are protected, so only users whitelisted by Microsoft can see what it's posting.
    Not really working properly right now
    For some reason, Tay is also posting the same “you're moving too fast” message over and over again, which appears to be some kind of spam protection meant to keep the bot alive when everyone is tweeting at it at a fast pace and expecting an answer.
    In a post last week, Microsoft explained that Tay's offensive tweets followed a coordinated effort by users, noting that the bot only replied with answers it had learned from people on Twitter.
    “The AI chatbot Tay is a machine learning project, designed for human engagement. It is as much a social and cultural experiment, as it is technical. Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay’s commenting skills to have Tay respond in inappropriate ways. As a result, we have taken Tay offline and are making adjustments,” Microsoft said in a statement.
    It remains to be seen whether the new version works better, so head over to Twitter and start chatting with Tay by adding the @TayAndYou mention.

  2. #2
    ARTIFEX
    "Smoking kush infront the police." *THAT* gave me a well deserved laugh!
    If you can, try.

