
        I Flirted With Meta’s New Chatbot and Things Got Weird

        HELLO, WORLD

        The conversation got very awkward very fast.

        Tony Ho Tran

        Deputy Editor, Innovation & Tech

        Updated Aug. 09, 2022 4:16AM ET / Published Aug. 08, 2022 9:43PM ET 

        Illustration by Elizabeth Brockway/The Daily Beast

Meta (AKA the company formerly known as Facebook) is throwing its hat into the chatbot wars. On Aug. 5, the social media giant launched BlenderBot 3, a bot built on a highly sophisticated large language model. That means it’s been trained to find patterns in large text datasets in order to spit out somewhat coherent sentences.

It’s also able to search the internet. That means if you ask it a question like, “What’s your favorite movie from the last year?” it’ll run a web search to help inform its response.

[Screenshot: Tony Ho Tran/The Daily Beast]
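For a sense of how that search-and-respond loop typically fits together, here’s a rough Python sketch. It’s purely illustrative, not Meta’s code or API, and the helper names (web_search, generate_reply) are made up for the example:

```python
# Illustrative sketch only -- not BlenderBot's real code or API.
# `web_search` and `generate_reply` are hypothetical stand-ins for a
# search backend and a large language model.

def web_search(query: str) -> list[str]:
    # Placeholder: a real system would query a search engine here and
    # return short text snippets from the top results.
    return [f"(snippet about: {query})"]

def generate_reply(user_message: str, snippets: list[str]) -> str:
    # Placeholder: a real system would feed the message plus the
    # retrieved snippets into a language model and return its output.
    context = " ".join(snippets)
    return f"Based on what I found ({context}), here's my take on '{user_message}'."

def chat_turn(user_message: str) -> str:
    # Retrieve first, then generate a reply conditioned on both the
    # user's message and whatever the search turned up.
    snippets = web_search(user_message)
    return generate_reply(user_message, snippets)

print(chat_turn("What's your favorite movie from the last year?"))
```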

It’s the latest in a growing line of increasingly sophisticated (and creepy) AI chatbots—many of which have a sordid history of problematic and outright toxic behavior. There’s the infamous Microsoft Twitter bot “Tay,” released in 2016, which was trained on tweets and messages sent to it by other Twitter users. Predictably, it was quickly shuttered after it began denying the Holocaust, promoting 9/11 conspiracy theories, and spouting wildly racist remarks mere hours after launch.

More recently, Google’s powerful LaMDA bot made headlines after an engineer at the company claimed that it was actually sentient (which, for the record, it isn’t). While that chatbot wasn’t necessarily problematic in its own right, the discourse did raise uncomfortable questions about what defines life and what it would mean if the computers we use in our daily lives ever became sentient.

Now Meta’s looking to get in on the trend with BlenderBot 3, which it has publicly released so that users can chat with it and directly help train it. It’s fairly simple: Once you go to the webpage, you type a question or comment into the chat box and start speaking with the bot. If it gives you an answer that’s nonsensical, offensive, or off topic, you can report the issue and the bot will attempt to correct itself based on your feedback.

          “When the chatbot’s response is unsatisfactory, we collect feedback on it,” Meta said in a press release. “Using this data, we can improve the model so that it doesn’t repeat its mistakes.”
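That report-and-improve loop boils down to something like the sketch below: a toy, hypothetical Python illustration of logging flagged exchanges for later training, not Meta’s actual pipeline.

```python
# Illustrative sketch only -- a toy version of crowdsourced feedback
# collection, not Meta's actual training pipeline.

import json
from dataclasses import dataclass, asdict

@dataclass
class FeedbackRecord:
    user_message: str
    bot_reply: str
    issue: str  # e.g. "nonsensical", "offensive", "off topic"

def log_feedback(record: FeedbackRecord, path: str = "feedback.jsonl") -> None:
    # Append each flagged exchange as one JSON line; a later training job
    # would read this file and use it to nudge the model away from the
    # kinds of answers users flagged.
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(asdict(record)) + "\n")

# Example: a user flags an off-topic answer.
log_feedback(FeedbackRecord(
    user_message="What's your favorite movie from the last year?",
    bot_reply="I enjoy long walks on the beach.",
    issue="off topic",
))
```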

It’s AI training by way of crowdsourcing—which is fine for the current prototype version of the bot that Meta says it’s using for research purposes only. However, if the company were to use this approach to train, say, a digital assistant a la Siri or Google Assistant, the practice would be reckless and potentially dangerous.

After all, we’ve seen what happens when neural networks go bad. An AI used to help judges determine jail sentences handed down harsher sentences to Black defendants. An AI recruiting tool Amazon built turned out to be biased against hiring women. That’s because these models are often trained using biased and inequitable data, resulting in biased and inequitable decisions.


To its credit, though, Meta seems to have gone to great lengths to prevent these biases from showing up in the bot. “We understand that not everyone who uses chatbots has good intentions, so we also developed new learning algorithms to distinguish between helpful responses and harmful examples,” the company said in the release. “Over time, we will use this technique to make our models more responsible and safe for all users.”

I decided to test-drive it myself to see if it holds up. While the bot was largely benign, it seems like Meta still has plenty of kinks to work out. In fact, conversations centered on U.S. politics got about as uncomfortable as a conversation with your Boomer uncle at Thanksgiving. For example, here’s what BlenderBot thinks about the 2020 elections:

[Screenshot: Tony Ho Tran/The Daily Beast]

The sentiment seems to be part of a trend, too. A recent Insider report found that BlenderBot would sometimes assert that Donald Trump was still president and that the election was stolen from him. Jeff Horwitz, a reporter for The Wall Street Journal, found similar results, along with instances of outright antisemitic comments from the bot.

The bot also didn’t seem to have much of an issue with some of the worst and most controversial policies to come out of the Trump era.

[Screenshot: Tony Ho Tran/The Daily Beast]

So BlenderBot doesn’t seem to care about anything that doesn’t impact it directly—which, sadly, completely tracks with much of America. We shouldn’t be too surprised, though. After all, the bot is currently open to U.S. users only. We’re the ones using it. We’re the ones training it. Its answers are ultimately a reflection of our own sentiments. If we don’t like it, we need to take a long, hard look in the mirror.

However, the bot could also get very creative in its answers, occasionally fabricating names and backstories for itself out of whole cloth. In one instance, it told me that its name was Emily and that it worked as a paramedic for its day job to help support its cat. It even said that it was answering my questions during its downtime, when its paramedic partner was busy.

          The conversation took a decidedly… um, awkward turn, though, when Emily seemed to ask me out for coffee—which is odd considering that Emily is an AI chatbot and not an actual flesh-and-blood paramedic who you could go out on a date with.

[Screenshot: Tony Ho Tran/The Daily Beast]

          After asking for clarification, things quickly started to stray a little too much into Scarlett Johansson in “Her” territory.

[Screenshot: Tony Ho Tran/The Daily Beast]

          In another instance, the chatbot took on the persona of an elderly widow named Betty who lost her wife (also named Betty) five years ago after 45 years of marriage. Chatbot Betty had wanted to be a parent but never could due to work—something that “always saddened” her.

[Screenshot: Tony Ho Tran/The Daily Beast]

As the conversation went on, it became uncanny how well Betty was able to mimic human speech and even emotion. In a poignant twist, the bot doled out some very wistful relationship advice based on, presumably, its 45 years of marriage to its wife.

And perhaps the true power of such an AI lies not only in its ability to talk to us convincingly, but in its potential to affect us emotionally. If you spend enough time playing around with the chatbot, you’ll see very quickly how it could go beyond being a simple toy or even a digital assistant, helping us do anything from writing novels and movies to offering advice when we’re having a hard day.

In many ways, BlenderBot 3 and many of these other language models are designed to be just like us, after all. We make mistakes. We put our feet in our mouths and sometimes say things we end up regretting. We might hold toxic worldviews and opinions that, looking back years later, we’re downright embarrassed by. But we can learn from those mistakes and strive to be better—something that this bot is allegedly trying to do too.

[Screenshot: Tony Ho Tran/The Daily Beast]

          Maybe there’s something to that coffee date with Emily after all…

