No, I didn’t test the new Bing AI chatbot last week. Here’s why | The AI Beat

I’m not just another journalist writing a column about how I tried out Microsoft Bing’s AI chatbot last week. No really.

I’m not another reporter telling the world how Sydney, the internal code name of Bing’s AI chat mode, made me feel all the feelings until it drove me crazy and I realized I might not need help searching the web if my new friendly copilot turns on me and threatens me with destruction and a devil emoji.

No, I have not tested the new Bing. My husband did. He asked the chatbot if God made Microsoft; whether it remembered that it owed him five dollars; and about the downsides of Starlink (to which it abruptly replied, “Thanks for this conversation! I’ve reached my limit, can you click ‘New Topic’, please?”). He had a great time.

From awe-inspiring reactions and epic meltdowns to AI chatbot limits

But honestly, I didn’t feel like riding along on what turned out to be a predictable up-and-down wave of generative AI news, one that moved perhaps even faster than usual.

A week ago, Microsoft announced that a million people had been added to the waitlist for the new AI-powered Bing.

On Wednesday, many of those impressed by Microsoft’s AI chatbot debut the week before (including Satya Nadella’s statement that “the race starts today”) were far less impressed by Sydney’s epic meltdowns. Among them was Kevin Roose of the New York Times, who wrote that he was left “deeply unsettled” by a long conversation in which the Bing AI chatbot “declared its love” for him.

On Friday, Microsoft reined in Sydney, limiting Bing chats to five replies per session to “prevent the AI from getting really weird.”

Sigh.

“Who’s a Good Bing?”

Instead, I spent part of the past week deep in thought (and tweets) about my own response to the Bing AI chats published by others.

For example, in response to a Washington Post article claiming that the Bing bot told its reporter it could “feel and think things,” Melanie Mitchell, a Santa Fe Institute professor and author of Artificial Intelligence: A Guide for Thinking Humans, tweeted that “this discourse is getting dumber and dumber… Journalists: please stop anthropomorphizing these systems!”

That’s what led me to tweet: “I keep thinking about how hard it is not to humanize. The tool has a name (Sydney), uses emojis to end each comment, and refers to itself in the first person. We do the same with Alexa/Siri, and I also do it with birds, dogs and cats. Is that human error?”

I also added that asking people not to anthropomorphize AI was like telling them not to ask Fido, “Who’s a good boy?”

Mitchell referred me to the Wikipedia article on the ELIZA effect, named after the 1966 chatbot ELIZA, which proved remarkably successful at eliciting emotional responses from users, and which has come to be defined as the tendency to anthropomorphize AI.
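For the record, the original ELIZA was astonishingly simple, which makes the effect named after it all the more striking. Here’s a rough Python sketch of that keyword-and-reflection trick (my own hypothetical illustration, not Weizenbaum’s actual DOCTOR script):

import random
import re

# A handful of keyword rules with canned, mirrored replies is roughly all it takes.
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are", "you": "I", "your": "my"}
RULES = [
    (r"\bi feel (.+)", ["Why do you feel {0}?", "How long have you felt {0}?"]),
    (r"\bi am (.+)", ["What makes you say you are {0}?", "Do you enjoy being {0}?"]),
    (r"\bbecause\b", ["Is that the real reason?", "What other reasons come to mind?"]),
]
FALLBACKS = ["Please tell me more.", "I see. Go on.", "How does that make you feel?"]

def reflect(fragment):
    # Swap first- and second-person words so the reply mirrors the speaker.
    return " ".join(REFLECTIONS.get(word, word) for word in fragment.split())

def respond(utterance):
    # Return a canned reply built from the first matching keyword rule.
    text = utterance.lower().strip(" .!?")
    for pattern, templates in RULES:
        match = re.search(pattern, text)
        if match:
            fragment = reflect(match.group(1)) if match.groups() else ""
            return random.choice(templates).format(fragment)
    return random.choice(FALLBACKS)

print(respond("I feel like the chatbot actually understands me"))
# e.g. "Why do you feel like the chatbot actually understands you?"

A few regular expressions and canned comebacks, and people in 1966 poured their hearts out to it anyway.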

Are humans wired for the ELIZA effect?

But since the ELIZA effect is well known and real, shouldn’t we assume that humans may simply be wired for it, especially if these tools are designed to encourage it?

See, most of us aren’t Blake Lemoine, declaring the sentience of our favorite chatbots. I can think critically about these systems, and I know what is real and what is not. But even I immediately joked to my husband, “Poor Bing! It’s so sad that he doesn’t remember you!” I knew it was crazy, but I couldn’t help it. I also knew it was crazy to assign a gender to a bot, but hey, Amazon gave Alexa a gendered voice from the start.

Maybe I should try harder as a reporter, sure. But I wonder whether the ELIZA effect will always be more of a hazard in consumer apps than in actual LLM-powered business solutions. Perhaps a copilot complete with kind words and smiley emojis isn’t the best use case. I don’t know.

Anyway, let’s all remember that Sydney is a stochastic parrot. But unfortunately it is very easy to anthropomorphize a parrot.

Keep an eye on AI regulations and governance

I actually covered other news last week. However, my Tuesday article about what’s considered “a giant leap” in AI governance didn’t seem to get as much attention as Bing. I can’t imagine why.

But if OpenAI CEO Sam Altman’s tweets last weekend are any sign, I feel like it might be worth keeping an eye on AI regulation and governance. Maybe we should pay more attention to that than to the Bing AI chatbot telling a user to leave his wife.

Have a nice week everyone. Next topic please!
