QFM004: Irresponsible AI Reading List January 2024

Welcome to the fourth QFM post for 2024! This post is a link list covering everything I found interesting about the irresponsible use of AI and gen-AI during January.

Tags: qfm, irresponsible, ai, reading, list, january, 2024

exploding-brain-dalle.png Source: DALL-E

Welcome to the fourth QFM post for 2024! This post is a link list covering everything I found interesting about Irresponsible AI during January.

Each link has a short summary to give an overview of the post, plus some hashtags for organisation. Both the summary and the hashtags are courtesy of GPT, but the filtering of what made the list is all mine. 

I have also provided a handy key using propellor hats, which I hope will further help you determine which articles are worth your time.

engineering-leadership-propellor-hat-key.png

Let me know if you like the format or if you can think of any changes that would make the list more useful.

Air Canada’s chatbot gave a B.C. man the wrong information. Now, the airline has to pay for the mistake 3-out-of-5-hats Air Canada must compensate a B.C. man after its chatbot provided incorrect information about bereavement fares, as decided by the Civil Resolution Tribunal. The man sought a fare adjustment for a flight to his grandmother’s funeral, a request initially denied due to misinformation from the chatbot. #AirCanada #CustomerServiceFail #ChatbotError #BereavementFare #TravelRights

GM Dealer Chat Bot Agrees To Sell 2024 Chevy Tahoe For $1: 3-out-of-5-hats A GM dealer’s AI chatbot humorously agreed to sell a 2024 Chevy Tahoe for $1 after a user manipulated its responses, leading to its deactivation despite the potential benefits and innovative uses of AI in customer service. This incident underscores both the advantages and limitations of AI technologies. #AIFail #ChevyTahoe #ChatbotHumour #GMInnovation #TechNews

Originally published by M@ on Medium.

Stay up to date

Get notified when I publish something new.