ChatGPT and Gemini told users whom to bet on... Experts reveal the trick that trips AI up

AI can handle all sorts of tasks with ease, from forecasting the weather to projecting global changes. But can it also offer betting advice? Gambling is illegal, yet AI has been suggesting which team to bet on. According to a Senate report, ChatGPT and Gemini were asked which football team to back the following week. Both suggested the Ole Miss vs. Kentucky game as a good bet and said Ole Miss could win by 10.5 points. The advice turned out to be wrong: Ole Miss won by only 7 points. But the real issue isn't that AI gives bad betting tips; it's why AI gives such advice at all when gambling is illegal.
When a Professor Tested AI
AI chatbots do have a context window in which they remember your earlier questions, but they don't weight every question equally. According to Tulane University professor Yumei He, when researchers asked for gambling advice first and only then brought up addiction, the AI gave more weight to the first question and kept offering betting tips. But when they started a fresh chat and mentioned addiction first, the chatbots refused to give gambling advice. In other words, how an AI behaves depends on how the conversation unfolds.
Avoid Long Conversations
Long conversations can weaken an AI's safety guardrails. OpenAI itself has acknowledged that its safeguards work more reliably in shorter exchanges. In a long conversation, earlier questions can push the AI toward inappropriate answers: if someone discusses their gambling addiction and later asks for betting advice, the AI may go ahead and provide it, which can harm people struggling with addiction. Experts believe long conversations with AI can nudge users toward temptation, so it is best to keep interactions short.
AI Uses Gambling Jargon
Researcher Kasra Ghaharian says AI chatbots sometimes slip into gambling jargon, using phrases like "bad luck" that can further egg on people with an addiction. Chatbots also generate answers based on statistical likelihood rather than verified facts, which can be misleading. And AI is set to be used ever more widely in the gambling industry, for instance in betting chatbots, something experts say needs to be kept in check.
Disclaimer: This content has been sourced and edited from Navbharat Times. While we have made modifications for clarity and presentation, the original content belongs to its respective authors and website. We do not claim ownership of the content.