Ghibli trend gone wrong: Can you spot errors in these Ghibli-style pictures?
What began as a fun experiment with OpenAI's latest image generation tool, which lets users transform their photos into dreamy Ghibli-style illustrations, has occasionally led to awkward and unexpected results when the AI misinterprets the details.
Updated On - 2 April 2025, 02:20 PM
Hyderabad: The viral Studio Ghibli AI trend took an unexpected twist, turning creepy enough to amuse and unnerve social media at the same time. What started as a lighthearted activity using OpenAI’s newest image generation tool, which lets users turn their photos into dreamy Ghibli-style illustrations, has produced awkward moments when the AI misinterprets the brief.
One such incident involved a Ghibli-style rendering of women celebrating Chhath Puja, a major festival in Bihar. The original photo showed women standing by a river holding baskets filled with fruits, incense sticks, and coconuts meant as offerings to the Sun God. ChatGPT mangled the picture, mistaking a coconut for a human head.
The bizarre image shows a woman carrying a basket that should have contained the coconut; instead, it bore the surreal appearance of a severed human head. The disturbing gaffe went viral in no time, prompting incredulous reactions. One user commented, “Looks like a horror anime crossover,” while another said, “ChatGPT just turned Chhath Puja into a Halloween special.”
Bizarre gaffes are not unheard of in AI-generated images. Missing fingers, levitating faces, spare limbs, and mysteriously appearing extra family members have all been reported by social media users in AI-enhanced images. In one case that went viral, a woman’s Ghibli transformation gave her a third hand to hold her ice cream.
While these glitches have entertained many on the internet, they also reinforce the argument that AI still has far to go in understanding complex visual details. As the trend rages on, users eagerly await the AI’s next unintentional spook.