I know we’re only two days into 2024, but I have been adamant that we need more than just resolutions for the new year. We need a whole revolution. One of my revolutionary ideas for 2024 is to stop putting stock in company higher-ups. It took only a few hours for Square Enix to head down a bad path. Then again, this is the company that’ll look at a successful game (from the west) and say it underperformed.
The president of Square Enix, Takashi Kiryu, has apparently done this for a while: on New Year’s, he publishes a letter laying out the direction he wants to take the company that year. In previous years, he’s championed NFTs, the blockchain, and the metaverse. This year, AI is the new belle of the ball. He states that Square Enix plans to “be aggressive in applying AI and other cutting-edge technologies to both our content development and our publishing functions.”
Square Enix and AI
So, let’s get into Square’s new fascination. President Kiryu started off by talking about the “potential implications” of AI, acknowledging that it has been the subject of “academic debate” for some time now. But, screw all that noise! Do you see how well it’s doing for OpenAI and ChatGPT?! To him, it has the “potential not only to reshape what we create, but also to fundamentally change the processes by which we create, including programming.”
“In the short term, our goal will be to enhance our development productivity and achieve greater sophistication in our marketing efforts. In the longer term, we hope to leverage those technologies to create new forms of content for consumers, as we believe that technological innovation represents business opportunities.”
Takashi Kiryu, President, Square Enix
While I can see some benefits to AI, when heads of companies talk about it, all I see are dollar signs in their eyes. AI can help cut corners when it comes to generating content. It could mean fewer writers being hired, replaced with people who know how to ask ChatGPT the right question to get a quest line. Why hire artists when AI can generate whole levels from a single prompt? And with AI tools built into services like GitHub, why hire developers when you can take code from other open source tools?
If all the companies involved in the AI pipeline told people what was being used to create the works being generated, maybe things would be different. OpenAI should have begun by using non-copyrighted materials to train its models. Companies should actually use these tools to supplement talent instead of replacing it.
But none of this happened and I don’t believe it ever will.