Game developers face a major challenge: releasing their products worldwide simultaneously. It is no longer feasible to launch in one region, such as Japan, and then expand to others, such as the US. Simultaneous global releases require games to be localized into multiple languages, which adds to production time.
One persistent issue is lip syncing: when a character's mouth movements don't match the spoken language, the result looks odd, and correcting it by hand is time-consuming. Sony may have found a solution with a new software system.
A recently discovered Sony patent, called the "language evaluation system," describes software that assesses the lip movements of non-playable characters (NPCs) in games, helping developers identify and fix flaws introduced during the translation process.
If successful, the software could adjust lip and facial movements to match the spoken language, saving development time. In theory, this tool can make NPC lip movements more accurate and natural when speaking different languages.
According to Dexerto, the potential impact on game development could be significant.