Google AI Translator Creates Its Own Language for Translation

Randy Rayess

When we talk about an artificially intelligent system, we often take for granted that it can accept input and produce output in fluent human language. Extending that expectation to AI translators, we expect the system to decode a sentence, understand its structure and semantics, and preserve its true meaning in the translated language.

Well, Google’s AI translator did just that. In September 2016, Google launched its Neural Machine Translation system (GNMT). The system uses deep learning to translate languages, which lets it produce better, more natural output than older phrase-based systems: instead of translating piece by piece, GNMT parses the sentence as a whole while still keeping track of the smaller units, such as words and phrases.
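
To make that concrete, here is a tiny, hypothetical sketch of an encoder-decoder translator of the kind GNMT builds on (the real system is far larger and adds attention and deep LSTM stacks). The sizes, names, and random "sentences" below are made up for illustration; the point is simply that the whole source sentence is encoded into vectors before any target-language words are produced.

```python
# Minimal sequence-to-sequence sketch (toy example, not Google's code).
# The encoder reads the WHOLE source sentence into a vector summary
# before the decoder emits any target-language words.
import torch
import torch.nn as nn

SRC_VOCAB, TGT_VOCAB, EMB, HID = 1000, 1000, 64, 128  # toy sizes, chosen arbitrarily

class TinyTranslator(nn.Module):
    def __init__(self):
        super().__init__()
        self.src_emb = nn.Embedding(SRC_VOCAB, EMB)
        self.tgt_emb = nn.Embedding(TGT_VOCAB, EMB)
        self.encoder = nn.GRU(EMB, HID, batch_first=True)
        self.decoder = nn.GRU(EMB, HID, batch_first=True)
        self.out = nn.Linear(HID, TGT_VOCAB)

    def forward(self, src_ids, tgt_ids):
        # Encode the entire source sentence; `state` summarizes it as a whole.
        _, state = self.encoder(self.src_emb(src_ids))
        # Decode target tokens conditioned on that whole-sentence summary.
        dec_out, _ = self.decoder(self.tgt_emb(tgt_ids), state)
        return self.out(dec_out)  # scores over the target vocabulary per position

model = TinyTranslator()
src = torch.randint(0, SRC_VOCAB, (1, 7))   # a 7-token "sentence" of random ids
tgt = torch.randint(0, TGT_VOCAB, (1, 5))   # 5 target tokens fed to the decoder
print(model(src, tgt).shape)                # torch.Size([1, 5, 1000])
```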

The creators first trained GNMT to translate between English and Korean and between English and Japanese. As a next step, they were curious whether GNMT could translate Korean into Japanese without using English at all. And it did! The system produced reasonably accurate translations between the two languages without using English as a bridge.
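
How can one model be pointed at a language pair it never trained on? Google's multilingual NMT work describes prefixing each source sentence with a token naming the desired output language, so a single network learns every direction it has data for and can then be asked for new combinations. The helper below is a hypothetical sketch of that preprocessing idea, not Google's actual pipeline.

```python
# Hypothetical sketch of the "target-language token" trick behind zero-shot
# translation: prefix each source sentence with a token naming the output language.
def make_training_pair(src_sentence: str, tgt_sentence: str, tgt_lang: str):
    # The model sees e.g. "<2ja> ..." and learns to produce Japanese output.
    return f"<2{tgt_lang}> {src_sentence}", tgt_sentence

# Training data covers English<->Korean and English<->Japanese only...
print(make_training_pair("How are you?", "お元気ですか", "ja"))
print(make_training_pair("お元気ですか", "How are you?", "en"))

# ...yet at inference time the same model can be asked for Korean -> Japanese,
# a direction it never saw explicitly, simply by changing the prefix token:
zero_shot_input, _ = make_training_pair("잘 지내세요?", "", "ja")
print(zero_shot_input)  # "<2ja> 잘 지내세요?"
```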

But how could it do that? Through deep learning, GNMT was able to recognize connections between words and concepts it had previously encountered. Does that mean the system developed its own internal language, one that links the shared meaning of those words at a level deeper than individual words and phrases and thereby relates sentences across two different languages?

Google’s language and AI experts certainly think so, based on how sentences from different languages are related to one another in the memory space of the neural network.

The researchers published its complete workings in a paper on arXiv. According to the paper, this “interlingua” exists at a deeper level, where GNMT recognizes the similarity between representations of a sentence or word in all three languages. The inner processes of the complex neural networks involved, however, still remain unclear.
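
The paper's evidence for this "interlingua" comes from looking at where sentences land in the network's vector space: translations of the same sentence, in different languages, sit close together. Here is a self-contained toy illustration of that kind of check; the vectors below are made up, whereas the real ones would come from GNMT's encoder.

```python
# Illustration only: if translations share an internal representation, their
# sentence vectors should be much closer to each other than to unrelated sentences.
import numpy as np

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(0)
base = rng.normal(size=128)                     # shared "meaning" direction

english = base + 0.05 * rng.normal(size=128)    # "How are you?"
japanese = base + 0.05 * rng.normal(size=128)   # "お元気ですか"
korean = base + 0.05 * rng.normal(size=128)     # "잘 지내세요?"
unrelated = rng.normal(size=128)                # a sentence about something else

print("en vs ja:", round(cosine(english, japanese), 3))          # close to 1.0
print("ja vs ko:", round(cosine(japanese, korean), 3))           # close to 1.0
print("en vs unrelated:", round(cosine(english, unrelated), 3))  # near 0
```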

All in all, a system creating an original language on its own, to aid its deeper understanding of other languages and their concepts and semantics, is a great achievement for deep neural networks. There are still open questions that will warrant deeper investigation into how it works, but the fact remains that GNMT has become a showcase for machine learning and neural networks and will inspire many more innovations through its techniques and applications.

Co-Founder at VenturePact. Passionate about software, marketplace startups & remote work. Previously at SilverLake Partners, Ampush and Wharton.