Facebook has developed a new neural transcompiler, Transcoder, designed to make it easier to migrate codebases from one programming language to another.
Transcoder uses self-supervised training, which Facebook explained is important for translating between programming languages. According to the company, traditional supervised-learning approaches depend on large-scale parallel data sets of equivalent code in both languages, but these don’t exist for most language pairs. For example, there aren’t any parallel data sets from COBOL to C++ or C++ to Python.
Transcoder’s approach requires only source code in one of the languages, and it doesn’t require expert knowledge of the languages being translated.
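To illustrate the kind of function-level translation such a tool targets, here is a hypothetical example (not actual Transcoder output): a small C++ function, shown in a comment, alongside a hand-written Python equivalent that a transcompiler would aim to produce.

```python
# Hypothetical illustration of function-level code translation;
# not actual Transcoder output.
#
# Source C++ function:
#
#   int sum_digits(int n) {
#       int total = 0;
#       while (n > 0) {
#           total += n % 10;
#           n /= 10;
#       }
#       return total;
#   }
#
# A Python equivalent preserving the same logic:
def sum_digits(n: int) -> int:
    total = 0
    while n > 0:
        total += n % 10   # add the last decimal digit
        n //= 10          # drop the last decimal digit
    return total

print(sum_digits(1234))  # 10
```

Note that a faithful translation must map language-specific semantics, such as C++ integer division (`/=`) to Python floor division (`//=`), not just the syntax.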
Facebook believes Transcoder will be useful for updating legacy codebases. It is also an example of how neural machine translation techniques can be applied to new areas.
Transcoder was developed by researchers Marie-Anne Lachaux, Baptiste Roziere, Lowik Chanussot, and Guillaume Lample. More information on the tool is available in this post.
“Automatic code translation has the potential to make programmers working in companies or on open source projects more efficient by allowing them to integrate various codes more easily from other teams within the company or other open source projects. It can also greatly reduce the effort and expense of updating an old codebase written in an ancient language,” the researchers who created Transcoder wrote.