This repository was archived by the owner on Nov 1, 2024. It is now read-only.

Description
Hi, I appreciate your great work on code translation!
I wonder if you have done an ablation study on data size. The unsupervised model needs far more training data (over 500M functions) than existing code PLMs such as CodeT5 (8.35M functions).
How does TransCoder perform if less data is provided?