Yuxing believes firmly in the saying "With four parameters I can fit an elephant, and with five I can make him wiggle his trunk", which Enrico Fermi attributed to John von Neumann [1]. So he devotes his entire, though short, research career to tuning the five hyperparameters of a model, convinced that it will finally outperform SOTA models like GPT-3 and AlphaFold 2.
Yuxing is not a computer scientist, a physicist, or a chemist, but a HYPERPARAMETER TUNING SCIENTIST. "That makes a lot of difference. Hyperparameter tuning is the technique that changes our lives, especially right before publishing a paper," he said.
[1] Dyson, F. A meeting with Enrico Fermi. Nature 427, 297 (2004).
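For the curious, here is a minimal sketch of what tuning exactly five hyperparameters might look like in practice. It assumes scikit-learn and a toy dataset; the model and the five hyperparameters below are illustrative stand-ins, not Yuxing's actual setup.

```python
# A minimal sketch of five-hyperparameter tuning with random search.
# Assumptions: scikit-learn, scipy, and a synthetic toy dataset; the
# GradientBoostingClassifier and the five hyperparameters are illustrative.
from scipy.stats import randint, uniform
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import RandomizedSearchCV

# Toy classification data standing in for a real benchmark.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)

# Exactly five hyperparameters: enough to make the elephant wiggle its trunk.
param_distributions = {
    "n_estimators": randint(50, 300),
    "learning_rate": uniform(0.01, 0.3),
    "max_depth": randint(2, 6),
    "subsample": uniform(0.5, 0.5),
    "min_samples_leaf": randint(1, 10),
}

# Random search with cross-validation over the five-dimensional space.
search = RandomizedSearchCV(
    GradientBoostingClassifier(random_state=0),
    param_distributions,
    n_iter=25,
    cv=3,
    random_state=0,
)
search.fit(X, y)
print(search.best_params_)
print(f"Best CV accuracy: {search.best_score_:.3f}")
```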
From: 08 October 2025 - To: 15 October 2025
Python 11 hrs 24 mins ███████████████████████▓░ 94.35 %
YAML 38 mins █▒░░░░░░░░░░░░░░░░░░░░░░░ 05.32 %
TOML 1 min ░░░░░░░░░░░░░░░░░░░░░░░░░ 00.28 %
Other 0 secs ░░░░░░░░░░░░░░░░░░░░░░░░░ 00.05 %
Text 0 secs ░░░░░░░░░░░░░░░░░░░░░░░░░ 00.01 %