Transfer learning (TL) for high-dimensional regression (HDR) is an important problem in machine learning, particularly when the target task has a limited sample size. However, there is currently no method to quantify the statistical significance of the relationship between features and the response in TL-HDR settings. In this paper, we introduce a novel statistical inference framework for assessing the reliability of feature selection in TL-HDR, called PTL-SI (Post-TL Statistical Inference). The core contribution of PTL-SI is its ability to provide valid p-values for features selected in TL-HDR, thereby rigorously controlling the false positive rate (FPR) at a desired significance level (e.g., 0.05). Furthermore, we enhance statistical power by incorporating a strategic divide-and-conquer approach into our framework. We demonstrate the validity and effectiveness of the proposed PTL-SI through extensive experiments on both synthetic and real-world high-dimensional datasets, confirming its theoretical properties and utility in testing the reliability of feature selection in TL scenarios.
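To make the inferential goal concrete, the sketch below illustrates one simple (and well-known) way to obtain valid post-selection p-values in a transfer-like setting: select features on pooled source/target data, then test them on a held-out portion of the target data. This is only a data-splitting baseline under assumed toy data, not the PTL-SI procedure proposed in the paper; all variable names, the Lasso penalty, and the pooling step are hypothetical choices for illustration.

```python
# Hypothetical illustration: valid post-selection p-values via data splitting.
# This is NOT the PTL-SI algorithm from the paper; it only shows the goal
# (p-values for selected features whose FPR stays at the nominal level).
import numpy as np
from scipy import stats
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n_src, n_tgt, p = 200, 60, 100          # source sample is larger than the target
beta = np.zeros(p)                       # global null: no truly relevant features

# Source and target share the design distribution; responses are pure noise here.
X_src = rng.standard_normal((n_src, p))
y_src = X_src @ beta + rng.standard_normal(n_src)
X_tgt = rng.standard_normal((n_tgt, p))
y_tgt = X_tgt @ beta + rng.standard_normal(n_tgt)

# Crude "transfer" step: pool source data with half of the target data for
# feature selection, reserving the other target half for inference.
half = n_tgt // 2
X_sel = np.vstack([X_src, X_tgt[:half]])
y_sel = np.concatenate([y_src, y_tgt[:half]])
selected = np.flatnonzero(Lasso(alpha=0.1).fit(X_sel, y_sel).coef_)

# Valid p-values: refit OLS on the held-out target half, restricted to the
# selected features, and use classical t-tests. Because the selection and
# inference samples are independent, these p-values are uniform under the null,
# so rejecting at 0.05 controls the FPR at 0.05.
if 0 < selected.size < half:
    Xh, yh = X_tgt[half:, selected], y_tgt[half:]
    XtX_inv = np.linalg.inv(Xh.T @ Xh)
    coef = XtX_inv @ Xh.T @ yh
    resid = yh - Xh @ coef
    dof = Xh.shape[0] - Xh.shape[1]
    sigma2 = resid @ resid / dof
    se = np.sqrt(sigma2 * np.diag(XtX_inv))
    pvals = 2 * stats.t.sf(np.abs(coef / se), dof)
    print(dict(zip(selected.tolist(), pvals.round(3))))
```

Data splitting sacrifices part of the target sample for inference, which is costly when the target is small; the conditional inference framework pursued by PTL-SI avoids this by using all of the data for both selection and testing.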
@article{tam2025_2504.18212,
  title={Post-Transfer Learning Statistical Inference in High-Dimensional Regression},
  author={Nguyen Vu Khai Tam and Cao Huyen My and Vo Nguyen Le Duy},
  journal={arXiv preprint arXiv:2504.18212},
  year={2025}
}