ISSN 0253-2778

CN 34-1054/N

Open Access JUSTC Original Paper

An RKHS-based semiparametric approach to nonlinear dimension reduction

Cite this:
https://doi.org/10.3969/j.issn.0253-2778.2016.11.004
  • Received Date: 17 March 2016
  • Accepted Date: 05 June 2016
  • Rev Recd Date: 05 June 2016
  • Publish Date: 30 November 2016
  • Abstract: A nonlinear dimension reduction method, the generalized semiparametric kernel sliced inverse regression (GSKSIR for short), is proposed based on the theory of reproducing kernel Hilbert spaces (RKHS) and semiparametric methods. The method extends the classical semiparametric framework to a more general setting in which both the nuisance parameter space and the space of parameters of interest may be infinite-dimensional. The corresponding generalized nuisance tangent space orthogonal complement is derived, an estimating equation for dimension reduction is constructed, and the target function is optimized using RKHS theory and regularization, yielding an estimated nonlinear sufficient dimension reduction subspace with desirable efficiency properties. Furthermore, the new method does not impose the linearity design condition (LDC) required by methods such as sliced inverse regression (SIR) and kernel SIR, and is therefore more general and more widely applicable. Finally, a Monte Carlo simulation demonstrates the excellent finite-sample properties of the new method. (An illustrative sketch of the classical kernel SIR step follows below.)
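Since the full text is not reproduced on this page, the following is only a minimal sketch of classical regularized kernel SIR, the building block that GSKSIR generalizes; it is not the authors' semiparametric GSKSIR estimator. The function names (rbf_gram, kernel_sir), the Gaussian kernel, and the particular ridge regularization are all illustrative assumptions.

    import numpy as np

    def rbf_gram(X, gamma=1.0):
        # Gaussian kernel Gram matrix: K[i, j] = exp(-gamma * ||x_i - x_j||^2).
        sq = np.sum(X ** 2, axis=1)
        d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
        return np.exp(-gamma * d2)

    def kernel_sir(X, y, n_slices=10, n_components=2, gamma=1.0, ridge=1e-3):
        """Regularized kernel SIR sketch (not the paper's GSKSIR estimator).

        Returns an (n, n_components) coefficient matrix A; the k-th estimated
        nonlinear sufficient predictor at a new point x is
        sum_i A[i, k] * exp(-gamma * ||x_i - x||^2).
        """
        n = len(y)
        K = rbf_gram(X, gamma)
        C = np.eye(n) - np.ones((n, n)) / n        # centering matrix
        Kc = C @ K @ C                             # doubly centered Gram matrix

        # Slice the sorted response and average the centered kernel rows per slice.
        slices = np.array_split(np.argsort(y), n_slices)
        M = np.zeros((n, n))                       # between-slice "covariance" in the RKHS
        for idx in slices:
            m = Kc[idx].mean(axis=0)
            M += (len(idx) / n) * np.outer(m, m)

        # Ridge-regularized generalized eigenproblem: the penalty keeps the
        # otherwise rank-deficient within-operator invertible.
        W = Kc @ Kc / n + ridge * np.eye(n)
        eigvals, eigvecs = np.linalg.eig(np.linalg.solve(W, M))
        top = np.argsort(eigvals.real)[::-1][:n_components]
        return eigvecs[:, top].real

    # Toy usage: the response depends on X only through a low-dimensional,
    # nonlinear summary, which kernel SIR tries to recover.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.1 * rng.normal(size=200)
    A = kernel_sir(X, y, n_slices=8, n_components=2)

In the paper's GSKSIR approach, by contrast, the estimator is derived from the generalized nuisance tangent space orthogonal complement and an estimating equation, which is what removes the linearity design condition that SIR-type estimators, including the sketch above, rely on.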
    [1]
    ARONSZAJN N. Theory of reproducing kernels[J]. Transactions of the American Mathematical Society, 1950, 68(3): 337-404.
    [2]
    BICKEL P J, KLAASSEN C A J, RITOV Y, et al. Efficient and Adaptive Estimation for Semiparametric Models[M]. Baltimore, MD: Johns Hopkins University Press, 1993.
    [3]
    COOK R D. Regression Graphics: Ideas for Studying Regressions through Graphics[M]. New York: Wiley, 1998.
    [4]
    COOK R D, LI B. Dimension reduction for conditional mean in regression[J]. The Annals of Statistics, 2002, 30(2): 455-474.
    [5]
    CUI Wenquan, WU Chenglong. An approach to estimating nonlinear sufficient dimension reduction subspace for censored survival data[J]. Journal of University of Science and Technology of China, 2015, 45(9): 709-716.
    [6]
    FERR L, VILLA N. Multilayer perceptron with functional inputs: An inverse regression approach[J]. Scandinavian Journal of Statistics, 2006, 33(4): 807-823.
    [7]
    FUKUMIZU K, BACH F R, JORDAN M I. Kernel dimension reduction in regression[J]. The Annals of Statistics, 2009, 37(4): 1871-1905.
    [8]
    LI L, YIN X. Sliced inverse regression with regularizations[J]. Biometrics, 2008, 64(1): 124-131.
    [9]
    LI K C. Sliced inverse regression for dimension reduction[J]. Journal of the American Statistical Association, 1991, 86(414): 316-327.
    [10]
    MA Y, ZHU L. A review on dimension reduction[J]. International Statistical Review, 2013, 81(1): 134-150.
    [11]
    MA Y, ZHU L. A semiparametric approach to dimension reduction[J]. Journal of the American Statistical Association, 2012, 107(497): 168-179.
    [12]
    SCHLKOPF B, SMOLA A J. Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond[M]. Cambridge, MA: MIT Press, 2001.
    [13]
    TSIATIS A A. Semiparametric Theory and Missing Data[M]. New York: Springer, 2006.
    [14]
    WU Q, LIANG F, MUKHERJEE S. Consistency of regularized sliced inverse regression for kernel models[R]. Durham, NC: Duke University; Urbana, IL: University of Illinois at Urbana-Champaign, 2008.
    [15]
    WU H M. Kernel sliced inverse regression with applications to classification[J]. Journal of Computational and Graphical Statistics, 2008, 17(3): 590-610.
    [16]
    ZHONG W, ZENG P, MA P, et al. RSIR: regularized sliced inverse regression for motif discovery[J]. Bioinformatics, 2005, 21(22): 4169-4175.
    [17]
    COOK R D, WEISBERG S. Discussion of “sliced inverse regression for dimension reduction”[J]. Journal of the American Statistical Association, 1991, 86: 28-33.
    [18]
    LI B, WANG S. On directional regression for dimension reduction[J]. Journal of the American Statistical Association, 2007, 102(479): 997-1008.
    [19]
    WU H M. Kernel sliced inverse regression with applications to classification[J]. Journal of Computational and Graphical Statistics, 2008, 17(3): 590-610.
    [20]
    FUKUMIZU K, BACH F R, JORDAN M I. Kernel dimension reduction in regression[J]. The Annals of Statistics, 2009, 37(4): 1871-1905.
    [21]
    WU Q, LIANG F, MUKHERJEE S. Kernel sliced inverse regression: regularization and consistency[J]. Abstract and Applied Analysis, 2013: 540725.
