
Thread: Ridge Regression estimator

    Hi!

    I'm learning about the different penalized regression methods and I thought I'd stick with ridge regression because an analytical expression can be derived for the estimator. I've seen a few different versions of the derivation; however, they all use matrix calculus, something I'm not too familiar with. So I thought I'd try to derive the estimator using a simple model and ordinary calculus. I was wondering if someone could comment on whether I've done this properly:

    Setup:
    Multiple linear regression model with two continuous predictors, x1 and x2, and corresponding coefficients B1 and B2
    No intercept in the model (assume the predictors and response are centered)

    y = B1*x1 + B2 * x2 + e

    The aim is to minimize the residual sum of squares (RSS) plus a penalty on the sum of the squared coefficients, weighted by a tuning parameter, lambda:

    L(B1, B2, lambda) = RSS + lambda * (B1^2 + B2^2)

    L(B1, B2, lambda) = Sum [ (y - B1*x1 - B2*x2)^2 ] + lambda * (B1^2 + B2^2)
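
    Just to pin down the notation, here's a minimal Python sketch of this objective (the function name and arguments are my own; x1, x2, y are numpy arrays):

    Code:
    import numpy as np

    def ridge_loss(b1, b2, lam, x1, x2, y):
        # RSS plus the squared-coefficient penalty, as written above
        resid = y - b1*x1 - b2*x2
        return np.sum(resid**2) + lam*(b1**2 + b2**2)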

    Taking the partial derivative with respect to one of the slopes, say B1, I get:

    dL(B1, B2, lambda)/dB1 = -2 * Sum [ (y - B1*x1 - B2*x2) * x1 ] + 2*lambda*B1
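
    As a sanity check on the calculus, this sympy snippet confirms the derivative for a single observation (summing over observations doesn't change the algebra; the symbol names are just my own):

    Code:
    import sympy as sp

    b1, b2, lam, x1, x2, y = sp.symbols('b1 b2 lam x1 x2 y')

    # Objective for a single observation
    L = (y - b1*x1 - b2*x2)**2 + lam*(b1**2 + b2**2)

    # The derivative written above, restricted to one observation
    expected = -2*(y - b1*x1 - b2*x2)*x1 + 2*lam*b1

    assert sp.simplify(sp.diff(L, b1) - expected) == 0  # passes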

    If I set this to zero and solve for B1, I get the ridge estimate:

    B1 (Ridge) = [ Sum (y*x1) - B2 * Sum (x1*x2) ] / [ Sum (x1^2) + lambda ]
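
    As a quick numerical check, this seems to agree with the usual matrix form (X'X + lambda*I)^(-1) * X'y on toy centered data I made up:

    Code:
    import numpy as np

    rng = np.random.default_rng(0)
    lam = 0.5
    x1 = rng.normal(size=100)
    x2 = rng.normal(size=100)
    y = 2*x1 - x2 + rng.normal(size=100)
    # Center everything, per the no-intercept assumption above
    x1, x2, y = x1 - x1.mean(), x2 - x2.mean(), y - y.mean()

    # Reference answer: matrix closed form (X'X + lam*I)^(-1) * X'y
    X = np.column_stack([x1, x2])
    b1, b2 = np.linalg.solve(X.T @ X + lam*np.eye(2), X.T @ y)

    # The expression derived above, plugging in the ridge estimate of B2
    b1_check = (np.sum(y*x1) - b2*np.sum(x1*x2)) / (np.sum(x1**2) + lam)
    print(b1, b1_check)  # these agree to machine precision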

    Is anyone able to verify this? TIA