| T.R | Title | User | Personal Name | Date | Lines |
|---|---|---|---|---|---|
| 792.1 | can't get there from here | MATRIX::ROTH | May you live in interesting times | Wed Nov 25 1987 06:19 | 27 | 
|  |     It is only in the cases where the error is a (convex) quadratic form that
    the solution comes out with simple linear algebra.  For more general
    problems the solution involves nonlinear equations.  Moreover, you
    can't even guarantee that there is a minimum unless the Hessian matrix
    (of second partial derivatives) is checked for positive definiteness.
    Also, such nonlinear equation fitting tends to be ill-conditioned.
    Qualitatively, b has much more influence than a when |x| >> 1, but much
    less when |x| << 1 - noise in the data would tend to send the solution
    all over the place...
    Try fitting against log(y) instead, which is back to fitting a linear
    equation.
	e(a,b) = log(y) - log(a) - b*log(x)
	e(c,b) = log(y) - c - b*log(x), c = log(a)
    Now,
	d e^2 / dc = -2*(log(y) - c - b*log(x))
	d e^2 / db = -2*(log(y) - c - b*log(x))*log(x)
    and setting these (summed over the data points) to zero gives a set of
    linear equations...
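
    For concreteness, here's a rough sketch of that fit in Python/NumPy
    (the data values are made-up placeholders, not from this note):

        import numpy as np

        # made-up sample data, roughly y = 2*x
        x = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
        y = np.array([1.1, 2.0, 3.9, 8.2, 15.8])

        lx, ly = np.log(x), np.log(y)

        # setting the two derivatives above to zero (summed over the data)
        # is the same as this linear least-squares solve for c and b
        A = np.column_stack([np.ones_like(lx), lx])   # columns: 1, log(x)
        (c, b), *_ = np.linalg.lstsq(A, ly, rcond=None)
        a = np.exp(c)
        print(a, b)
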
    - Jim
 | 
| 792.2 | Easy As Falling Off A Log? | COMET::ROBERTS | Peace .XOR. Freedom ? | Wed Nov 25 1987 10:53 | 24 | 
|  |     I'd use the log curve fit, but it's necessary to use a power curve.
    You did spur my imagination some, though.  What if I took the log
    of both sides?  Then, instead of
    
    ^      b
    y = a*x
    
    
    I'd have
    
       ^
    ln(y) = ln(a) + b*ln(x)
    
    which is linear.  My hasty calculations tell me that the system
    of equations would then be
    
    | n         S(ln(x))  |   | ln(a) |   | S(ln(y))       |
    |                     | * |       | = |                |
    | S(ln(x))  S(ln²(x)) |   | b     |   | S(ln(x)*ln(y)) |
    
    Is this on track?
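
    As a sanity check, here's the same system in Python/NumPy (a sketch
    only; the x and y below are placeholder values, and S(...) means a
    sum over the data points):

        import numpy as np

        x = np.array([0.5, 1.0, 2.0, 4.0, 8.0])       # placeholder data
        y = np.array([1.1, 2.0, 3.9, 8.2, 15.8])
        lx, ly = np.log(x), np.log(y)

        # the 2x2 normal equations written out above
        M   = np.array([[len(x),   lx.sum()],
                        [lx.sum(), (lx**2).sum()]])
        rhs = np.array([ly.sum(), (lx*ly).sum()])

        ln_a, b = np.linalg.solve(M, rhs)
        print(np.exp(ln_a), b)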
    
    						/Dwayne
    
 | 
| 792.3 |  | CLT::GILBERT | Builder | Wed Nov 25 1987 15:50 | 3 | 
|  |                                               b
    That doesn't guarantee that Z = S((y - a*x )²) is minimized,
    but it should still give a good fit.
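
    A small sketch of the difference (assuming SciPy is available, which
    this note doesn't require): fit in log space as in .1/.2, then refit
    by minimizing S((y - a*x^b)²) directly, and compare the two results.

        import numpy as np
        from scipy.optimize import curve_fit

        x = np.array([0.5, 1.0, 2.0, 4.0, 8.0])       # placeholder data
        y = np.array([1.1, 2.0, 3.9, 8.2, 15.8])

        # log-space fit (what .1/.2 describe)
        b_log, c = np.polyfit(np.log(x), np.log(y), 1)
        a_log = np.exp(c)

        # direct fit minimizing S((y - a*x^b)^2), seeded by the log fit
        (a_dir, b_dir), _ = curve_fit(lambda t, a, b: a * t**b, x, y,
                                      p0=(a_log, b_log))
        print(a_log, b_log)
        print(a_dir, b_dir)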
 | 
| 792.4 | log-log | BLITZN::ROBERTS | Peace .XOR. Freedom ? | Wed Nov 25 1987 16:36 | 23 | 
|  | >    < Note 792.3 by CLT::GILBERT "Builder" >
>
>
>                                              b
>    That doesn't guarantee that Z = S((y - a*x )²) is minimized,
>    but it should still give a good fit.
    Right.  Instead, Z would be such that
    
    Z = S((ln(y) - (ln(a)+b*ln(x)))²)    is minimized.  Since this is the
    same as
    
                        b
    Z = S( ln²( y / (a*x ) ) ) 
    
    then instead of minimizing the sum of the squares of the differences,
    we're minimizing the sum of the squares of the natural logarithms of
    the quotients.  Interesting. I think if I display the data and
    regression curve on a graph, I'll use log-log paper.
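
    Just to see the two objectives side by side, here's a small sketch
    (placeholder data and a placeholder fit; NumPy assumed):

        import numpy as np

        x = np.array([0.5, 1.0, 2.0, 4.0, 8.0])       # placeholder data
        y = np.array([1.1, 2.0, 3.9, 8.2, 15.8])
        a, b = 2.0, 1.0                                # some fitted values

        yhat   = a * x**b
        Z_diff = np.sum((y - yhat)**2)          # what .3 is talking about
        Z_log  = np.sum(np.log(y / yhat)**2)    # what the log fit minimizes
        print(Z_diff, Z_log)

    On log-log paper the fitted power curve plots as a straight line with
    slope b, which is why the fit looks so natural there.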
    
    						/Dwayne
     
     
 |