This is my first post in this forum.
I have developed a trendline for a data series (two columns, X and Y) with the intention of using it in a program. I wish to get the Y value for a given X value.
The data series is:
x       f(x)
400     0.000250
500     0.000180
600     0.000125
700     0.000088
800     0.000070
900     0.000060
1000    0.000048
1500    0.000028
2000    0.000015
The equation is:
y = 5.5128E-08x^5 - 1.3692E-06x^4 + 1.1649E-05x^3 - 3.4550E-05x^2 - 2.8124E-05x + 3.0217E-04
For testing purposes, when I give an x value of 800, I expect to get a Y value of 0.00007, whereas I get 19256300.7763.
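To make the check concrete, here is a minimal sketch of the evaluation (Python is used only for illustration, and coefs/trendline are hypothetical names; it assumes the displayed, rounded coefficients are used as-is):

coefs = [5.5128e-08, -1.3692e-06, 1.1649e-05,
         -3.4550e-05, -2.8124e-05, 3.0217e-04]

def trendline(x):
    # Evaluate the 5th-degree polynomial with Horner's method;
    # coefs runs from the x^5 term down to the constant term.
    y = 0.0
    for c in coefs:
        y = y * x + c
    return y

print(trendline(800))
# Prints roughly 1.75e7 with the rounded coefficients shown above; my program
# reports 19256300.7763, but either way the result is in the millions, not 0.00007.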
Why is there such a big difference?
Any help would be appreciated.
Regards