First, we consider our mathematical model, a fourth-order polynomial:

y = a*x^4 + b*x^3 + c*x^2 + d*x + e

Let's create the data and add Gaussian-distributed noise to it. In MATLAB we just type these commands:
>> x=linspace(0,4,40); % create 40 datapoints between 0 and 4
>> a=-1; b=2; c=-3; d=-1; e=-30; % true model parameters
>> y=a.*x.^4+b.*x.^3+c.*x.^2+d.*x+e;
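As a side note, the same evaluation could be done with MATLAB's built-in polyval, which takes the coefficients in descending powers of x:

```matlab
>> y=polyval([a b c d e],x); % same fourth-order polynomial, evaluated with polyval
```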
Now we've created the data. MATLAB has a command, randn, which generates normally distributed random numbers:
>> yo=y+0.5*sqrt(max(abs(y))).*randn(1,length(y)); %adds noise; yo is the observed data we will fit
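If you want the noisy data to be identical between runs, you can seed the random number generator first. This is an optional step, not part of the original commands, and the seed value 0 is an arbitrary choice:

```matlab
>> rng(0); % fix the random seed (newer MATLAB; older versions used randn('seed',0))
```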

Since our model is linear in its parameters, there is no linear dependency among them, so we can expect the solution to be unique. The Levenberg-Marquardt method obtains the corrections by solving

(A'A + lambda*I) dx = A' dy

A is the Jacobian matrix, which contains the partial derivatives with respect to each fit parameter; dx holds the parameter correction values; dy is the difference between the observed and the calculated data; and lambda (called theta in the code below) is known as the damping parameter.
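For our fourth-order polynomial the Jacobian is particularly simple: each column is just a power of x. Written out explicitly, the system being solved at each iteration is:

```latex
A = \begin{pmatrix}
x_1^4 & x_1^3 & x_1^2 & x_1 & 1 \\
x_2^4 & x_2^3 & x_2^2 & x_2 & 1 \\
\vdots & \vdots & \vdots & \vdots & \vdots \\
x_N^4 & x_N^3 & x_N^2 & x_N & 1
\end{pmatrix},
\qquad
(A^{T}A + \lambda I)
\begin{pmatrix} da \\ db \\ dc \\ dd \\ de \end{pmatrix}
= A^{T}\,dy
```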
An iterative solution is required for a nonlinear fit, so our program will have a loop that controls the number of iterations. At each iteration we obtain the parameter correction values and add them to our current guess. Let's see the main structure of the code:
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Author: Mustafa DENIZ           %
% Contact: mustafdeniz@itu.edu.tr %
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
a=1; b=1; c=1; d=1; e=1; theta=5; %initial guesses
a0=a; b0=b; c0=c; d0=d; e0=e;
itnu=input('iteration number:');
for ii=1:itnu
y=a.*x.^4+b.*x.^3+c.*x.^2+d.*x+e; %mathematical model
da=x.^4; db=x.^3; dc=x.^2; dd=x; de=ones(1,length(x)); % partial derivatives
U=[da',db',dc',dd',de']; %Jacobian matrix
dy=yo-y; %observed-calculated residual
dp=(U'*U+theta*eye(5))\(U'*dy'); %solve the damped normal equations for the corrections
a=a+dp(1); b=b+dp(2); c=c+dp(3); d=d+dp(4); e=e+dp(5); %add dp's to parameters
ai(ii)=a; bi(ii)=b; ci(ii)=c; di(ii)=d; ei(ii)=e;
gof(ii)=0.2*sum(abs(dy)); %goodness of fit (scaled sum of absolute residuals)
figure(1)
plot(x,yo,'*')
title('calculated vs observed')
xlabel('x')
ylabel('amplitude')
hold on
plot(x,y,'r')
legend('observed','calculated')
end
%plot Goodness of fit
figure(2)
plot(gof,'-rs','LineWidth',2,'MarkerEdgeColor','g','MarkerFaceColor','b','MarkerSize',5)
xlabel('iteration number')
ylabel('Goodness of fit')
%plot the parameters by iteration
figure(3)
subplot(3,2,1)
plot(0:itnu,[a0 ai],'-rs','LineWidth',2,'MarkerEdgeColor','g','MarkerFaceColor','b','MarkerSize',5)
title('Changing of a')
xlabel('iteration number')
ylabel('a')
subplot(3,2,2)
plot(0:itnu,[b0 bi],'-rs','LineWidth',2,'MarkerEdgeColor','g','MarkerFaceColor','b','MarkerSize',5)
title('Changing of b')
xlabel('iteration number')
ylabel('b')
subplot(3,2,3)
plot(0:itnu,[c0 ci],'-rs','LineWidth',2,'MarkerEdgeColor','g','MarkerFaceColor','b','MarkerSize',5)
title('Changing of c')
xlabel('iteration number')
ylabel('c')
subplot(3,2,4)
plot(0:itnu,[d0 di],'-rs','LineWidth',2,'MarkerEdgeColor','g','MarkerFaceColor','b','MarkerSize',5)
title('Changing of d')
xlabel('iteration number')
ylabel('d')
subplot(3,2,5)
plot(0:itnu,[e0 ei],'-rs','LineWidth',2,'MarkerEdgeColor','g','MarkerFaceColor','b','MarkerSize',5)
title('Changing of e')
xlabel('iteration number')
ylabel('e')
Even though I started with awful initial guesses, the solution converged quickly because the model parameters are linearly independent.
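By the way, instead of a fixed iteration count we could stop automatically once the corrections become negligible. A minimal sketch that would go at the end of the for loop, right after the parameters are updated (the tolerance 1e-6 is my arbitrary choice, not part of the program above):

```matlab
if norm(dp) < 1e-6*norm([a b c d e]) % corrections tiny relative to the parameters
    fprintf('converged after %d iterations\n', ii);
    break                            % leave the iteration loop early
end
```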

In addition, I weighted neither the data nor the parameters. I assumed the data are perfect, with no uncertainty, but in real life we know those things are present. Next time I will study fitting with SVD and add weighting for both the parameters and the data.
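For the record, introducing data weights changes the normal equations only slightly. With a diagonal weight matrix W, the solve line in the loop would become something like the sketch below; W and sigma are my assumptions here, not part of the program above:

```matlab
W = diag(1./sigma.^2);                      % sigma: per-point standard deviations (assumed known)
dp = (U'*W*U + theta*eye(5))\(U'*W*dy');    % weighted Levenberg-Marquardt step
```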
You explained it wonderfully.. congratulations..