Changeset 46
- Timestamp: Dec 18, 2009, 1:43:33 AM
- Location: liacs/nc/low-correlation
- Files: 1 added, 6 edited
Legend:
- Unmodified: context lines (no prefix)
- Added: lines prefixed with "+"
- Removed: lines prefixed with "-"
liacs/nc/low-correlation/autocorrelation.m (r44 → r46)

  %-----------------------------------------------------------------------
  function [f] = autocorrelation(pop)
- % Given a population of binary sequences, this function calculates the merit
- % function according to the formula specified in the exercise description. The
- % input pop is the given matrix.
- % The output f is the merit factor calculated (row vector).
+ % Given a population of binary sequences, this function calculates
+ % the merit function according to the formula specified in the
+ % exercise description. The input pop is the given matrix. The
+ % output f is the merit factor calculated (row vector).

  n = size(pop,1);
  …
  E = zeros(1,m);

- % Calculated efficiently in a matrix-notation; auxilary matrices - Y1,Y2 - are
- % initialized in every iteration. They are shifted form of the original y
- % vectors. The diagonal of the dot-squared Y2*Y1 matrix is exactly the inner
- % sum of merit function.
+ % Calculated efficiently in a matrix-notation; auxilary matrices -
+ % Y1,Y2 - are initialized in every iteration. They are shifted form
+ % of the original y vectors. The diagonal of the dot-squared Y2*Y1
+ % matrix is exactly the inner sum of merit function.
  for k=1:n-1
      Y1=pop(1:n-k,:);
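The reworded comments describe a vectorized computation of the merit factor over a whole population. As a reading aid for the changeset, here is a minimal non-vectorized sketch of the same quantity for a single sequence, assuming the usual low-autocorrelation merit factor F = n^2 / (2 * sum_k E_k^2); the helper name merit_factor is hypothetical and not part of this changeset.

  function [f] = merit_factor(s)
      % s is a single column with entries in {-1,+1}
      n = numel(s);
      E = zeros(1, n-1);
      for k = 1:n-1
          % aperiodic autocorrelation at lag k
          E(k) = sum(s(1:n-k) .* s(k+1:n));
      end
      % merit factor (assumed definition; consistent with the best-known
      % values quoted in the report, e.g. n=20 gives 400/52 = 7.6923)
      f = n^2 / (2 * sum(E.^2));
  end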
liacs/nc/low-correlation/mcs-call.m (r43 → r46)

  % best_20 = [-1,1,1,-1,-1,-1,1,1,-1,1,-1,-1,-1,1,-1,1,1,1,1,1];

- % Basic variables
- iterations = [1:10:200];
- repetitions = 20;
- length = 20;
+ %% Basic variables
+ iterations = 1000;
+ length = 222;
+ repetitions = 1;

  % Plot the stuff
- fitnesses = [];
- for iteration = iterations
-     fitness = [];
-     for rep = 1:repetitions
-         fprintf('Iter:%i, Rep:%i\n',iteration,rep);
-         [new_fitness, value] = mcs(length,iteration);
-         fitness = [fitness, new_fitness];
+ [iteration_history, fitness_history] = mcs(length,iterations);

-         % Little hack to display output nicely
-         % disp(rot90(value,-1));
-     end
-     fitnesses = [fitnesses,mean(fitness)];
- end
-
- plot(iterations,fitnesses);
+ plot(iteration_history,fitness_history);
  title(sprintf('Monte-Carlo Search on Low-Corretation set - repetitions %i',repetitions));
  ylabel('fitness');
  …
  grid on;
  legend(sprintf('Length %i',length));
- print('mcs-fitness.eps','-depsc2');
+ print(sprintf('mcs-fitness-%f.eps',max(fitness_history)),'-depsc2');
+ max(fitness_history)
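The driver now performs a single run and plots its best-fitness history instead of averaging over repetitions. A minimal sketch, assuming the new mcs interface above, of how several repetitions could still be combined with the history output (variable names here are illustrative only, not repository content):

  repetitions = 5;          % the report runs each experiment 5 times
  len = 222;                % sequence length (avoids shadowing length())
  iterations = 1000;

  best_final = -Inf;
  for rep = 1:repetitions
      [it_hist, fit_hist] = mcs(len, iterations);
      if max(fit_hist) > best_final
          % keep the history of the best run for plotting
          best_final = max(fit_hist);
          best_it_hist = it_hist;
          best_fit_hist = fit_hist;
      end
  end
  plot(best_it_hist, best_fit_hist);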
liacs/nc/low-correlation/mcs.m (r43 → r46)

  % autocorrelation(best_20);

- function [fitness,value] = mcs(length, iterations)
+ function [iteration_history,fitness_history] = mcs(length, iterations)
+     iteration_history = [];
+     fitness_history = [];
+
      best_fitness = 0;
-     for i = 1:iterations
+     for iteration = 1:iterations
          % Generate a random column s={-1,1}^n
          n = length;
          …
              best_fitness = fitness;
          end
+         iteration_history = [ iteration_history, iteration ];
+         fitness_history = [ fitness_history, best_fitness ];
      end
-     fitness = best_fitness;
-     value = best_value;
  end
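Putting the hunks together, the revised mcs.m might read roughly as follows; the two lines marked "assumed" fill in the part elided from the diff (random sequence generation and fitness evaluation) and are a guess, not repository content.

  function [iteration_history, fitness_history] = mcs(length, iterations)
      iteration_history = [];
      fitness_history = [];

      best_fitness = 0;
      for iteration = 1:iterations
          % Generate a random column s={-1,1}^n
          n = length;
          s = 2 * (rand(n,1) > 0.5) - 1;   % assumed: uniform random +/-1 column
          fitness = autocorrelation(s);    % assumed: merit factor of the sample
          if fitness > best_fitness
              best_fitness = fitness;
          end
          iteration_history = [ iteration_history, iteration ];
          fitness_history = [ fitness_history, best_fitness ];
      end
  end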
liacs/nc/low-correlation/report.tex (r45 → r46)

  \usepackage{float}
  \usepackage{color}
+ \usepackage{subfig}
  \floatstyle{ruled}
  \newfloat{result}{thp}{lop}
  …
  those numbers are not found.

- \begin{center}
  \begin{table}
  \caption{Best known values of low-autocorrelation problem}
- \begin{tabular}{| l | c | r | }
- \hline
- $n$ & Best known $f$ \\
- \hline \hline
- 20 & 7.6923 \\ \hline
- 50 & 8.1699 \\ \hline
- 100 & 8.6505 \\ \hline
- 200 & 7.4738 \\ \hline
- 222 & 7.0426 \\ \hline
- \end{tabular}
+ \begin{center}
+ \begin{tabular}{| l | c | r | }
+ \hline
+ $n$ & Best known $f$ \\
+ \hline \hline
+ 20 & 7.6923 \\ \hline
+ 50 & 8.1699 \\ \hline
+ 100 & 8.6505 \\ \hline
+ 200 & 7.4738 \\ \hline
+ 222 & 7.0426 \\ \hline
+ \end{tabular}
+ \end{center}
  \label{tab:best}
  \end{table}
- \end{center}

  \section{Approach}
  …
  \footnote{http://www.mathworks.com/products/matlab/}. There are small minor
  differences between them, but all code is made compatible to to run on both
- systems.
+ systems. The code is to be found in Appendix~\ref{app:code}.

  As work is done remotely, the following commands are used:
  \unixcmd{matlab-bin -nojvm -nodesktop -nosplash -nodisplay < \%\%PROGRAM\%\%}
- \unixcmd{octave -q \%\%PROGRAM\%\%
+ \unixcmd{octave -q \%\%PROGRAM\%\%}

  \section{Results}
- All experiments are run 20 times the best solution is choosen
+ All experiments are run 5 times the best solution is choosen and will be
+ resented at table~\ref{tab:result}. Iteration size is set to 1000. For $n=20$
+ the best fitness history is shown at figure~\ref{fig:fitness}.
+
+ \begin{table}
+ \caption{Best known values of low-autocorrelation problem}
+ \begin{center}
+ \begin{tabular}{| r | r | r | r | }
+ \hline
+ $n$ & \multicolumn{3}{|c|}{Best fitness} \\
+   & known & MCS & SA \\
+ \hline \hline
+ 20 & 7.6923 & 4.3478 & 4.7619 \\ \hline
+ 50 & 8.1699 & 2.4752 & 2.6882 \\ \hline
+ 100 & 8.6505 & 1.8342 & 1.7470 \\ \hline
+ 200 & 7.4738 & 1.8678 & 1.5733 \\ \hline
+ 222 & 7.0426 & 1.5657 & 1.4493 \\ \hline
+ \end{tabular}
+ \label{tab:result}
+ \end{center}
+ \end{table}
+
+ \begin{figure}[htp]
+ \begin{center}
+ \subfloat[SA]{\label{fig:edge-a}\includegraphics[scale=0.4]{sa-fitness.eps}}
+ \subfloat[MCS]{\label{fig:edge-b}\includegraphics[scale=0.4]{mcs-fitness.eps}}
+ \end{center}
+ \caption{Fitness throughout the iterations}
+ \label{fig:fitness}
+ \end{figure}

  \section{Conclusions}
- \newpage
+ Looking at the graphs the fitness was still increasing so a larger iteration
+ size would make the fitness a better result. Secondly the \emph{SA} is
+ preforming much worse on large number than the \emph{MCS} one. It seems to the
+ temperature function is not working as expected.
+
+ Both algoritms are preforming much worse then the best found solutions.
  \section{Appendix 1}
+ \label{app:code}
  \include{autocorrelation.m}
  \include{initseq.m}
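For context, the "Best known $f$" values tabulated in the report are merit factors of the low-autocorrelation binary sequence problem. Assuming the standard definition (not spelled out in the changeset itself), the quantities involved are:

  % Assumed standard LABS definitions behind the tabulated merit factors;
  % plain display math, no extra packages required.
  \[
    E_k(s) = \sum_{i=1}^{n-k} s_i\, s_{i+k}, \qquad
    F(s) = \frac{n^2}{2 \sum_{k=1}^{n-1} E_k(s)^2}, \qquad s_i \in \{-1,+1\}.
  \]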
liacs/nc/low-correlation/sa-call.m (r45 → r46)

  %% Basic variables
  iterations = 10000;
- length = 20;
+ length = 222;

  …
  grid on;
  legend(sprintf('Length %i',length));
- print('sa-fitness.eps','-depsc2');
+ print(sprintf('sa-fitness-%f.eps',max(fitness_history)),'-depsc2');
+ max(fitness_history)
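sa-call.m drives an sa() routine that is not shown in this changeset. Since the report's conclusions question its temperature schedule, here is a generic simulated-annealing sketch for the same problem, purely illustrative and not the repository's sa.m; the bit-flip neighbourhood, geometric cooling, and parameter values are all assumptions.

  function [iteration_history, fitness_history] = sa_sketch(len, iterations)
      % Random start in {-1,+1}^len
      s = 2 * (rand(len,1) > 0.5) - 1;
      fitness = autocorrelation(s);
      best_fitness = fitness;
      T = 1.0;                         % assumed initial temperature
      alpha = 0.999;                   % assumed geometric cooling factor
      iteration_history = [];
      fitness_history = [];
      for iteration = 1:iterations
          % Propose a neighbour by flipping one random position
          candidate = s;
          i = randi(len);
          candidate(i) = -candidate(i);
          candidate_fitness = autocorrelation(candidate);
          delta = candidate_fitness - fitness;
          % Accept improvements always, deteriorations with Boltzmann probability
          if delta > 0 || rand() < exp(delta / T)
              s = candidate;
              fitness = candidate_fitness;
              best_fitness = max(best_fitness, fitness);
          end
          T = alpha * T;               % cool down
          iteration_history = [ iteration_history, iteration ];
          fitness_history = [ fitness_history, best_fitness ];
      end
  end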