Changeset 46 for liacs


Timestamp: Dec 18, 2009, 1:43:33 AM
Author: Rick van der Zwet
Message: Report finished = sleep
Location: liacs/nc/low-correlation
Files: 1 added, 6 edited

  • liacs/nc/low-correlation/autocorrelation.m (r44 → r46)

        %-----------------------------------------------------------------------
        function [f] = autocorrelation(pop)
    -   % Given a population of binary sequences, this function calculates the merit
    -   % function according to the formula specified in the exercise description.  The
    -   % input pop is the given matrix.
    -   % The output f is the merit factor calculated (row vector).
    +   % Given a population of binary sequences, this function calculates
    +   % the merit function according to the formula specified in the
    +   % exercise description. The input pop is the given matrix. The
    +   % output f is the merit factor calculated (row vector).

        n = size(pop,1);
        …
        E = zeros(1,m);

    -   % Calculated efficiently in a matrix-notation; auxilary matrices - Y1,Y2 - are
    -   % initialized in every iteration. They are shifted form of the original y
    -   % vectors. The diagonal of the dot-squared Y2*Y1 matrix is exactly the inner
    -   % sum of merit function.
    +   % Calculated efficiently in a matrix-notation; auxilary matrices -
    +   % Y1,Y2 - are initialized in every iteration. They are shifted form
    +   % of the original y vectors. The diagonal of the dot-squared Y2*Y1
    +   % matrix is exactly the inner sum of merit function.
        for k=1:n-1
            Y1=pop(1:n-k,:);
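    By all signs the merit function here is the standard Golay merit factor: for a
    sequence s in {-1,1}^n with aperiodic autocorrelations C_k = sum_{i=1}^{n-k} s_i*s_{i+k},
    the factor is F = n^2 / (2 * sum_{k=1}^{n-1} C_k^2). This matches the best-known
    values quoted in report.tex below (7.6923 for n = 20 corresponds to energy E = 26,
    since 400/52 = 7.6923). A minimal unvectorized sketch of the same computation for a
    single sequence, assuming each column of pop holds one sequence (consistent with
    n = size(pop,1) above); the helper name merit_factor is hypothetical:

        % Merit factor of one +/-1 column vector s -- a straightforward loop
        % version of what the vectorized autocorrelation.m computes.
        function [f] = merit_factor(s)
          n = size(s,1);
          E = 0;
          for k = 1:n-1
            C = sum(s(1:n-k) .* s(k+1:n));  % aperiodic autocorrelation at shift k
            E = E + C^2;                    % energy: sum of squared correlations
          end
          f = n^2 / (2*E);                  % Golay merit factor
        end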
  • liacs/nc/low-correlation/mcs-call.m (r43 → r46)

        % best_20 = [-1,1,1,-1,-1,-1,1,1,-1,1,-1,-1,-1,1,-1,1,1,1,1,1];

    -   % Basic variables
    -   iterations = [1:10:200];
    -   repetitions = 20;
    -   length = 20;
    +   %% Basic variables
    +   iterations = 1000;
    +   length = 222;
    +   repetitions = 1;

        % Plot the stuff
    -   fitnesses = [];
    -   for iteration = iterations
    -     fitness = [];
    -     for rep = 1:repetitions
    -       fprintf('Iter:%i, Rep:%i\n',iteration,rep);
    -       [new_fitness, value] =  mcs(length,iteration);
    -       fitness = [fitness, new_fitness];
    +   [iteration_history, fitness_history] = mcs(length,iterations);

    -       % Little hack to display output nicely
    -       % disp(rot90(value,-1));
    -     end
    -     fitnesses = [fitnesses,mean(fitness)];
    -   end
    -
    -   plot(iterations,fitnesses);
    +   plot(iteration_history,fitness_history);
        title(sprintf('Monte-Carlo Search on Low-Corretation set - repetitions %i',repetitions));
        ylabel('fitness');
        …
        grid on;
        legend(sprintf('Length %i',length));
    -   print('mcs-fitness.eps','-depsc2');
    +   print(sprintf('mcs-fitness-%f.eps',max(fitness_history)),'-depsc2');
    +   max(fitness_history)
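    The r43 driver averaged 20 repetitions over a range of iteration counts; r46
    instead plots the best-so-far history of one long run. If the averaged curves
    were ever wanted back on top of the new interface, a sketch along these lines
    would do (variable names hypothetical; len stands in for length, which these
    scripts shadow as a variable):

        reps = 5;                   % assumed repetition count
        len = 222; iterations = 1000;
        histories = zeros(reps, iterations);
        for rep = 1:reps
          [it_hist, fit_hist] = mcs(len, iterations);  % new r46 interface
          histories(rep,:) = fit_hist;
        end
        plot(1:iterations, mean(histories,1));  % averaged best-so-far curve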
  • liacs/nc/low-correlation/mcs.m (r43 → r46)

        % autocorrelation(best_20);

    -   function [fitness,value] = mcs(length, iterations)
    +   function [iteration_history,fitness_history] = mcs(length, iterations)
    +     iteration_history = [];
    +     fitness_history = [];
    +
          best_fitness = 0;
    -     for i = 1:iterations
    +     for iteration = 1:iterations
            % Generate a random column s={-1,1}^n
            n = length;
        …
              best_fitness = fitness;
            end
    +       iteration_history = [ iteration_history, iteration ];
    +       fitness_history = [ fitness_history, best_fitness ];
          end
    -     fitness = best_fitness;
    -     value = best_value;
        end
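    Pieced together, the r46 mcs is a plain best-of-N random search whose history
    vectors feed the plot in mcs-call.m. A runnable sketch of the whole function;
    the diff elides the middle, so the candidate-generation lines below are an
    assumption, and autocorrelation is the function from this changeset:

        function [iteration_history, fitness_history] = mcs(length, iterations)
          iteration_history = [];
          fitness_history = [];

          best_fitness = 0;
          for iteration = 1:iterations
            % Generate a random column s={-1,1}^n
            n = length;
            s = sign(rand(n,1) - 0.5);     % assumed: uniform random +/-1 column
            s(s == 0) = 1;                 % sign(0) is 0; force it to +1
            fitness = autocorrelation(s);  % merit factor of this candidate
            if fitness > best_fitness      % keep the best fitness seen so far
              best_fitness = fitness;
            end
            iteration_history = [ iteration_history, iteration ];
            fitness_history = [ fitness_history, best_fitness ];
          end
        end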
  • liacs/nc/low-correlation/report.tex (r45 → r46)

        \usepackage{float}
        \usepackage{color}
    +   \usepackage{subfig}
        \floatstyle{ruled}
        \newfloat{result}{thp}{lop}
        …
        those numbers are not found.

    -   \begin{center}
        \begin{table}
          \caption{Best known values of low-autocorrelation problem}
    -     \begin{tabular}{| l | c | r | }
    -       \hline
    -       $n$ & Best known $f$ \\
    -       \hline \hline
    -       20  & 7.6923 \\ \hline
    -       50  & 8.1699 \\ \hline
    -       100 & 8.6505 \\ \hline
    -       200 & 7.4738 \\ \hline
    -       222 & 7.0426 \\ \hline
    -     \end{tabular}
    +     \begin{center}
    +       \begin{tabular}{| l | c | r | }
    +         \hline
    +         $n$ & Best known $f$ \\
    +         \hline \hline
    +         20  & 7.6923 \\ \hline
    +         50  & 8.1699 \\ \hline
    +         100 & 8.6505 \\ \hline
    +         200 & 7.4738 \\ \hline
    +         222 & 7.0426 \\ \hline
    +       \end{tabular}
    +     \end{center}
          \label{tab:best}
        \end{table}
    -   \end{center}

        \section{Approach}
        …
        \footnote{http://www.mathworks.com/products/matlab/}. There are small minor
        differences between them, but all code is made compatible to to run on both
    -   systems.
    +   systems. The code is to be found in Appendix~\ref{app:code}.

        As work is done remotely, the following commands are used:
        \unixcmd{matlab-bin -nojvm -nodesktop -nosplash -nodisplay < \%\%PROGRAM\%\%}
    -   \unixcmd{octave -q \%\%PROGRAM\%\%
    +   \unixcmd{octave -q \%\%PROGRAM\%\%}

        \section{Results}
    -   All experiments are run 20 times the best solution is choosen
    +   All experiments are run 5 times the best solution is choosen and will be
    +   resented at table~\ref{tab:result}. Iteration size is set to 1000. For $n=20$
    +   the best fitness history is shown at figure~\ref{fig:fitness}.
    +
    +   \begin{table}
    +     \caption{Best known values of low-autocorrelation problem}
    +     \begin{center}
    +       \begin{tabular}{| r | r | r | r | }
    +         \hline
    +         $n$ & \multicolumn{3}{|c|}{Best fitness} \\
    +             & known & MCS & SA \\
    +         \hline \hline
    +         20  & 7.6923 & 4.3478 & 4.7619 \\ \hline
    +         50  & 8.1699 & 2.4752 & 2.6882 \\ \hline
    +         100 & 8.6505 & 1.8342 & 1.7470 \\ \hline
    +         200 & 7.4738 & 1.8678 & 1.5733 \\ \hline
    +         222 & 7.0426 & 1.5657 & 1.4493 \\ \hline
    +       \end{tabular}
    +       \label{tab:result}
    +     \end{center}
    +   \end{table}
    +
    +   \begin{figure}[htp]
    +     \begin{center}
    +       \subfloat[SA]{\label{fig:edge-a}\includegraphics[scale=0.4]{sa-fitness.eps}}
    +       \subfloat[MCS]{\label{fig:edge-b}\includegraphics[scale=0.4]{mcs-fitness.eps}}
    +     \end{center}
    +     \caption{Fitness throughout the iterations}
    +     \label{fig:fitness}
    +   \end{figure}

        \section{Conclusions}
    -   \newpage
    +   Looking at the graphs the fitness was still increasing so a larger iteration
    +   size would make the fitness a better result. Secondly the \emph{SA} is
    +   preforming much worse on large number than the \emph{MCS} one. It seems to the
    +   temperature function is not working as expected.
    +
    +   Both algoritms are preforming much worse then the best found solutions.
        \section{Appendix 1}
    +   \label{app:code}
        \include{autocorrelation.m}
        \include{initseq.m}
  • liacs/nc/low-correlation/sa-call.m (r45 → r46)

        %% Basic variables
        iterations = 10000;
    -   length = 20;
    +   length = 222;


        …
        grid on;
        legend(sprintf('Length %i',length));
    -   print('sa-fitness.eps','-depsc2');
    +   print(sprintf('sa-fitness-%f.eps',max(fitness_history)),'-depsc2');
    +   max(fitness_history)
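    sa.m itself is not touched by this changeset, only its driver, so nothing here
    shows the temperature schedule the report's conclusion suspects. Purely as a
    hypothetical illustration of the kind of loop sa-call.m drives, a textbook
    simulated-annealing sketch with single-bit flips and geometric cooling might
    read as follows; every name and parameter below is an assumption, not the
    repository's code:

        % Hypothetical sketch only -- sa.m is not part of this changeset.
        function [iteration_history, fitness_history] = sa_sketch(len, iterations)
          s = sign(rand(len,1) - 0.5); s(s == 0) = 1;  % random start in {-1,1}^n
          fitness = autocorrelation(s);
          best_fitness = fitness;
          T = 1.0;                        % assumed initial temperature
          alpha = 0.999;                  % assumed geometric cooling factor
          iteration_history = 1:iterations;
          fitness_history = zeros(1, iterations);
          for iteration = 1:iterations
            candidate = s;
            i = ceil(len * rand());       % pick one position and flip it
            candidate(i) = -candidate(i);
            new_fitness = autocorrelation(candidate);
            % Always accept improvements; accept worse moves with
            % Boltzmann probability exp(delta / T).
            if new_fitness > fitness || rand() < exp((new_fitness - fitness) / T)
              s = candidate;
              fitness = new_fitness;
            end
            best_fitness = max(best_fitness, fitness);
            T = T * alpha;                % cool down geometrically
            fitness_history(iteration) = best_fitness;
          end
        end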