Timestamp: Dec 19, 2009, 5:44:54 PM
Author: Rick van der Zwet
Message: Final result report
Location: liacs/nc/laser-pulse-shaping
Files: 3 edited

  • liacs/nc/laser-pulse-shaping/latex.mk

    r48 → r49

    %.m.tex: %.m
    -        highlight --include-style --linenumbers --no-doc \
    -          --latex --input $< --output $@
    +        highlight --linenumbers --no-doc --latex --style-outfile highlight.sty \
    +          --style vim --wrap --line-length 80 --line-number-length 3 \
    +          --input $< --output $@
  • liacs/nc/laser-pulse-shaping/pso.m

    r47 → r49

    % Dimention settings
    -parameters = 80;
    +parameters = 10;

    % If global optimum does not change this many steps bail out
    …
    % Flock properties
    -local_swarm_size = 10;
    -local_swarms = 50;
    +local_swarm_size = 50;
    +local_swarms = 10;

    %% Particle properties
    …
    end

    +% Display hack
    +g_fitness
    g_best

    …
    grid on;
    legend(sprintf('Parameters %i',parameters));
    -print(sprintf('pso-fitness-%f.eps', max(fitness_history)),'-depsc2');
    +print(sprintf('pso-fitness-%.10f.eps', max(fitness_history)),'-depsc2');
  • liacs/nc/laser-pulse-shaping/report.tex

    r48 → r49

    \usepackage{color}
    \usepackage{subfig}
    +\usepackage{marvosym}
    \floatstyle{ruled}
    \newfloat{result}{thp}{lop}
    \floatname{result}{Result}
    +
    +\input{highlight.sty}

    \title{Laser Pulse Shaping problem\\
    …
    \section{Introduction}
    The report is focused on the so-called \emph{laser pulse shaping problem}.
    -Todays lasers are also used within the range of atoms or mulecule research.
    +Today's lasers are also used in atom and molecule research.
    Using small pulses it is able to align and alter the movement of the atoms.

    …
    laser pulse the way it can move the atoms.

    +To turn and tweak all the 'knobs' at the same time, a \emph{Particle
    +Swarm Optimizer (PSO)} is used to explore the search space. The \emph{PSO} is
    +basically a collection of individual agents which all try to find an optimum
    +in the search space. During this search they get input about other potential
    +bests from the whole swarm (broadcast style) and the neighborhood
    +(observation), and using these values they determine their new location.
    +
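A minimal, self-contained Octave sketch of this idea is shown below. It is illustrative only: this is not the pso.m of this changeset, the fitness function is a toy, and the variable names are made up.

        % Toy PSO: each agent remembers its personal best and is also pulled
        % towards the best position broadcast by the whole swarm.
        fitness = @(p) -sum(p.^2);            % toy fitness to maximize, optimum at 0
        n = 10; agents = 20;                  % parameters per agent, number of agents
        x = rand(agents, n);                  % positions
        v = zeros(agents, n);                 % velocities
        p_best = x; p_fit = -Inf(agents, 1);  % personal bests
        g_best = x(1, :); g_fit = -Inf;       % swarm (global) best
        for it = 1:100
          for a = 1:agents
            f = fitness(x(a, :));
            if f > p_fit(a)                   % better own position found
              p_fit(a) = f; p_best(a, :) = x(a, :);
            end
            if f > g_fit                      % better swarm-wide position found
              g_fit = f; g_best = x(a, :);
            end
          end
          % move: keep some momentum, head for own best and for the swarm best
          v = 0.7 * v + rand(agents, n) .* (p_best - x) ...
                      + rand(agents, n) .* (repmat(g_best, agents, 1) - x);
          x = x + v;
        end
        g_fit                                 % best fitness found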
    \section{Problem description}
    +A laser pulse going through a crystal produces light at the octave of its
    +frequency spectrum. The total energy of the radiated light is proportional to
    +the integrated squared intensity of the primary pulse. Explicitly, the
    +time-dependent profile of the laser field in our simulations is given by:
    +\begin{equation}
    +\label{eq:simulation}
    +E(t) = \int_{-\infty}^\infty A(\omega)\exp(i\phi(\omega))\exp(i{\omega}t)\,d\omega,
    +\end{equation}
    +
    +where $A(\omega)$ is a Gaussian window function describing the contribution of
    +different frequencies to the pulse and $\phi(\omega)$, the phase function,
    +equips these frequencies with different complex phases.
    +
    To determine the best solution a fitness function is needed, which could be
    found in the shape of equation~\ref{eq:fitness}
    \begin{equation}
    -\label{eg:fitness}
    -SHG = \int_0^T E^4(t)dt \rightlongarrow maximization
    +\label{eq:fitness}
    +SHG = \int_0^T E^4(t)dt \longrightarrow maximization
    \end{equation}

    -
    -Due to the huge 'exploding' posibilities it is not possible to walk trough the
    -whole list of posibities, so we need alternative approches to tackle this
    -problem. First we will try the \emph{Monte Carlo Search algoritm [MCS]} and next try
    -\emph{Simulated Annealing [SA]}.
    -
    -\emph{MCS} is all about random sequence generation and trying to find a good
    -solution, do a small ajustment on the solution and compare the new solution
    -again.
    -
    -\emph{SA} takes a more and less the same aproch, but it also accept small
    -'losses' from time-to-time. But as time passes it get less likely to accept
    -'bad solutions'.
    -
    +Note that $0 < SHG < 1$
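A rough Octave sketch of how equation~\ref{eq:simulation} and this fitness could be evaluated numerically is given below. The real SHG.m and SHGa.m are not shown in this changeset, so the grids, the use of $|E(t)|^4$ for the complex field and the missing normalization to $0 < SHG < 1$ are all assumptions of this sketch.

        % Illustrative evaluation of E(t) and the SHG fitness for one phase vector.
        w   = linspace(-5, 5, 256);              % frequency grid (arbitrary units)
        A   = exp(-w.^2);                        % Gaussian window A(omega)
        phi = 2*pi*rand(1, length(w));           % phase function: the parameters to optimize
        t   = linspace(0, 1, 512);               % time grid, T = 1
        E   = zeros(1, length(t));
        for k = 1:length(t)
          % E(t) = integral over omega of A(w) exp(i phi(w)) exp(i w t)
          E(k) = trapz(w, A .* exp(i*phi) .* exp(i*w*t(k)));
        end
        SHG = trapz(t, abs(E).^4)                % integral of E^4 over [0, T]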
    \section{Statistics}
    -Of course many people have run numerous computer hours on finding the best
    -possible fitness as shown in table~\ref{tab:best}. The algoritms used to find
    -those numbers are not found.
    -
    -\begin{table}
    -  \caption{Best known values of low-autocorrelation problem}
    -  \begin{center}
    -    \begin{tabular}{| l | c | r | }
    -      \hline
    -      $n$ & Best known $f$ \\
    -      \hline \hline
    -      20  & 7.6923 \\ \hline
    -      50  & 8.1699 \\ \hline
    -      100 & 8.6505 \\ \hline
    -      200 & 7.4738 \\ \hline
    -      222 & 7.0426 \\ \hline
    -    \end{tabular}
    -  \end{center}
    -  \label{tab:best}
    -\end{table}

    \section{Approach}
    -The \emph{MCS} is implemented streight-forward from the wikipedia
    -page\footnote{http://en.wikipedia.org/wiki/Monte\_Carlo\_method}. The mutation
    -choosing is flipping one bit in the array.
    -
    -For the \emph{SA} implemented also comes streight from 'the book'
    -\footnote{http://en.wikipedia.org/wiki/Simulated\_annealing}, the process
    -choosing for the cooling-down sequence is taken from the dutch wikipedia page
    -'Simulated annealing', which is nice indicator when to choose something when
    -the solution is worse (logically, beter solutions will always be accepted).
    -\footnote{http://nl.wikipedia.org/wiki/Simulated\_annealing}:
    -\begin{math}
    -p = e^{\frac{f(i)-f(j)}{c}}
    -\end{math}
    +The Wikipedia page 'Particle swarm
    +optimization'~\footnote{http://en.wikipedia.org/wiki/Particle\_swarm\_optimization}
    +contains a reference implementation that was used to implement the algorithm. The nice
    +part about the algorithm is its flexibility in tuning: the \emph{PSO} has many
    +'knobs' which can be tweaked as well, such as the likeliness of heading for the
    +global optimum, the neighborhood optimum and the local optimum.

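An illustrative Octave fragment of such a velocity update with separate weights for the three attractors is shown below; the weight names and values are invented for this sketch and need not match the variables in pso.m.

        % One particle's update, pulled towards three different bests.
        parameters = 10;
        x = rand(1, parameters);  v = zeros(1, parameters);
        p_best = rand(1, parameters);   % the particle's own best so far
        n_best = rand(1, parameters);   % best within its local swarm (neighborhood)
        g_best = rand(1, parameters);   % best of the whole flock (global)

        w_inertia = 0.7; w_local = 1.0; w_hood = 1.5; w_global = 2.0;
        v = w_inertia * v ...
            + w_local  * rand(1, parameters) .* (p_best - x) ...
            + w_hood   * rand(1, parameters) .* (n_best - x) ...
            + w_global * rand(1, parameters) .* (g_best - x);
        x = x + v;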
    \section{Implementation}
    …
    \unixcmd{octave -q \%\%PROGRAM\%\%}

    +The flock is represented as a 3-D block. A slice of that block contains a
    +local swarm; a column in a slice is an individual particle.
    +
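One possible Octave layout of such a block is sketched below; the exact dimension order used in pso.m is an assumption here.

        % flock(i, j, k): parameter i of particle j in local swarm k
        parameters       = 10;
        local_swarm_size = 50;
        local_swarms     = 10;
        flock = rand(parameters, local_swarm_size, local_swarms);

        swarm    = flock(:, :, 3);   % one slice: a complete local swarm
        particle = flock(:, 7, 3);   % one column of that slice: a single particle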
    \section{Results}
    -All experiments are run 5 times the best solution is choosen and will be
    -resented at table~\ref{tab:result}. Iteration size is set to 1000. For $n=20$
    -the best fitness history is shown at figure~\ref{fig:fitness}.
    +The program is run with a parameter set of size 80. The algorithm is allowed to run
    +for 10 minutes with a maximum of 1000 iterations. If there is no improvement
    +after 5 iterations it will bail out as well.

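A sketch of these run limits in Octave is given below; the variable names and the structure of the loop are illustrative, not taken from pso.m.

        max_iterations = 1000;          % hard cap on iterations
        max_seconds    = 10 * 60;       % 10 minutes of wall-clock time
        max_stale      = 5;             % bail out after this many non-improving iterations

        t0 = time(); stale = 0; g_fitness = -Inf;
        for it = 1:max_iterations
          new_fitness = g_fitness;      % placeholder: one PSO iteration would go here
          if new_fitness > g_fitness
            g_fitness = new_fitness; stale = 0;
          else
            stale = stale + 1;
          end
          if stale >= max_stale || time() - t0 > max_seconds
            break;                      % bail out
          end
        end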
    -\begin{table}
    -  \caption{Best known values of low-autocorrelation problem}
    -  \begin{center}
    -    \begin{tabular}{| r | r | r | r | }
    -      \hline
    -      $n$ & \multicolumn{3}{|c|}{Best fitness} \\
    -          & known & MCS & SA \\
    -      \hline \hline
    -      20  & 7.6923 & 4.3478 & 4.7619 \\ \hline
    -      50  & 8.1699 & 2.4752 & 2.6882 \\ \hline
    -      100 & 8.6505 & 1.8342 & 1.7470 \\ \hline
    -      200 & 7.4738 & 1.8678 & 1.5733 \\ \hline
    -      222 & 7.0426 & 1.5657 & 1.4493 \\ \hline
    -    \end{tabular}
    -    \label{tab:result}
    -  \end{center}
    -\end{table}
    +The algorithm is kind of 'social', i.e. it favors the neighborhood and the
    +global optimum more than its own local optimum. It is also active, meaning it
    +changes direction quickly the moment an optimum is found somewhere else.
    +
    +There are 50 local swarms, each with 10 agents.
    +
    +After running 5 times, the best fitness found was $0.0000045481$. Improvement is
    +almost only seen in the first 100 iterations (see figure~\ref{fig:pso-fitness});
    +afterwards it quickly stalls. Trying with a smaller number of parameters shows
    +more or less the same result, as seen in figure~\ref{fig:pso-small}.

    \begin{figure}[htp]
      \begin{center}
    -    \subfloat[SA]{\label{fig:edge-a}\includegraphics[scale=0.4]{sa-fitness.eps}}
    -    \subfloat[MCS]{\label{fig:edge-b}\includegraphics[scale=0.4]{mcs-fitness.eps}}
    +    \subfloat[80]{\label{fig:pso-fitness}\includegraphics[scale=0.4]{pso-fitness.eps}}
    +    \subfloat[10]{\label{fig:pso-small}\includegraphics[scale=0.4]{pso-fitness-10.eps}}
      \end{center}
      \caption{Fitness throughout the iterations}
    …
    \end{figure}

    +Changing various flags like the walk speed $wander$ or the socialness of
    +the agents does not appear to change the algorithm's result much.
    +
    \section{Conclusions}
    -Looking at the graphs the fitness was still increasing so a larger iteration
    -size would make the fitness a better result. Secondly the \emph{SA} is
    -preforming much worse on large number than the \emph{MCS} one. It seems to the
    -temperature function is not working as expected.
    +Given the lack of external results from other algorithms in large-scale setups
    +(80 parameters), it is hard to say whether this is a well or poorly performing
    +algorithm.

    -Both algoritms are preforming much worse then the best found solutions.
    +For further research there are many knobs within the algorithm to tweak as well.
    +One could also think of implementing an algorithm around this setup.
    +
    \section{Appendix 1}
    \label{app:code}
    -\include{autocorrelation.m}
    -\include{initseq.m}
    -\include{mcs-call.m}
    -\include{mcs.m}
    -\include{mutation.m}
    -\include{randint.m}
    -\include{sa-call.m}
    -\include{sa.m}
    +\include{pso.m}
    +\include{SHGa.m}
    +\include{SHG.m}
    \end{document}