After months of cluelessness, I finally figured out how to make recent comments appear in that tab you now see on the right; many thanks to the anonymous commenter! No more digging through old posts to see who's left a comment (I've occasionally discovered comments on several-week-old posts). Any idea how to make that "recent comments" list longer than 5? (Blogger doesn't seem to give me that option.)

Following Daniel's suggestion, I'll make this an open thread. Post your problems (open or solved), tell some jokes, start flamewars -- it's all good! [I've never had to remove a comment or "moderate" in general, but don't push me...]

A CMU undergrad (CS) asks for some generic research advice and I promise to devote a post to this shortly.


## 9 comments:


Let's try.

Assume there is a function f(x) which is convex and unimodal.

Consider the following (evolutionary) scheme for finding the (global) maximum of f(x).

1. Let N(m0,s0) be a Gaussian density with some large variance s0. We draw, say, N samples from N(m0,s0) and keep some small percentage (say 10%) of the better-valued (elite) samples. Namely, these are the samples whose "performance" f(x) is the highest among all N.

2. Next, we compute m1 and s1 as Maximum Likelihood estimates of the mean and covariance of the new Gaussian density based on these elite samples.

3. We sample from N(m1,s1) and repeat the process.

It can be shown that the mean of N(mi,si) converges to the global optimum of f(x) as i goes to infinity, "provided the sample size N is large enough."
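The three steps can be sketched in a few lines of Python (a minimal one-dimensional illustration of the scheme; the function, parameter values, and names below are my own choices, not from the original description):

```python
import numpy as np

def elite_gaussian_search(f, m0=0.0, s0=10.0, n_samples=200,
                          elite_frac=0.1, iters=50, seed=0):
    """Repeat: sample from N(m, s), keep the top elite_frac of samples
    by f-value, and refit the Gaussian to them (steps 1-3 above)."""
    rng = np.random.default_rng(seed)
    m, s = m0, s0
    n_elite = max(1, int(elite_frac * n_samples))
    for _ in range(iters):
        x = rng.normal(m, s, size=n_samples)    # step 1: draw N samples
        elite = x[np.argsort(f(x))[-n_elite:]]  # keep the best 10%
        m, s = elite.mean(), elite.std()        # step 2: ML refit
    return m, s                                 # step 3 is the loop itself

# Concave example: f(x) = -(x - 2)^2, global maximum at x = 2.
m, s = elite_gaussian_search(lambda x: -(x - 2.0) ** 2)
print(m, s)
```

On this concave example the mean lands essentially on the maximizer and the fitted standard deviation shrinks to near zero, consistent with the convergence claim.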

Questions:

1. What can be said about the variance of the distribution? Does it go to zero (that's a conjecture)?

2. What is the rate of convergence of the mean and variance, in terms of the number of elite samples?

3. What happens if f(x) is not a nice function? Say, it is multimodal. Perhaps something can be said in terms of the sample size N and the size of the elite samples set?
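Questions 1 and 2 are at least easy to probe numerically. The sketch below (my own illustration on a concave quadratic, not a proof) records the fitted standard deviation at each iteration: it shrinks by a roughly constant factor per step, consistent with the variance-goes-to-zero conjecture and with geometric convergence.

```python
import numpy as np

# Empirical probe of questions 1 and 2 on f(x) = -(x - 2)^2
# (an illustrative choice): track the fitted std across iterations.
rng = np.random.default_rng(1)
f = lambda x: -(x - 2.0) ** 2
m, s = 0.0, 10.0
n_samples, n_elite = 200, 20      # 10% elite fraction
stds = []
for _ in range(10):
    x = rng.normal(m, s, size=n_samples)
    elite = x[np.argsort(f(x))[-n_elite:]]   # 20 best samples
    m, s = elite.mean(), elite.std()
    stds.append(s)

# Per-iteration contraction factors of the std.
ratios = [b / a for a, b in zip(stds, stds[1:])]
print(stds[-1], ratios)
```

Each ratio stays well below 1 and roughly constant, i.e. the variance decays geometrically in this run; how the rate depends on the elite-set size is exactly the open question.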

"Convex" should in fact be "concave" (we are maximizing).

Nice optimization scheme! How does it compare to, say, interior point methods? Do you know anything about the rate of convergence of E[N(mi,si)] to max f(x)?

In fact, at this stage I cannot say anything more, and I am not sure that any results are available elsewhere. The rate of convergence of the mean is also an open issue for me, although I think it converges geometrically fast as a function of the elite sample set size. (Again, I don't have a formal proof for that.) To visualize the beauty of the method, consider the following MATLAB code for optimization of a multimodal function (it is simple & beautiful):

----------------------------------

close all; clear all;

col = ['b','r','g','y','k','c'];    % plot colors, cycled across iterations
N = 20; ro = ceil(log(N));          % sample size N, elite sample size ~ log(N)
mu = 1; eta = 7; x_opt = 0.9;       % parameters of the multimodal test function
iter = 30;                          % total number of iterations
av = 0; stdd = 20;                  % initial Gaussian parameters

figure;
xx = -5:0.01:5;
y = -(1+8*((sin(eta*(xx-x_opt)).^2).^2)+6*(sin(2*eta*(xx-x_opt).^2)).^2+mu*(xx-x_opt).^2);
subplot(2,1,2); hold on; grid on;
plot(xx,y,'k');                     % plot the function
axis([-2,3,-15,0]);

for i = 1:iter
    c = col(mod(i-1,length(col))+1);  % cycle colors (plain col(i) overruns for i > 6)
    dens = normpdf(xx,av,stdd);       % current density
    subplot(2,1,1); hold on; grid on;
    plot(xx,dens,[c,'-']);            % plot the density
    axis([-2,3,0,3]);
    x = random('norm',av,stdd,1,N);   % sample from the current density
    S = -(1+8*((sin(eta*(x-x_opt)).^2).^2)+6*(sin(2*eta*(x-x_opt).^2)).^2+mu*(x-x_opt).^2);
    [ignore,ind] = sort(S,'descend');
    elite = x(ind(1:ro));             % elite samples
    val_elite = S(ind(1:ro));         % elite sample performances
    subplot(2,1,2); hold on; grid on;
    plot(elite,val_elite,[c,'+']);    % plot the elite samples
    av = mean(elite);                 % ML estimate of the mean
    stdd = std(elite);                % ML estimate of the std
end
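For readers without MATLAB, here is a rough Python port of the same demo (plotting omitted). The seed, the larger sample size, and the 10% elite fraction are my own choices; with the original N = 20 and an elite set of 3, the search can easily stall in a local maximum of this multimodal function, and even with a large sample there is no guarantee of finding the global one.

```python
import numpy as np

# The multimodal objective from the MATLAB snippet; global max f(0.9) = -1.
mu, eta, x_opt = 1.0, 7.0, 0.9

def f(x):
    return -(1 + 8 * np.sin(eta * (x - x_opt)) ** 4
               + 6 * np.sin(2 * eta * (x - x_opt) ** 2) ** 2
               + mu * (x - x_opt) ** 2)

rng = np.random.default_rng(0)
m, s = 0.0, 20.0                  # initial Gaussian parameters, as in the demo
n_samples, n_elite = 2000, 200    # much larger than the original N = 20 (my choice)
for _ in range(40):
    x = rng.normal(m, s, size=n_samples)
    elite = x[np.argsort(f(x))[-n_elite:]]   # top 10% by performance
    m, s = elite.mean(), elite.std()

print(m, f(m))
```

In this run the mean homes in on the neighborhood of x_opt = 0.9 despite the many local maxima, which is the "beauty" the comment above refers to; shrink the sample size and it becomes much less reliable.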

Ahhhh- MATLAB!!! Run for your lives...

Ahem - well, the only thing I can say about MATLAB is to never swear at or curse it. Well, you can verbally, but never input 'you stupid idiot draw the graph' or 'stupid dog'!

PS: I'm not going to admit to doing that, but that's the word on the street. Oh, and writing 'sorry' afterwards doesn't make things better. You grow to learn that MATLAB is a sensitive soul. OK, I'll shut up now!
