Mesh size influence.
 

As mentioned before, we performed our computations on a 5x10 mesh, assuming that the results would be acceptable and that we would save a lot of computation time.
The aim of this section is to show that these two hypotheses were justified.

To check this, we built 7 different meshes containing from 50 to 5000 cells.
We then ran the same "test" computation on each mesh.
The test consisted in computing the flow for a given Rayleigh number (Ra=1854) under identical conditions.

For each of these computations, we wrote to a file the velocity computed at the center of the box at each iteration.
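To give an idea of how this logging could be done, here is a minimal Python sketch; the solver calls (step, center_velocity), the file name and the convergence criterion are placeholders standing for the actual flow-solver routines, which are not reproduced in this report.

    def run_test_case(step, center_velocity, ra=1854, max_iter=100000, tol=1e-8,
                      logfile="velocity_50cells.dat"):
        """Run the Ra = 1854 test and log the center velocity at every iteration.

        `step` and `center_velocity` are hypothetical placeholders for the actual
        solver calls (advance the flow solver by one iteration, and extract the
        velocity at the center of the box)."""
        v_old = 0.0
        with open(logfile, "w") as f:
            for it in range(1, max_iter + 1):
                step(ra)                      # one iteration of the flow solver
                v = center_velocity()         # velocity at the center of the box
                f.write(f"{it} {v:.6e}\n")
                # stop when the center velocity has stabilised (assumed criterion)
                if it > 1 and abs(v - v_old) < tol * max(abs(v), 1e-30):
                    return v, it
                v_old = v
        return v, it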

We obtained 7 data files, which are plotted below:


 

Two phenomena clearly appear: first, the limit velocity (reached once convergence is achieved) is not the same for each mesh, and second, the number of iterations required for convergence also differs.
 

1) Comparison of the limit velocities.

We can see on the previous graph that the limit velocity seems to decrease as the number of cells in the mesh increases.
This evolution is more obvious on the next graph:

We can see on this graph that the velocity converges to a limit as the size of the mesh increases.
This limit is roughly 4.5x10^-5 m/s, whereas the computation on the 50-cell mesh gave a velocity of 6.5x10^-5 m/s.
This difference is quite large, but as stated above, we were interested in the behaviour of the solution (which is the same for every mesh) and not in the value itself.

Another remark can be made about the behaviour of the velocity: when the velocity decreases, everything happens as if Ra were getting closer to Rac.
We can therefore consider that each mesh has its own critical Rayleigh number, and that this critical Ra increases with the size of the mesh.
This interpretation points in the same direction as the remark we made in the 2nd chapter about the work of Cayrol & Perchat.

This provides an explanation for the low value we have found for Rac so far.
We can even imagine that, being aware of this behaviour, we could perform the tests on the small 50-cell mesh and then correct the result by scaling it with the right factor.
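To illustrate this idea, a very rough Python sketch of such a scaling is given below, using the two velocities quoted above; whether a single constant factor is sufficient (and how it depends on Ra) is of course an assumption.

    # Illustration of the scaling idea with the two velocities quoted above
    # (whether one constant factor is enough is an assumption).
    v_50cells = 6.5e-5       # m/s, limit velocity obtained on the 50-cell mesh
    v_fine_limit = 4.5e-5    # m/s, limit towards which the fine meshes converge

    scaling_factor = v_fine_limit / v_50cells    # roughly 0.69

    def corrected(v_on_50cell_mesh):
        """Estimate the fine-mesh velocity from a 50-cell computation."""
        return scaling_factor * v_on_50cell_mesh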

 
2) Comparison of the number of iterations.
We already began to see on the first graph that the number N of iterations required for convergence increases with the size of the mesh.
This evolution is more visible on the next graph:

As we can see, this number N increases roughly as Ncells, where Ncells is the number of cells in the mesh.
This growth law may seem rather slow, but it must also be noted that, since the cost of one iteration itself grows roughly with Ncells, the total computation time for a mesh containing Ncells cells scales roughly as T=(Ncells)^2.
Indeed, for the last computation on the 5000-cell mesh, we had to wait more than 4 hours before convergence was achieved, whereas for the 50-cell mesh this time was around 1 minute.
Moreover, since the number N also increases dramatically as Ra gets closer to Rac, it is clearly not realistic to try to determine the critical Rayleigh number on such a large mesh.
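For reference, here is a small Python sketch of how the growth law of N could be checked from the logged data, assuming the file names and format of the logging sketch given earlier; the list of mesh sizes is also an assumption, since only the 50-cell and 5000-cell meshes are quoted explicitly.

    import numpy as np

    # Sketch: estimate the exponent p in N_iterations ~ Ncells**p from the
    # 7 test computations (file names and mesh sizes are assumptions matching
    # the logging sketch above; each file contains one "iteration velocity"
    # line per solver step).
    mesh_sizes = [50, 200, 500, 1000, 2000, 3500, 5000]   # hypothetical cell counts
    n_iterations = []
    for n in mesh_sizes:
        data = np.loadtxt(f"velocity_{n}cells.dat")
        n_iterations.append(data.shape[0])     # number of iterations until convergence

    # Least-squares fit in log-log coordinates: log N_iter = p*log Ncells + const
    p, const = np.polyfit(np.log(mesh_sizes), np.log(n_iterations), 1)
    print(f"N_iterations grows roughly like Ncells^{p:.2f}")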

Under these conditions, we can even wonder how the students of previous years were able to determine the critical Rayleigh number on a 6000-cell mesh with less than one percent of error (I want to know what the trick is, Alex !!!).
 
 
 

Conclusion:
It is now clear that the best method for computing a critical Rayleigh number is not to use a very refined mesh (where the results are more accurate but the computation takes a very long time), but to use a smaller mesh, correct the computed velocities with an appropriate correction law, and finally deduce an approximation of Rac from the corrected values.
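As a sketch of what this small-mesh + correction-law procedure could look like in practice, one may fit the limit velocities of the 7 meshes with a simple power law and extrapolate to an infinitely fine mesh; the functional form, the list of mesh sizes and the file naming below are assumptions made only for illustration, not the method actually used here.

    import numpy as np
    from scipy.optimize import curve_fit

    # Sketch of a possible correction law (the power-law form, the mesh sizes
    # and the log-file naming are assumptions; only the 50-cell and 5000-cell
    # meshes are quoted explicitly in the text).
    def correction_law(n_cells, v_inf, a, b):
        """Assumed form: the limit velocity tends to v_inf as the mesh is refined."""
        return v_inf + a * n_cells ** (-b)

    mesh_sizes = np.array([50, 200, 500, 1000, 2000, 3500, 5000])  # hypothetical sizes

    # Limit velocity of each mesh = last logged value in the corresponding file.
    v_limit = np.array([np.loadtxt(f"velocity_{n}cells.dat")[-1, 1] for n in mesh_sizes])

    popt, _ = curve_fit(correction_law, mesh_sizes, v_limit, p0=[4.5e-5, 1e-3, 0.5])
    v_inf, a, b = popt
    print(f"Velocity extrapolated to an infinitely fine mesh: {v_inf:.2e} m/s")

    # A new coarse-mesh result can then be corrected without running a fine mesh:
    def corrected_velocity(v_measured, n_cells):
        """Remove the estimated mesh-dependent part of the computed velocity."""
        return v_measured - a * n_cells ** (-b)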