## Parallelizing loops in MATLAB – Nested parfor

Parallelizing a loop in MATLAB can be as easy as changing:

```matlab
for idx = 1:N
    % some statements
end % of idx loop
```


to

```matlab
parfor idx = 1:N
    % some statements
end % of idx loop
```


However, you cannot nest parfor loops in MATLAB, so the following will NOT work:

```matlab
parfor idx1 = 1:N
    parfor idx2 = 1:M
        % some statements using idx1 and idx2
    end % of idx2 loop
end % of idx1 loop
```


An easy workaround is to combine the two loops into a single loop and then recover idx1 and idx2 inside the loop as follows:

```matlab
parfor masterIDX = 1:(N*M)
    [idx2, idx1] = ind2sub([M, N], masterIDX);
    % some statements using idx1 and idx2
end % of masterIDX loop
```


This can be used to combine as many nested loops as needed:

```matlab
parfor masterIDX = 1:(N1*N2*...*Nn)
    [idxn, ..., idx2, idx1] = ind2sub([Nn, ..., N2, N1], masterIDX);
    % some statements using idx1, idx2, ..., idxn
end % of masterIDX loop
```


Of course, to parallelize loops like this, all of the iterations need to be independent of each other; that is, no iteration may depend on the results of another.
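The index bookkeeping above can be sanity-checked outside MATLAB too. Here is the same column-major decomposition sketched in Python (NumPy's `unravel_index` with `order='F'` mirrors MATLAB's `ind2sub`; the loop sizes are made up for illustration):

```python
import numpy as np

M, N = 3, 4  # inner and outer loop sizes (hypothetical)

for master_idx in range(1, M * N + 1):  # 1-based, like MATLAB's masterIDX
    # order='F' requests column-major decomposition, matching ind2sub
    i2, i1 = np.unravel_index(master_idx - 1, (M, N), order='F')
    idx2, idx1 = i2 + 1, i1 + 1  # back to 1-based subscripts
    # ... statements using idx1 (1..N) and idx2 (1..M) go here ...

# spot check: linear index 5 in a 3-by-4 array corresponds to row 2, column 2
assert np.unravel_index(5 - 1, (3, 4), order='F') == (1, 1)
```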

## Clearing Memory Cache

Ubuntu caches memory. While that is useful in most cases, it can cause problems in others. You can clear the cache using the following command:

```shell
sync && echo 3 | sudo tee /proc/sys/vm/drop_caches
```

Make sure that you have root privileges before issuing the above command. On servers, you can put the command in a shell script and schedule it with cron to run every now and then.

## Interpolation, Extrapolation, and Curve Fitting Using Higher-Order Polynomials

There are numerous cases where you want to transfer data from one set of locations (the source grid) onto another (the destination grid). Let's say you have measured the concentration of some pollutant along a river every 1 km from the river mouth. If you now want to know the concentration at a location where there is no measurement, say 3.5 km from the river mouth, you need to interpolate your data to that location. Extrapolation serves the same purpose, with the exception that the location you are interested in lies outside the region where you have measurements.

Interpolation reproduces the measured data at the known locations exactly; the interpolant returns the same values on the source grid. This is a data-driven approach. Sometimes, however, you want a model-driven approach. In that case you do not reproduce the exact data at the known locations; instead, you fit a curve (a model) to the known data on the source grid and estimate the value at the destination grid from the fitted curve.
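As a quick illustration of this distinction, here is a Python/NumPy sketch with made-up sample values (not part of any particular package):

```python
import numpy as np

# Measured data on the source grid (made-up sample values)
xs = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
fs = np.array([0.0, 0.8, 0.9, 0.1, -0.8])

# Data-driven: a piecewise-linear interpolant reproduces the data exactly
f_interp = np.interp(xs, xs, fs)
assert np.allclose(f_interp, fs)

# Model-driven: a degree-2 least-squares fit generally does NOT pass
# through every point; it minimizes the squared misfit instead
coeffs = np.polyfit(xs, fs, deg=2)
f_fit = np.polyval(coeffs, xs)
assert not np.allclose(f_fit, fs)
```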

There are many methods to perform the above task. Most of them can be shown simply as:

$f_{destination}=P \times f_{source}$

I like to think of P as a projector, a transformation that interpolates/extrapolates/transforms/projects data from a source grid onto a destination grid. What is interesting is that, for most methods, the entries of P depend only on how the points of the source and destination grids are located relative to each other. Therefore, as long as the points of the two grids have not moved relative to each other, the matrix P does not change.

This property can be used to transfer data from one grid onto another more efficiently. In cases where the source and destination grids do not change but the data on the source grid changes constantly and needs to be projected onto the destination grid, the matrix P can be generated once; after that, each transfer is only a matrix multiplication. This happens in many cases, particularly in the solution of Partial Differential Equations (PDEs). Software packages such as NCL, ESMF, and SCRIP call this matrix, i.e. P, the interpolation weights.
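As a minimal sketch of this reuse idea, here are 1D piecewise-linear weights built once in Python/NumPy (the grids and values are hypothetical):

```python
import numpy as np

# Fixed source and destination grids (hypothetical 1D example)
xs = np.linspace(0.0, 4.0, 5)     # source grid
xd = np.array([0.5, 1.5, 2.5])    # destination grid

# Build P once: each row holds the two linear-interpolation weights
# of one destination point, so that f_destination = P @ f_source
P = np.zeros((xd.size, xs.size))
for r, xp in enumerate(xd):
    j = np.searchsorted(xs, xp) - 1           # left neighbour on the source grid
    w = (xp - xs[j]) / (xs[j + 1] - xs[j])    # fractional distance to it
    P[r, j], P[r, j + 1] = 1.0 - w, w

# Reuse the same P for any data living on this source grid
fs = xs**2
assert np.allclose(P @ fs, np.interp(xd, xs, fs))
```

Once P is stored, new data on the same source grid costs only one matrix multiplication to transfer.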

I have recently developed another package in MATLAB which constructs the matrix P based on a 2D polynomial of the chosen degree, i.e.:

$f(x_d,y_d)=\sum_{i=0}^{n}{\sum_{j=0}^{n}{a_{i,j}x_d^iy_d^j}}$

Here n is the degree of the polynomial. If you choose n=1, you get bilinear interpolation; n=3 gives cubic interpolation; n=4 gives fourth-order Lagrange interpolation. Although you can choose n to be anything you like, remember that large n suffers from the Runge phenomenon.
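The Runge phenomenon is easy to demonstrate. The following Python/NumPy sketch interpolates Runge's classic function $1/(1+25x^2)$ on equispaced nodes and shows that the maximum error grows with the degree:

```python
import numpy as np

# Runge's classic example on [-1, 1]
f = lambda x: 1.0 / (1.0 + 25.0 * x**2)

xf = np.linspace(-1.0, 1.0, 401)   # fine grid for measuring the error

def max_error(n):
    """Max error of the degree-n interpolant on n+1 equispaced nodes."""
    xn = np.linspace(-1.0, 1.0, n + 1)
    p = np.polynomial.Polynomial.fit(xn, f(xn), n)
    return np.abs(p(xf) - f(xf)).max()

# Raising the degree on equispaced nodes makes the edge oscillations worse
assert max_error(14) > max_error(4)
```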

The command in MATLAB has the following form:

```matlab
P = ConstructInterpolator(xs, ys, xd, yd, nPoly, nInterp)
```

where nPoly is the degree of the polynomial, and nInterp tells the program how many points to use to determine the coefficients. Let's say you have chosen nPoly=4; there would then be $(nPoly+1)^2=25$ coefficients $a_{i,j}$. If nInterp is set to 25, the program looks for the 25 closest points and uses them to determine the coefficients. Depending on how the points are distributed, you may end up with ill-conditioned or even singular matrices, so it is sometimes better to use more points than the minimum requirement. If you set nInterp to something bigger than 25 in this case, the coefficients are determined using a least-squares approach, and you are then actually curve fitting.
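To sketch what the least-squares branch does (a hypothetical Python/NumPy illustration, not the actual ConstructInterpolator code): with more source points than coefficients, the Vandermonde-type system is overdetermined and is solved in the least-squares sense:

```python
import numpy as np

rng = np.random.default_rng(0)

nPoly = 2                      # hypothetical degree: (nPoly+1)^2 = 9 coefficients
nInterp = 20                   # more points than coefficients -> least squares

# Scattered source points around the destination point (made-up data)
x = rng.uniform(-1.0, 1.0, nInterp)
y = rng.uniform(-1.0, 1.0, nInterp)
fsrc = 1.0 + 2.0 * x + 3.0 * y + 0.5 * x * y   # data from a low-degree model

# One column per basis function x^i * y^j, with i, j = 0..nPoly
terms = [(i, j) for i in range(nPoly + 1) for j in range(nPoly + 1)]
A = np.column_stack([x**i * y**j for i, j in terms])

# Overdetermined system: solve for the coefficients a_{i,j}
a, *_ = np.linalg.lstsq(A, fsrc, rcond=None)

# Evaluate the fitted polynomial at a destination point
xd, yd = 0.3, -0.2
fd = sum(ak * xd**i * yd**j for ak, (i, j) in zip(a, terms))
assert np.isclose(fd, 1.0 + 2.0 * xd + 3.0 * yd + 0.5 * xd * yd)
```

Because the made-up data actually comes from a degree-2 model here, the fit recovers it exactly; with noisy real data the fitted curve would only approximate the source values, which is the curve-fitting behaviour described above.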

This code can be used for interpolation/extrapolation/curve fitting on 2D structured grids, unstructured grids, or even scattered data/points. The source and destination grids do not need to be of the same type.

You can download the code from the MathWorks File Exchange (here). The code is accompanied by some Test_*.m functions. I suggest running those first to get acquainted with the program.

Should you have any questions, feel free to send me an e-mail or ask here on the blog. I do my best to answer them as quickly as possible, but sometimes I might be late.

## C, CUDA-C, and MATLAB interfaces to the CUDA-C implementation of SEBS are now available for public download

Finally, I uploaded the code and an example of how to use it to code.google.com. You can access the C, CUDA-C, and MATLAB interfaces to CUDA-C there. I am going to add some more documentation.

Just one note on the MATLAB interface: you most probably need to go to the cu_src subdirectory first and run make on your own system before using the MATLAB interface. This updates the PTX files. I have only briefly tested the MATLAB interface. It is faster than the pure MATLAB version of SEBS; however, it is not as fast as the CUDA-C version executed outside MATLAB (well, that was expected anyway).

The code project is available on:

regards,

## Slow network connection on Ubuntu

I recently got a new PC with an ASUS mainboard. I installed both Windows and Ubuntu 11.10. Everything was fine, except that on Ubuntu the network connection was slow, disconnected quite often, and downloads from the Internet halted without completing or arrived corrupted; in general it was not performing well.

It couldn't be the Internet connection, since this is my PC at the university, connected to the Internet through a wired connection. More importantly, once I logged on to Windows 7 64-bit, the Internet and network worked just fine, so it couldn't be malfunctioning hardware either. That left our beloved Ubuntu.

So, the first step was to check if the proper drivers are loaded using the following command:

```shell
lspci -v
```

I scrolled through the output and found my Ethernet controller. It was:

“Ethernet controller: Realtek Semiconductor Co., Ltd. RTL8111/8168B”

Then I checked which kernel driver was in use:

Kernel driver in use: r8169

So it was clear why it was behaving funny: the wrong driver was in use. Here is how to fix it.

First you need to download the latest driver from the Google Code project (click here). At the time of this writing, the latest driver was "r8168-8.032.00.tar.bz2".

Unpack this file, and type:

```shell
sudo make clean modules install
sudo depmod -a
```

followed by

```shell
sudo insmod r8168.ko
```

r8168.ko is in the src folder of the driver that you just unpacked and compiled. insmod, as its name suggests, inserts a module into the running kernel. Obviously you need root access for all of these commands, and if you do not have the compilers installed, get them with "sudo apt-get install build-essential".

Now, to make this driver available at boot time, you need to issue the following command:

```shell
sudo mkinitramfs -o /boot/initrd.img-$(uname -r) $(uname -r)
```

Also add the r8168 module to /etc/modules (just add one line at the end containing r8168).

And to make sure that r8169 is not loaded, add "blacklist r8169" to /etc/modprobe.d/blacklist.conf.

You should be good to go.

## exFAT for your USB Flash Drives

It is not that uncommon to use multiple operating systems. I personally use Mac, Linux, and for some applications also Windows. In these situations you often want to transfer files between computers running different operating systems. One commonly used method is a flash drive (I know, even today, with all the cool networking possibilities).

Flash drives are getting cheaper these days and are available in much higher capacities. My biggest flash drive is 16 GB, and even that one is sometimes full. In any case, you need to format the flash drive with a file system that is available on all operating systems (OSes). FAT is a good choice, but remember that its performance and effective space decrease as the drive capacity increases. Furthermore, FAT cannot store files bigger than 4 GB (and yes, files easily exceed 4 GB these days).

NTFS is quite efficient on large volumes, provides additional security (at least on Windows), and can store large files; in fact, the maximum file size is way bigger than the biggest hard drive available on the market at the time of this writing. :D So, no shortage on that front. However, it is not widely supported on other OSes. Mac OS only supports reading from an NTFS drive, not writing to it. You can get NTFS-3G or other third-party drivers to enable writing to NTFS drives (check my other posts). However, I noticed that its performance has gone awry on Mountain Lion, to the point that it keeps reading from the NTFS drives and slows down the entire system. Perhaps the commercial version does not have these problems, but as long as there are free options, why should we pay? After all, we are poor students.

The replacement could be the exFAT file system. It has been around for quite a while (since Windows XP and CE 6.0). It is natively supported by Mac OS 10.6.5 and later (both read and write; by the way, if you haven't upgraded your Mac OS already, do so), and it supports large files (up to 16 EiB; I think this should be enough for most of us for quite some time :D, let me know if you have larger files :D).

The only thing is that it is not supported on Linux right out of the box. But don't worry, as usual a very cheap (meaning free) solution exists. As usual, I will only explain how to enable read and write support for exFAT on Ubuntu and Debian.

The only thing that you need to do is to install exFAT support using apt-get as follows:

```shell
sudo add-apt-repository ppa:relan/exfat
sudo apt-get update
sudo apt-get install fuse-exfat exfat-utils
```

That was it. You are done. Enjoy your exFAT-formatted drive on Debian Linux as well as on Mac and Windows.


## Animating Graphic Files

If you do any numerical simulation or modeling, it happens quite often that you want to show a series of solution snapshots one after another to make an animation of how the solution evolves. There is different software available to do this, but we are going to limit ourselves to the freely available tools on Linux and Mac.

Even if you are not animating graphics, if you want to perform a certain task on multiple graphic files or change their format, this post is going to be helpful.

Using ImageMagick:

I had another post about ImageMagick. You can install ImageMagick on Ubuntu as follows:

```shell
sudo apt-get install imagemagick
```

or use the "port" command if you are on Mac. ImageMagick provides a command-line tool, convert, with which you can perform several operations on graphic files. For example, if you want to convert a PostScript file into a JPEG file and rotate it by 90 degrees counterclockwise, you issue:

```shell
convert -rotate -90 inputfile.ps outputfile.jpg
```

Now let's say you have one file per day for a year; for instance, you have created a map of air temperature at 10:00 am each day and named the files yyyy_ddd.ps, where yyyy is the year and ddd is the day of the year, such as 2010_1.ps, 2010_2.ps, and so on. You can then use a for loop together with convert as follows:

```shell
for ((i=1;i<366;i++))
do
  echo "day: $i"
  convert -rotate "-90" 2010_$i.ps 2010_$i.jpg
done
```

If you have numbered the files with zero padding, say 2010_001.ps, 2010_002.ps, and so on, you can use the following snippet (note the command substitution around printf):

```shell
for ((i=1;i<366;i++))
do
  echo "day: $i"
  convert -rotate "-90" 2010_$(printf "%03d" $i).ps 2010_$(printf "%03d" $i).jpg
done
```

Using ffmpeg:

You can install this package using apt-get as follows:

```shell
sudo apt-get install ffmpeg
```

or use the equivalent port command on Mac. Now, if you want to animate your JPG files, you can issue:

```shell
ffmpeg -r fps -i /InputFiles/Prefix%3d.jpg Output.avi
```

where fps is the number of frames per second.