Journal Updates
As part of the Canadian Distributed Mentorship Program, here is where I will be posting my journal entries relating to the work done.
Week Twelve: July 17 - July 21
Mmmm, it sure sucks to be back. By the weekend, I had just reached the optimal "I'm too lazy to even move off the couch" momentum. It's alarming how quickly inertia can get ahold of you.
After a few cups of coffee and the initial shock of "ew, this code looks ugly", I decided to take the first day and clean the code up. With the prettified code and the input probability distributions ready from last week, all I had to do was implement the iterative algorithm and see what results it produced. Well, land of the run-time errors, subculture of one, is what it produced. Correctly indexing about 10 different arrays across twice as many nested for loops is no task for the faint of heart.
Onward and upward! I had a lot of bugs to fix, some of which were typing errors and could be left to monkeys to find. Others weren't as easy, and caused the system of equations to never converge. Among these was the way I was calculating the p(past) distribution, which I explained last week. Fixing this did solve some problems, such as convergence, but the code still acted strangely. For example, there are a lot of KL-distances being calculated in the equations, and these kept turning up negative. The Kullback-Leibler (KL) distance is a measure of the distance from a true probability distribution to a target one. In other words, it calculates what is lost if the wrong distribution is used, or, in Information Theory terms, how many additional bits of information are needed to encode the truth using the wrong distribution. Anywhoo, the KL distance is defined to be non-negative. Getting negative results is, by all means, offendingly incorrect.
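For the curious, the KL distance from a true distribution P to a target Q is D(P||Q) = sum over x of p(x) log( p(x)/q(x) ), which is non-negative by Gibbs' inequality. Here is a minimal sketch in Python (my own illustration, not the project's actual code), using base-2 logs to match the "bits" interpretation; one classic way to get an impossible negative value is feeding in unnormalized distributions:

```python
import numpy as np

def kl_distance(p, q, eps=1e-12):
    """D(P || Q) = sum_x p(x) * log2(p(x) / q(x)), in bits.

    Non-negative by Gibbs' inequality; zero exactly when P == Q.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Normalize defensively: passing in unnormalized "distributions"
    # is a classic way to end up with an impossible negative result.
    p = p / p.sum()
    q = q / q.sum()
    mask = p > 0  # terms with p(x) = 0 contribute 0 by convention
    return float(np.sum(p[mask] * np.log2(p[mask] / (q[mask] + eps))))

# Sanity check: the distance is non-negative for arbitrary distributions.
rng = np.random.default_rng(0)
p, q = rng.random(5), rng.random(5)
assert kl_distance(p, q) >= 0.0
```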
By the end of the week, I had tackled the negative KL distance problem, and the code converged. However, the algorithm systematically picked the same action as being the optimal one, namely the float, and did so after one iteration. Both of these results tickled my spidey senses: very rarely does something converge this easily and this fast.
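For what it's worth, one-iteration "convergence" in a fixed-point loop is often the convergence test itself misfiring. A hypothetical sketch of the pattern (again my own, not the real code): if the update step accidentally hands back its input unchanged, the successive-iterates test passes immediately, no matter what the actual equations say.

```python
import numpy as np

def iterate_to_convergence(update, x0, tol=1e-8, max_iter=1000):
    """Generic fixed-point iteration: repeat x <- update(x) until
    successive iterates differ by less than tol."""
    x = np.array(x0, dtype=float)
    for it in range(1, max_iter + 1):
        x_new = update(x)
        if np.max(np.abs(x_new - x)) < tol:
            return x_new, it
        x = x_new
    raise RuntimeError("did not converge in %d iterations" % max_iter)

# A buggy update that returns its input untouched "converges" instantly.
x, iters = iterate_to_convergence(lambda x: x, np.ones(3))
print(iters)  # 1 -- suspiciously fast, just like the real thing
```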
It is with great disappointment that I notice a decrease in the quality of my entries. Terrible code is taking its toll, and I am becoming bitterly passive-aggressive towards it. Since I am spending most of my time debugging, deep down at the bottom of the programming food chain, I have no time to learn anything interesting. The other day, my lab-mate politely asked me "how things were", only to be faced with a wave of expletives. At this time, "things" are converging too fast to be true... :)