Friday, June 24, 2005

Mystified

Since the last post, I have made some modifications to the algorithm for simulating the performance of the LDPC codes. For one thing, instead of fixing the number of message blocks to encode, transmit and decode per simulation, I now fix the total number of message bits (at 100,000 bits), so the number of blocks varies with the code length (block size).
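For the curious, here is a minimal sketch of what that loop looks like, assuming a rate-1/2 code and an arbitrary Eb/N0; it is not my actual code, and the LDPC encoder/decoder are just stand-ins (uncoded BPSK with a hard decision), purely to show how the block count is derived from the fixed 100,000-bit message budget.

```python
import numpy as np

# Sketch only: the real LDPC encode/decode routines are not shown here.
# The point is how num_blocks scales with the block length n when the
# total number of message bits is held fixed.

TOTAL_MESSAGE_BITS = 100_000

def simulate_ber(n, rate=0.5, ebn0_db=2.0, rng=np.random.default_rng(0)):
    """Estimate BER for block length n at a given Eb/N0 (dB)."""
    k = int(n * rate)                      # message bits per block (assumed rate)
    num_blocks = TOTAL_MESSAGE_BITS // k   # number of blocks varies with block size
    ebn0 = 10 ** (ebn0_db / 10)
    noise_std = np.sqrt(1 / (2 * rate * ebn0))

    bit_errors = 0
    for _ in range(num_blocks):
        msg = rng.integers(0, 2, size=k)
        codeword = msg                          # stand-in for LDPC encoding
        tx = 1 - 2 * codeword                   # BPSK: 0 -> +1, 1 -> -1
        rx = tx + noise_std * rng.normal(size=codeword.size)
        decoded = (rx < 0).astype(int)          # stand-in for LDPC decoding
        bit_errors += np.count_nonzero(decoded != msg)

    return bit_errors / (num_blocks * k)

if __name__ == "__main__":
    for n in (10, 50, 100, 500):
        print(f"n={n:4d}  BER ~ {simulate_ber(n):.4e}")
```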

The first resulting curve, for n=10, was beautiful! Marvellous! I thought I was finally on the right track! Then I generated the curves for n=50 and n=100, and guess what - they showed higher BER than the n=10 curve. Now that's baffling! Larger block sizes should give better performance! What is wrong now?

And another baffling thing - since the total number of message bits processed is fixed for all block sizes, the simulation time should not increase with block size. But it does! The n=500 simulation did not finish in the 1.5 - 2 hrs that the n=10, n=50 and n=100 simulations took. Yesterday, it ran for 8 hrs without getting even close to completion. I see no logic behind these weird occurrences!

I am, in every way, MYSTIFIED!
