Retrospective Notebook: This entry was not written on this day, but derived later from working notes I made that day.
Began working on the biobrick simulator HUD. This would be a little display screen that sits in one corner of your Second Life window, and is where a huge part of the biobrick simulator 'lives'. It gives access to levels, parts, biobrick device-building tools, unbind commands, help files, and all sorts of guidance.
I knew that I would need to display all sorts of textual information on this screen, so I began looking at a whiteboard object that someone else on the team had found. This whiteboard (called xyzzy-text, and available for free) can take a message sent to it by a user and translate it into a series of textures on its surface encoding the letters of the message. This is *extremely* cool.
It actually works using only a single texture (image file), containing all the letters of the alphabet plus symbols for punctuation. Then, based on the input sentence, it calculates which part of that image holds the symbol we want to display, and transfers those coordinates to SL's rendering system. The script actually controls a number of display tiles, arrayed in rows and columns, giving each one the coordinates for a single character of the sentence we want to display.
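The coordinate lookup above is easy to sketch. This is not the actual xyzzy-text code (which is LSL, and whose real texture layout I'm not reproducing here); it's a hypothetical Python illustration assuming a font texture laid out as a 10x10 grid of character cells in ASCII order, starting from space:

```python
# Hypothetical sketch of the atlas lookup, NOT the real xyzzy-text layout:
# assume a 10x10 grid of character cells, in ASCII order from space (32).
GRID = 10          # assumed cells per row/column in the font texture
FIRST_CHAR = 32    # assumed first character in the atlas (space)

def char_to_offsets(ch):
    """Return (u, v) texture offsets in [0, 1) for one character's cell."""
    index = ord(ch) - FIRST_CHAR
    row, col = divmod(index, GRID)
    cell = 1.0 / GRID                 # width/height of one cell
    return (col * cell, row * cell)   # corner of that character's cell

def sentence_to_offsets(text):
    """Map each character of a message to the offsets its tile should use."""
    return [char_to_offsets(ch) for ch in text]

print(sentence_to_offsets("BioBrick"))
```

Each tile then just needs its one pair of offsets pushed to the renderer; in SL terms, that's the part the script hands off to the texture-offset parameters of each display prim.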
I wound up not using very much of the xyzzy-text code, since I knew I would have to edit it a lot to implement the features the biobrick simulator HUD needs, all while not using most of the features it already had. It made more sense to start from the minimum of what I would require. I did keep a few critical pieces of the whiteboard, though: the texture with all the characters on it, and the function that calculates the coordinates in that texture from a string of letters. These two parts were crucial for displaying text in the HUD, and it would have taken me a very long time to develop anything as good.
Xyzzy-text actually has a more advanced text display function that would have been nice to use, but that I wasn't able to understand whatsoever. The system I borrowed displays a single letter per tile, so on the biobricker's 8x15 display I can have up to 120 letters or symbols. But xyzzy-text has a second, more advanced display system that shows *two* characters per tile, basically doubling the resolution. This makes much better use of your screen's real estate, and the HUD could definitely use a few more letters here and there! But the cost of this system is greater complexity: each tile can only display a single texture at a time (or a small part of one, as the simple xyzzy-text system uses). So, we can't simply take the part of the texture for letter A and glue it to the part for letter Z. To display two characters per tile, we need to generate textures with every single possible two-character permutation! And then, we need to write code to break the input sentence into pairs of letters, compute which texture, and where inside it, holds the two-letter pair we're looking for, and send those coordinates to the tile object for display.
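To make the combinatorics concrete, here's a hedged sketch of that pair-based scheme, again with an invented layout rather than xyzzy-text's real one: assume a 64-symbol character set, so 64*64 = 4096 ordered pairs, packed 16x16 = 256 pairs per texture, which means 16 pre-generated textures cover every pair:

```python
import string

# Assumed 64-symbol character set (space, letters, a few digits, punctuation);
# the real xyzzy-text set differs, this is just for illustration.
CHARSET = " " + string.ascii_uppercase + string.ascii_lowercase + "012345678.,"
PER_SIDE = 16                    # assumed pair cells per row/column per texture
PER_TEX = PER_SIDE * PER_SIDE    # 256 pairs in each texture

def split_pairs(text):
    """Break the message into two-character chunks, padding an odd tail."""
    if len(text) % 2:
        text += " "
    return [text[i:i + 2] for i in range(0, len(text), 2)]

def pair_to_location(pair):
    """Return (texture_index, u, v): which texture holds this pair, and where."""
    index = CHARSET.index(pair[0]) * len(CHARSET) + CHARSET.index(pair[1])
    tex, cell = divmod(index, PER_TEX)   # which of the 16 textures
    row, col = divmod(cell, PER_SIDE)    # which cell inside that texture
    size = 1.0 / PER_SIDE
    return (tex, col * size, row * size)

for pair in split_pairs("BioBrick"):
    print(pair, pair_to_location(pair))
```

The doubling of resolution is paid for entirely in asset count: one texture for the simple system versus sixteen (under these assumptions) for the pair system, plus the bookkeeping above.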
I barely understood how the simple system worked, so trying to make use of the advanced xyzzy-text display system was out of the question. But I've learned a lot since early summer, and the limit of 13-14 characters per object name is starting to pinch (how many letters in 'Repressilator'? 13, phew! How about 'TetR Repressable Promoter'?). So, I'll put this on the wishlist for later.