Team:Berkeley Software/BingNotebook/Notes
From 2009.igem.org
Revision as of 00:10, 24 June 2009
Week One
Sixpi 18:34, 2 June 2009 (UTC)
iGEM starts again for me! Last year I was on the Berkeley wetlab team - this year, I'll be working on the software tools team. This week, we're getting a bunch of tutorials from Doug to get everyone caught up with where Clotho is now, and also thinking about which of the subproject(s) we want to talk about. I will probably be working on database support, and I would like to help with the robot automation/assembly algorithm project. I've put up a short bio of myself here.
Sixpi 15:55, 3 June 2009 (UTC)
I talked with Doug yesterday about how the DB manager should work in the new Clotho. The new DB manager won't be a rewrite of what I have now; what I have now should probably be renamed the DB browser. The new DB manager will be the default gateway for information traveling in and out of Clotho, so we want it to be as flexible and powerful as possible while staying easy to use. Out of our discussion, we decided that a binding file of some sort is still needed, along with a list of internal types/keywords that all Clotho tools understand and a binding file to translate between external types and internal ones. Once data passes through the binding manager, another piece of code will be responsible for checking whether the external data model is compatible enough with the Clotho data model for the tools to do their work. We still need to consider how different the topologies of the external and internal data models can be before the tools are no longer able to traverse the external data intelligently.
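A minimal sketch of the binding idea described above: a binding maps field names from some external database onto Clotho's internal keywords. The class and method names here are hypothetical illustrations, not the actual Clotho API.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: a binding translates external field names
// (e.g. column names from some lab's database) into Clotho's
// internal keywords. Names are illustrative, not the real API.
public class KeywordBinding {
    private final Map<String, String> externalToInternal = new HashMap<>();

    public void bind(String externalName, String internalKeyword) {
        externalToInternal.put(externalName, internalKeyword);
    }

    // Unknown names pass through unchanged, so a later compatibility
    // check can decide whether the external model is usable.
    public String toInternal(String externalName) {
        return externalToInternal.getOrDefault(externalName, externalName);
    }
}
```

The pass-through for unbound names matches the idea that a separate compatibility checker, not the binding itself, decides whether the external model is close enough to Clotho's.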
Week Two
Sixpi 18:02, 9 June 2009 (UTC)
Yesterday we had our general meeting of the week, where we decided who would give their minipresentation on which day next week. I'll be going on Monday. Our last team member, Lesia, arrived in the afternoon and started getting acquainted with Clotho.
Nina and I also went to the wet lab again to talk to Jenn about more robot details. We figured out a few more things the robot automation piece will need to do: it should keep track of the amount of DNA needed to finish a set of assemblies and warn the user when there isn't enough sample left. We also need to break the protocol used in Chris' lab into pieces that are sufficiently general and configurable that they can be reused to easily build new robot automation protocols.
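The sample-tracking requirement above can be sketched roughly as follows. This is an assumed design for illustration only; the class name, units, and method signatures are mine, not anything decided at the meeting.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: before queueing robot assembly steps, check
// whether each DNA sample has enough material left, and deduct what
// each step consumes so the user can be warned before running out.
public class SampleTracker {
    private final Map<String, Double> microlitersLeft = new HashMap<>();

    public void addSample(String sampleId, double microliters) {
        microlitersLeft.put(sampleId, microliters);
    }

    // Returns true and deducts the volume if the sample can cover the
    // draw; returns false (a cue to warn the user) if it cannot.
    public boolean draw(String sampleId, double microliters) {
        double left = microlitersLeft.getOrDefault(sampleId, 0.0);
        if (left < microliters) {
            return false;
        }
        microlitersLeft.put(sampleId, left - microliters);
        return true;
    }

    public double remaining(String sampleId) {
        return microlitersLeft.getOrDefault(sampleId, 0.0);
    }
}
```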
Sixpi 18:20, 12 June 2009 (UTC)
I had a meeting with Doug about the new data model on Wednesday, and he liked my idea of having a core set of keywords plus a set of extension keyword files. Some features Doug wants include automatic connectivity detection and alternate path usage. I should also come up with some toy examples we can refer to, instead of always drawing random diagrams that may or may not be relevant.
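One way the core-plus-extensions idea could work, sketched under my own assumptions (the real keyword-file format was not settled at this point): the core set is fixed, and extension files may add new keywords but not redefine existing ones.

```java
import java.util.HashSet;
import java.util.Set;

// Hypothetical sketch: a registry seeded with the core keyword set;
// extension keyword files may only add keywords, never clash with
// ones already registered. Names here are illustrative.
public class KeywordRegistry {
    private final Set<String> keywords = new HashSet<>();

    public KeywordRegistry(Set<String> coreKeywords) {
        keywords.addAll(coreKeywords);
    }

    // Rejects the whole extension if any keyword collides, so a bad
    // extension file cannot silently shadow core vocabulary.
    public boolean extend(Set<String> extensionKeywords) {
        for (String k : extensionKeywords) {
            if (keywords.contains(k)) {
                return false;
            }
        }
        keywords.addAll(extensionKeywords);
        return true;
    }

    public boolean isKnown(String keyword) {
        return keywords.contains(keyword);
    }
}
```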
I should also be helping Nina get something working for the robot automation project this weekend for her deadline next Friday.
Week Three
Sixpi 06:27, 15 June 2009 (UTC)
I've done some work this weekend on filling out a data model that will be compatible (more or less) with the current JCA database and parts of the JBEI. I've also been researching existing persistence frameworks for Java, and whether any of them are suitable for use on this project. I'll post more after my minipresentation during the meeting tomorrow.
Sixpi 18:23, 19 June 2009 (UTC)
At the database meeting this week, I presented a complete high-level picture of how the new Clotho will bring in data from the database. We will have a set of files that define the objects the rest of Clotho wants to use and the fields those objects should have. These files get parsed, and an internal data-model structure is built from them. The user specifies a database to connect to, and a mapping module then uses metadata from the database together with user input to map database table and column names onto Clotho object and field keywords. The user also specifies the connectivity of the objects at this point. A code generator uses the mapped data-model structure to build a Java class for each object, along with a Hibernate mapping file. These classes all extend the Datum base class so the rest of Clotho can use them. The classes are compiled dynamically, and then (hopefully) everything will work.
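The code-generation step in the pipeline above might look something like this. It is only a sketch of the string-emission part; the field types, the dynamic compilation step, and the Hibernate mapping file are all omitted, and the Datum base class is assumed to exist elsewhere in Clotho.

```java
// Hypothetical sketch of the code-generation step: given a mapped
// object name and its field keywords, emit the source of a class
// extending the Datum base class. Compiling the source dynamically
// and writing the Hibernate mapping file are left out.
public class DatumCodeGen {
    public static String generate(String objectName, String[] fieldKeywords) {
        StringBuilder src = new StringBuilder();
        src.append("public class ").append(objectName).append(" extends Datum {\n");
        for (String field : fieldKeywords) {
            // Fields are Strings here for simplicity; the real mapping
            // step would pick types from the database metadata.
            src.append("    private String ").append(field).append(";\n");
        }
        src.append("}\n");
        return src.toString();
    }
}
```

In a full version, the generated source would be handed to a compiler invoked at runtime and the resulting class loaded so the rest of Clotho can use it through the Datum interface.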
We also came up with an API for both the core and the Datum objects.