Moose philosophy

Moose is an analysis platform that allows us to assess software systems, or data in general.

Assessment is defined by Webster's dictionary as the evaluation or estimation of the nature, quality, or ability of someone or something. In other words, assessment is what we do when we want to know, and it is what we should do before making a decision.

Assessment is an act of reasoning. If we want to know, we have to get to know. We cannot get knowledge implanted into our heads. For now, if we want to know, we have to learn. We have to gather the facts, understand their structure and relationships, and interpret them from the point of view of the problem at hand.

An important challenge is posed by large amounts of data. When we cannot grasp all details in a short amount of time, we require tools that do the grunt work for us and provide us with the accurate overview required for interpreting the situation.

However, not just any tool does the job. Many tools exist out there to deal with various aspects of data and software analysis, but most of them take the oracle way: they offer predefined analyses that provide answers to standard questions. That is great when you have a standard question. However, it turns out that most of the time our questions are not quite standard. In these situations, regardless of how smart the analysis is, it is of little use to the analyst.

Moose explicitly tackles this problem with a philosophy derived from one core principle: treat the analyst as an intelligent human being.

First, Moose allows the analyst to control the flow of analysis as much as possible. While multiple ready-made tools are available, it is the analyst who decides the route of the analysis. This is achieved through an object-oriented design:

  1. data is exposed at all times,
  2. the analyst can choose from the various tools that can be applied to manipulate the data, and
  3. the resulting data is again made available to the analyst.

Following this approach, the answer to custom questions can be composed out of smaller answers that can be obtained from standard tools.
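The three-step loop above can be sketched in a few lines. Note that Moose itself is scripted in Pharo; the Python below, with its invented class names and sample data, is only a hypothetical illustration of the principle that every step returns plain data which the analyst can inspect and feed into the next tool.

```python
# Hypothetical sketch: composing a custom answer from smaller,
# standard steps. All names and numbers here are made up for
# illustration; they are not the Moose API.
from dataclasses import dataclass

@dataclass
class ClassEntity:
    """Stand-in for an entity in a source-code model."""
    name: str
    methods: int
    lines: int

# (1) The data is exposed at all times: the model is just a
# collection of entities the analyst can look at directly.
model = [
    ClassEntity("Parser", methods=42, lines=1200),
    ClassEntity("Printer", methods=7, lines=150),
    ClassEntity("Cache", methods=25, lines=900),
]

# (2) A standard "tool" is applied: a filter. Its result is
# again plain data, available for further manipulation.
large = [c for c in model if c.lines > 500]

# (3) Another standard tool, a ranking, is applied to that result.
by_size = sorted(large, key=lambda c: c.methods, reverse=True)

# The custom question "which large classes carry the most behavior?"
# is answered by composing the two standard steps.
print([c.name for c in by_size])  # → ['Parser', 'Cache']
```

The point of the sketch is that no single predefined analysis answered the question; the answer emerged from chaining small steps whose intermediate results stayed visible.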

Second, Moose puts emphasis on interactive presentation. When it comes to large amounts of details, visualization is an important tool for grasping the structure of the data at a glance. For this reason, multiple visualizations are provided to expose common patterns. Good pictures can offer insights, but to be able to drill down and learn further, interaction is crucial. To this end, the complete user interface of Moose is interactive.

Third, and most important, Moose is not just a tool. Moose is a platform. The available tools are valuable, but they are often not sufficient, either because they require customization or because they miss features. Moose comes with an infrastructure that can be used to tweak the existing tools or to build entirely new custom ones. This infrastructure spans multiple dimensions, from advanced parsing and meta-modeling capabilities to building complete interactive tools. Add to all this the scripting possibilities and you get a fast prototyping environment in which new questions can be encoded cheaply and incrementally.

All in all, Moose offers a fresh perspective on how the assessment of software systems, or of data in general, can and should be done.

For example, various reports claim that up to 50% of the development effort is actually spent on understanding code. Given that 50% is quite large, you would expect a significant number of tools to be geared towards this activity. In practice, however, developers mostly rely on code reading. Modern IDEs do provide some help, but software systems are large and complex, and thus relying on reading them just does not scale when we want to understand them as a whole. For example, for a person who reads one line every two seconds, it would take approximately one month of work to read a quarter of a million lines of code. And this is just for reading the code. Even when more advanced tools are used, the current software development culture tends to promote tools as being something to conform to.
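The back-of-the-envelope estimate above can be verified with a quick calculation, assuming eight-hour working days:

```python
# Check the reading-time estimate from the text.
lines = 250_000           # a quarter of a million lines of code
seconds_per_line = 2      # assumed reading speed: one line every two seconds

total_seconds = lines * seconds_per_line       # 500,000 s
total_hours = total_seconds / 3600             # ≈ 138.9 h
working_days = total_hours / 8                 # ≈ 17.4 eight-hour days

print(round(total_hours, 1), round(working_days, 1))  # → 138.9 17.4
```

Roughly seventeen full working days of nothing but reading, line after line, comes to about a month of real calendar effort, which matches the claim.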

Moose turns this state of affairs upside down by empowering the development team to take complete control over the needed tools. And then it makes the process practical by bringing the cost of building a tool as close to zero as possible.

Granted, this switch is not straightforward. It took regression testing a decade to get established as a development practice. Testing used to be an activity for lesser programmers: it was tedious, it was repetitive, it involved many little details; it was everything a respectable programmer would hate. Now, you know a respectable programmer by the tests he writes.

The battle was not easy, but it was worth it, although you can still find skeptics pointing their fingers at the costs of writing tests, and at how this takes away from effort that could otherwise be spent on active programming. Indeed, at first writing tests does involve extra costs, but we know they save us great pain later on, because we know that as we write more code we unwittingly break assumptions made by the existing code. In other words, we accept that we cannot control all the details of the system ourselves, and so we ensure that the computer will do it for us.

Regression tests are indeed excellent assessment tools, but they concern only one aspect of software, namely its behavior. The internal quality of the structure of the system is as important. The interaction between the different technologies used is as important. The traceability of features is as important. The conformance to external guidelines and constraints is as important. The performance is as important. The security issues can be as important. Even the cleanness of the build system can be an important issue. And they all require active and continuous assessment.

Given the large costs at stake, we need a more prominent role for assessment during development. Moose makes this possible and practical. All it takes is for you to take the step.

Posted by Tudor Girba