The Units Of Quant Innovations: Workflows

Workflows (sequences of tasks) are responsible for the progressive-problems (with critical moments) nature of quant innovations, especially if they do not follow a one-size-fits-all strategy (one model, method, implementation…). Like all other units, sequences need a constructor; they are characterized by progressive problems and need solutions.

Process optimization

Let us make it practical. Suppose we want to optimize a process with steps for which analytic modeling is not feasible - technically or economically. The paper-making process, for example.

I've conducted a project where we used machine learning to create interpretable, computational models from data. Concretely, we used the multi-strategy, multi-model machine learning framework (mlf)…which we've developed to perform such comprehensive projects. It's built of several engines (realized in C++) that implement a symbolic machine learning language (built atop the Wolfram Language).

Paper production is a complex flow process with several process levels and hundreds of process parameters that potentially influence the quality of paper, assessed by dozens of quality measurements.

The most important object of desire is controlling the process towards optimal quality. But this is too ambitious…prediction, however, is possible.

The first thing we decided was to wrap the language front-end of mlf with a GUI and combine it with Excel…supporting an expert interface and a prediction interface.
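Just to make the data side tangible - a minimal sketch (in Python/pandas rather than the mlf front-end) of what such an Excel-backed exchange could look like; file, sheet and column names are purely hypothetical:

    # Minimal sketch of an Excel-backed data exchange (file, sheet and column
    # names are hypothetical, not those of the actual project).
    import pandas as pd

    # One row per production run: process parameters plus measured quality values.
    frame = pd.read_excel("paper_process.xlsx", sheet_name="runs")

    # Hypothetical split into inputs (process parameters) and targets (quality measurements).
    quality_columns = [c for c in frame.columns if c.startswith("quality_")]
    parameter_columns = [c for c in frame.columns if c not in quality_columns]

    parameters = frame[parameter_columns]
    quality = frame[quality_columns]

    print(f"{len(frame)} runs, {len(parameter_columns)} parameters, "
          f"{len(quality_columns)} quality measurements")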

The first task sequences we needed were those that curated the data (more formally) and asked the user to select the "best" parameters. In this workflow we already used (fuzzified) decision tree methods, helping the expert to interactively eliminate the "pink noise" and emphasize the data that have the most influence…
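As an illustration only - an ordinary decision tree (scikit-learn, standing in for the fuzzified trees of mlf) ranking process parameters by influence, so an expert could discard the noisy ones; the data and threshold are made up:

    # Illustrative only: ranking process parameters with a plain decision tree,
    # standing in for the fuzzified decision trees mentioned above.
    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    rng = np.random.default_rng(0)
    n_runs, n_params = 500, 40                       # hypothetical data set size
    X = rng.normal(size=(n_runs, n_params))          # process parameters
    y = 2.0 * X[:, 3] - 1.5 * X[:, 17] + rng.normal(scale=0.3, size=n_runs)  # one quality measure

    tree = DecisionTreeRegressor(max_depth=4, random_state=0).fit(X, y)

    # Importance scores let the expert keep only the most influential parameters.
    ranking = sorted(enumerate(tree.feature_importances_), key=lambda p: -p[1])
    selected = [idx for idx, score in ranking if score > 0.05]
    print("suggested parameters:", selected)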

In the learning workflow we sequenced tasks for strategy selection and model generation. Model selection tasks can be parallelized, but to detect the best model for the objective, a lot of cross-model testing needs to be carried out, guided by the expert (progressive problems).
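A sketch of what such cross-model testing can look like (again scikit-learn, not mlf): several candidate model families scored by cross-validation, with the parallelism hinted at above mapped to n_jobs; the candidates and data are hypothetical:

    # Sketch of cross-model testing: candidate model families are scored by
    # cross-validation; the expert inspects the comparison and decides.
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.linear_model import Ridge
    from sklearn.tree import DecisionTreeRegressor
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 40))
    y = 2.0 * X[:, 3] - 1.5 * X[:, 17] + rng.normal(scale=0.3, size=500)

    candidates = {
        "ridge": Ridge(alpha=1.0),
        "tree": DecisionTreeRegressor(max_depth=5, random_state=0),
        "forest": RandomForestRegressor(n_estimators=200, random_state=0, n_jobs=-1),
    }

    scores = {}
    for name, model in candidates.items():
        # 5-fold cross-validation; folds can be evaluated in parallel.
        cv = cross_val_score(model, X, y, cv=5, scoring="r2", n_jobs=-1)
        scores[name] = (cv.mean(), cv.std())

    for name, (mean, std) in sorted(scores.items(), key=lambda kv: -kv[1][0]):
        print(f"{name:8s}  R^2 = {mean:.3f} +/- {std:.3f}")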

Resolution: to make the output intuitive we've provided graphical representations…
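One possible graphical representation (matplotlib here, purely illustrative - the actual front-end was the GUI/Wolfram side described above): the cross-validation scores of the model comparison as a simple bar chart, with made-up numbers:

    # Illustrative graphical representation: cross-validated scores per model family.
    import matplotlib.pyplot as plt

    # Hypothetical scores, e.g. taken from a comparison like the previous sketch.
    scores = {"ridge": 0.81, "tree": 0.74, "forest": 0.88}

    fig, ax = plt.subplots(figsize=(5, 3))
    ax.bar(list(scores.keys()), list(scores.values()))
    ax.set_ylabel("cross-validated R^2")
    ax.set_title("Cross-model comparison (illustrative)")
    plt.tight_layout()
    plt.show()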

The final workflow deals with the prediction - the selected models are transformed into a computational representation; fed with concrete data, they predict quality parameters.
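Sketched in Python terms (not the actual mlf representation), that step could look like this: the selected model is persisted as a computational artifact and later fed with a new production run; model, file name and data are hypothetical:

    # Sketch of the prediction workflow: the selected model becomes a persistent,
    # computational artifact and is later fed with concrete process data.
    import numpy as np
    import joblib
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 40))
    y = 2.0 * X[:, 3] - 1.5 * X[:, 17] + rng.normal(scale=0.3, size=500)

    selected_model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
    joblib.dump(selected_model, "quality_model.joblib")   # computational representation

    # Later, in the prediction interface: load the artifact and predict the quality
    # of a new production run (hypothetical parameter vector).
    model = joblib.load("quality_model.joblib")
    new_run = rng.normal(size=(1, 40))
    print("predicted quality:", model.predict(new_run)[0])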

In all workflows we use tasks like "Create", "Select", "Apply"…applied to Data, Specifications, Predicates, Models…meaning: build the constructors, manage the progressive problems and solve them…at each granularity.
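To make the task vocabulary concrete - a hypothetical, heavily simplified sketch of small "Create"/"Select"/"Apply" steps composed into a workflow; names and structure are illustrative only, not the mlf design:

    # Hypothetical sketch of the task vocabulary: small named steps composed
    # into a workflow; illustrative only.
    from dataclasses import dataclass
    from typing import Any, Callable, List

    @dataclass
    class Task:
        name: str                      # e.g. "Create", "Select", "Apply"
        run: Callable[[Any], Any]      # transforms the workflow state

    def run_workflow(tasks: List[Task], state: Any) -> Any:
        # Execute the sequence; each step is where progressive problems
        # would be managed and solved.
        for task in tasks:
            state = task.run(state)
        return state

    workflow = [
        Task("Create", lambda s: {"data": list(range(10))}),
        Task("Select", lambda s: {**s, "selected": [x for x in s["data"] if x % 2 == 0]}),
        Task("Apply",  lambda s: {**s, "result": sum(s["selected"])}),
    ]
    print(run_workflow(workflow, None))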

Users who are to trust such an innovation want interpretability and prediction accuracy - consequently the models need to be understandable, computational and validated. They want to integrate machine learning into the preprocessing workflow and to understand and compare results intuitively in the learning as well as the prediction workflow.
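In scikit-learn terms (again a sketch under made-up data, not the project's validation scheme), "understandable, computational and validated" could look like this: a decision tree is scored on held-out runs and its rules exported as readable text:

    # Sketch: validate on held-out runs (accuracy) and export the tree's rules (interpretability).
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeRegressor, export_text

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 40))
    y = 2.0 * X[:, 3] - 1.5 * X[:, 17] + rng.normal(scale=0.3, size=500)

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    tree = DecisionTreeRegressor(max_depth=3, random_state=0).fit(X_train, y_train)

    print("hold-out R^2:", round(tree.score(X_test, y_test), 3))            # validation
    print(export_text(tree, feature_names=[f"p{i}" for i in range(40)]))    # readable rules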

Remember, if the tasks do not work, the whole system doesn't. And this is far more ambitious than selecting the right machine learning technology for certain subproblems.

Powerful functions for routine tasks, automated model testing, advanced visualization…all carefully structured and designed, are the key to innovations using "industrial machine learning".

Workflows change in sequences of changes. Their changes are more explicit…

Writer and editor in one

There's one more thing to mention here: I emphasize the analogies between story and quant innovation, but to develop an innovation like the above process optimization system…you need a technology stack to build it atop. And that means: as an innovator you are a writer and an editor in one.

And remember, first we build the tools, then they build us (McLuhan).

We once solved the Goats, Wolves and Lion puzzle by different approaches. Sascha Kratky has programmed a little fairy tale about magic forests here - it confirms the above statement.