Yesterday we celebrated what Seth Godin calls Ruckusmaker Day - in honor of the 60th birthday of Steve Jobs. I agree: more people should speak up about their ideas.
One day before Ruckusmaker Day I wrote in UnRisk Insight: You're a Genius. We all have creative minds. We should talk about them - but we should not only be storytellers, we should be what Navi Modri (and others?) call "storydoers".
It's exciting…and worthwhile.
The Units Of Quant Innovations: Subsystems
A subsystem is either independent or an add-on; an add-on is built atop another subsystem. Either way, subsystems are closed in the system sense.
Subsystems in quant innovations usually represent process steps or individual flows in a multi-flow system - from a slab to cold-rolled coils through plate mills, strip mills, the cooling section, cold rolling, coating…or the thermo-flow subsystem.
In quant finance we have financial instruments divided into asset classes (equity, interest rates, currencies, inflation, commodities…) with pricing, risk management and accounting flows. Typically one sees pricing, valuation and risk analytics systems with special add-ons for VaR calculations, xVA…configured into subsystems for trading, treasury, asset management, risk management…
A subsystem represents a major stage in a system and is responsible for a major change.
It must have constructors, manage progressive problems and provide insightful solutions that can be further evaluated…
The VaR Universe of UnRisk is built atop UnRisk-Q (in UnRisk Quant) or atop the UnRisk FACTORY (in UnRisk Capital Manager). In general it's an instantiation of the general portfolio-across-scenarios simulation with the required practical details. It offers all relevant methodologies (parametric, historical, Monte Carlo), calculates VaRs across the instruments in a portfolio and the risk factors, and serves all kinds of tests…it uses the pricing and calibration functions as well as the tasks of UnRisk-Q intensively.
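To make one of those methodologies concrete, here is a minimal sketch in Python - not the UnRisk API, with made-up positions and a synthetic return history - of the simplest case, a historical-simulation VaR of a small portfolio.

```python
# A minimal sketch of historical-simulation VaR (Python, hypothetical data;
# not the UnRisk API). Each historical scenario revalues the portfolio and
# the VaR is read off the loss distribution.
import numpy as np

rng = np.random.default_rng(42)

# hypothetical return history: 500 days x 3 instruments
returns = rng.normal(0.0, 0.01, size=(500, 3))

# hypothetical current market values of the positions
positions = np.array([1_000_000.0, 750_000.0, 500_000.0])

# portfolio P&L under each historical scenario
pnl = returns @ positions

# 99% one-day VaR: the loss exceeded in only 1% of scenarios
var_99 = -np.percentile(pnl, 1)
print(f"99% 1-day historical VaR: {var_99:,.0f}")
```

The parametric and Monte Carlo variants differ only in where the scenarios come from - a fitted distribution or simulated paths - not in this aggregation step.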
But it has its own constructors, manages progressive functions and provides solutions with cascades of results…
It works so well because the functions, the tasks and their own building blocks work so well.
Because subsystems are systems, there's no need to treat them separately…so this concludes the post about the units…
Next I'll walk through the scheme again to find potential for improvement. And maybe I'll dive a little deeper into examples.
The Units Of Quant Innovations: Workflows
Workflows (sequences of tasks) are responsible for the progressive-problem nature (with critical moments) of quant innovations - especially if they do not follow a one-size-fits-all (model, method, implementation…) strategy. Like all other units, sequences need a constructor, are characterized by progressive problems and need solutions.
Process optimization
Let us make it practical. Suppose we want to optimize a process with steps whose analytic modeling is not feasible - technically or economically. The paper-making process, for example.
I've conducted a project where we used machine learning to create interpretable and computational models from data. Concretely, we used the multi-strategy, multi-model machine learning framework (mlf)…which we developed to perform such comprehensive projects. It's built of several engines (realized in C++) that implement a symbolic machine learning language (built atop the Wolfram Language).
Paper production is a complex flow process with several process levels and hundreds of process parameters that potentially influence the quality of paper, assessed by dozens of quality measurements.
The most important object of desire is controlling the process towards optimal quality. But this is too ambitious…prediction, however, is possible.
The first thing we decided was to wrap the language front-end of mlf with a GUI and combine it with Excel…supporting an expert and a prediction interface.
The first task sequences we needed were those that curated the data (more formally) and asked the user to select the "best" parameters. In this workflow we already used (fuzzified) decision tree methods, helping the expert to eliminate the "pink noise" and emphasize the data that influence the quality most, interactively…
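A rough sketch of that screening step - scikit-learn on synthetic data, not mlf, and a plain CART tree standing in for the fuzzified trees - could look like this: the expert gets a ranking of parameters by how much the tree relies on them and decides interactively what to drop.

```python
# A rough sketch of the screening workflow (scikit-learn stand-in, synthetic
# data; not mlf, and a plain CART tree instead of the fuzzified trees).
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
n_samples, n_params = 400, 20
X = rng.normal(size=(n_samples, n_params))            # process parameters
# hypothetical quality measurement driven by two of the twenty parameters
quality = 2.0 * X[:, 3] - 1.5 * X[:, 7] + 0.1 * rng.normal(size=n_samples)

tree = DecisionTreeRegressor(max_depth=4, random_state=0).fit(X, quality)

# present the ranking to the expert, who decides interactively what to keep
ranking = sorted(enumerate(tree.feature_importances_),
                 key=lambda p: p[1], reverse=True)
for idx, importance in ranking[:5]:
    print(f"parameter {idx:2d}: importance {importance:.2f}")
```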
In the learning workflow we sequenced tasks for strategy selection and model generation. Model selection tasks can be parallelized, but to detect the best model for the objective a lot of cross-model testing needs to be carried out, guided by the expert (progressive problems).
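A hedged sketch of the cross-model testing idea (again scikit-learn stand-ins on hypothetical data, not mlf): several modeling strategies are fitted, the cross-validation folds run in parallel, and the expert still judges which "best" model to keep.

```python
# A sketch of cross-model testing (scikit-learn stand-ins, hypothetical data;
# not mlf). The folds run in parallel, the expert judges the comparison.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import Ridge
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 10))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + 0.1 * rng.normal(size=300)

candidates = {
    "ridge":  Ridge(alpha=1.0),
    "tree":   DecisionTreeRegressor(max_depth=5, random_state=0),
    "forest": RandomForestRegressor(n_estimators=100, random_state=0),
}

for name, model in candidates.items():
    scores = cross_val_score(model, X, y, cv=5,
                             scoring="neg_mean_squared_error", n_jobs=-1)
    print(f"{name:7s} mean CV error: {-scores.mean():.3f}")
```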
Resolution: to make the output intuitive we've provided graphical representations…
The final workflow deals with prediction - the selected models are transformed into a computational representation and, fed with concrete data, they predict the quality parameters.
In all workflows we use tasks like "Create", "Select", "Apply"...on Data, Specifications, Predicates, Models…meaning: build the constructors, manage the progressive problems and solve...at each granularity.
Users who can trust in such an innovation want interpretability and prediction accuracy - consequently the models need to be understandable, computational and validated. They want to integrate machine learning into the preprocessing workflow and understand and compare results intuitively in the learning as well as the prediction workflow.
Remember, if the tasks do not work, the whole system doesn't. And this is far more ambitious than selecting the right machine learning technology for certain subproblems.
Powerful functions for routine tasks, automated model testing, advanced visualization…all carefully structured and designed - these are the key to innovations using "industrial machine learning".
Workflows change in sequences of changes. Their changes are more explicit…
Writer and editor in one
There's one more thing to mention here: I emphasize the analogies between story and quant innovation, but to develop an innovation like the above process optimization system…you need a technology stack to build it atop. And that means: as innovator you are a writer and an editor in one.
And remember: first we build the tools, then they build us (McLuhan).
We once solved the Goats, Wolves and Lion puzzle by different approaches. Sascha Kratky has programmed a little fairy tale about magic forests here - it confirms the above statement.
Digital Reality
This post has been inspired by Neil Gershenfeld's post at Edge. It's about digital fabrication.
Fab labs
I've been responsible for factory automation software in Austria's largest industrial enterprise. Then I worked for about a year for the City of Linz, helping to establish an authority…serving companies…
At that time I had the idea of building a "walk-in center for discrete manufacturing". A lab at industrial scale, with a manufacturing center for complete operations of multi-axis milling and turning in one setup, sheet metal cutting and forming machines...all supported by CAD/CAM…it was a kind of fab lab idea…inviting manufacturers to learn by fabricating their most complicated parts on brand-new manufacturing systems. It was 1988…too early…the equipment was (too) expensive for a lab type of work.
A fab lab is a USD 100,000 investment…there are 400 now worldwide…and the number is growing fast.
Digital fabrication
Digital communication is about sending signals as symbols, not as waves. It's much more failure tolerant - you can communicate reliably even though the communication medium is unreliable.
Digital computing is about storing and manipulating symbols. Again it can compute reliably with an unreliable computing device.
The property that enables this kind of failure tolerance: a linear increase in the resources spent per symbol gives you an exponential reduction in the failure rate.
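A toy illustration of that property (my own, not from Gershenfeld's post): repeat each bit n times over a noisy channel and decode by majority vote - the resources grow linearly in n while the probability of getting the bit wrong falls roughly exponentially.

```python
# A toy illustration: send each bit n times over a channel that flips a copy
# with probability p, decode by majority vote. Resources grow linearly in n,
# the decoding error falls roughly exponentially.
from math import comb

p = 0.1  # per-copy flip probability

def majority_error(n: int, p: float) -> float:
    """Probability that more than half of n copies are flipped (n odd)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

for n in (1, 3, 5, 7, 9, 11):
    print(f"n = {n:2d}   error probability = {majority_error(n, p):.2e}")
```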
Is digital fabrication the integration of computers and machines? It's not. The design may be digital, but the machines work analog…they cut, form, remove, polish…material. In most cases there's no such thing as a digital material built of "symbols" and building blocks. If there were, we could decompose and recompose it.
But the real geometry is external...it's the result of a machine model. And with the complexity of the part, the probability of failure increases. You can't make reliable parts on unreliable machines. If we had building blocks, the accuracy would come from them and failures could be reduced at the building phase…
Conventional fabrication is not like biological fabrication…but we could think of digitizing fabrication and coding construction: make things of much smaller building blocks…assembling them instead of cutting, forming, removing…
It may be easier when making complex chemicals or electronic devices…but building molecular assemblies for metal parts of arbitrary shape and size? Composite materials are usually "hand-crafted" into parts.
However, symbolic computation helps to design a fabrication in a bottom-up fashion, and discretization helps to solve a wider range of principal problems…although it often just serves the constraints of the external system (the geometry, mechanisms…).
I've conducted exciting projects for the manufacturing industries, and the latest technologies help make manufacturing systems more autonomous and perform more operational steps in one setup…but I agree, we haven't reached the age of digital fabrication yet.
The Most Important Units of Quant Innovations: Tasks
Remember, according to constructor theory the most fundamental components of reality are entities - constructors - that perform particular tasks, accompanied by a set of laws that define which tasks are possible for a constructor to carry out. Tasks are descriptions of transformations or changes. Constructor theory can help to generalize theories like that of computation, unifying formal statements and expressing the principles of testability and computability...
Tasks created by functions
I look at it through the lens of a practical problem solver (with a mathematical background) seeking computability. A task in computing is a description of transformations of objects...or even a (one-task) workflow. In programming we talk about task-oriented languages if their statements express what to do and not how, and we usually expect them to be domain-specific (offline robot programming is a prototype). Symbolic programming is a programming paradigm that supports task orientation perfectly.
My reference for a language that empowers task-oriented (symbolic) programming is the Wolfram Language. We extended it into the universe of quant finance and made it domain-specific - the UnRisk Financial Language. See an example here.
In batch processing, tasks are scheduled, but they can also be performed interactively.
Event-based modeling (and analysis)
In how innovations change our lives I've emphasized the structure type. In quant innovation we often deal with flow-oriented (complex) systems that are event-driven - objects are transformed and interact, triggered by external and internal events. It may be a financial instrument in a portfolio that is called (by the issuer), a few parts that need to be assembled into a functional complex, a robot that needs to avoid a collision…tasks in such frameworks have a time dimension.
They need informative (preprocessed) data, information about events, models, algorithms, event detection algorithms, knowledge representations…and actions (to be taken immediately after an event occurs).
The focus is on real-time decision support. In event-based modeling the language as well as the databases need to know about events. Some tasks may contain a description of the behavior of a "decision maker"…it's important to find the right degree of user intervention.
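A minimal sketch of that idea (hypothetical Python, not a real event-processing engine): events are matched to tasks, some tasks act automatically, others hand the decision back to the human "decision maker".

```python
# A minimal sketch of event-based task dispatch (hypothetical, not a real
# event-processing engine): events are matched to tasks; some act
# automatically, others hand the decision back to the human decision maker.
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class Event:
    kind: str      # e.g. "instrument_called", "collision_risk"
    payload: dict

def on_instrument_called(event: Event) -> str:
    # automatic action: drop the called instrument and revalue the portfolio
    return f"revalue portfolio without {event.payload['instrument']}"

def on_collision_risk(event: Event) -> str:
    # here the degree of user intervention matters: ask, don't decide
    return f"ASK OPERATOR: reroute robot near {event.payload['zone']}?"

handlers: Dict[str, Callable[[Event], str]] = {
    "instrument_called": on_instrument_called,
    "collision_risk": on_collision_risk,
}

stream = [Event("instrument_called", {"instrument": "callable bond"}),
          Event("collision_risk", {"zone": "cell 3"})]

for event in stream:
    print(handlers[event.kind](event))
```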
A task is like a scene
Tasks are the basic building blocks of a quant innovation (as scenes are the basic building blocks of a story). If the tasks don't work (solve problems and attract users), the innovation doesn't. A task must offer a clear shift (a change) throughout its flow. If it doesn't, the programming language, the data structures, the models, algorithms, event model…may be brilliant…it will not sell.
Focus on tasks! A task must move value states in sequences (creating workflows). From instrument values to portfolio values…from prices to risk spectra…from VaR deltas to the portfolio VaR and the contribution VaR of instruments…from counterparty portfolio nettings to Value Adjustments…
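One such value-state move, sketched with made-up numbers in the simplest (delta-normal) setting: from instrument exposures to the portfolio VaR and the contribution VaR of each instrument - the contributions add up to the portfolio VaR.

```python
# One value-state move, sketched with made-up numbers in the delta-normal
# setting: exposures -> portfolio VaR -> contribution VaR per instrument.
import numpy as np
from scipy.stats import norm

w = np.array([1.0e6, -0.5e6, 0.8e6])            # exposures per risk factor
cov = np.array([[1.0e-4, 2.0e-5, 0.0],
                [2.0e-5, 4.0e-4, 5.0e-5],
                [0.0,    5.0e-5, 2.0e-4]])      # daily return covariance

z = norm.ppf(0.99)                              # 99% quantile
sigma_p = np.sqrt(w @ cov @ w)                  # portfolio P&L volatility
var_p = z * sigma_p

marginal = z * (cov @ w) / sigma_p              # marginal VaR per exposure
component = w * marginal                        # contribution (component) VaR

print(f"portfolio VaR:        {var_p:,.0f}")
print(f"sum of contributions: {component.sum():,.0f}")   # equals the VaR
```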
Get the users hooked
The tasks need to overcome internal conflicts (offer the required regime switches…say, to models that can treat negative interest rates correctly…) and handle possible external conflicts of users (say, front office vs. risk quant…investment manager vs. regulator…).
If the task has a time dimension it must move objects and actors…
The quant innovation must not manipulate users, but motivate them…to test extreme (pathological) cases, validate models and methods, walk through what-if scenarios, backtest and stress test…
In one of the next posts, I'll come to workflows (sequences of tasks).
The Units of Quant Innovations: Functions
The definition of the structure of software systems is not so clear. The decomposition should be derived from the problem: data-, function- or object-oriented.
A nested structure
However, in my understanding most quant theories suggest functional decomposition. So I look at functions as the smallest units. They create tasks, which create workflows, which create subsystems, which create the system.
What's important to remember is that these units are usually nested.
The most important unit is the task...value a financial instrument, a portfolio…pick a part from the box and load it into the machine…make an operational plan for certain bending operations…
They're built of (nested) functions.
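A purely illustrative Python sketch of that nesting (made-up cashflows, a flat rate, no real pricing library): small functions are composed into a task, tasks into a workflow.

```python
# A purely illustrative sketch of the nesting (made-up cashflows, flat rate;
# no real pricing library): functions -> task -> workflow.
def discount(cashflow, t, r=0.02):
    # smallest unit: a function
    return cashflow * (1.0 + r) ** (-t)

def value_instrument(instrument):
    # a task built of nested functions: value one instrument
    return sum(discount(cf, t) for t, cf in instrument["cashflows"])

def valuation_workflow(instruments):
    # a workflow: a sequence of tasks, each consuming the previous result
    values = [value_instrument(inst) for inst in instruments]
    return {"per_instrument": values, "portfolio": sum(values)}

portfolio = [{"cashflows": [(1, 50.0), (2, 1050.0)]},
             {"cashflows": [(1, 30.0), (3, 1030.0)]}]
print(valuation_workflow(portfolio))
```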
Special functions
In mathematics we have special functions: symbols we know by name, meet quite often, and whose form, shape and behaviour we know. From the simple Sin, Cos, Exp…to the Meijer G-function, which is very general.
Some of them are good friends; we can artistically create derivatives, integrals...when they dominate expressions or equations, we use them as an Ansatz for, say, solving differential equations.
They're so well known that we don't need to think much about their building blocks…constructor - progressive problems - solution. But they may need some preprocessing and postprocessing.
New special functions expand the domain of closed-form solutions. Closed-form solutions are elegant, but usually they only approximate small worlds.
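A small example of what a named special function buys you (my own, using SciPy): the Airy function Ai gives a closed-form solution of y'' = x·y, which we can check against a brute-force numerical integration.

```python
# A small example with SciPy: the Airy function Ai gives a closed-form
# solution of y'' = x*y, checked here against brute-force integration.
import numpy as np
from scipy.special import airy
from scipy.integrate import solve_ivp

ai0, aip0, _, _ = airy(0.0)                 # Ai(0) and Ai'(0)

def rhs(x, y):                              # y = (u, u'); u'' = x*u
    return [y[1], x * y[0]]

sol = solve_ivp(rhs, (0.0, 2.0), [ai0, aip0],
                rtol=1e-10, atol=1e-12, dense_output=True)

x = np.linspace(0.0, 2.0, 5)
closed_form = airy(x)[0]
numeric = sol.sol(x)[0]
print(f"max deviation: {np.max(np.abs(closed_form - numeric)):.1e}")
```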
Higher functions
In the vast majority of cases you need numerical schemes to make predictions and simulate your models. To model, say, the material flows, chemical reactions and conservation of energy in, say, a blast furnace, you need to solve systems of dozens of coupled nonlinear Partial Differential Equations.
To solve them you need to decompose the domain (in space and time)…especially if you use Finite Element techniques…They're reused many times and it's really important to understand their building blocks…If, say, the convection part of a reaction-convection-diffusion PDE becomes dominant, you need to introduce stabilization techniques (upwinding) to avoid the horror of spurious oscillations…
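A tiny 1-D illustration of the upwinding remark (my own toy example, not a piece of any production code): for -ε·u'' + a·u' = 0 with u(0)=0, u(1)=1, the central scheme oscillates once convection dominates the grid, while the upwind scheme stays monotone.

```python
# A tiny 1-D toy: -eps*u'' + a*u' = 0 on [0,1], u(0)=0, u(1)=1. With the
# cell Peclet number above 1, central differencing oscillates, upwinding
# stays monotone.
import numpy as np

eps, a, n = 0.01, 1.0, 20          # cell Peclet a*h/(2*eps) = 2.5 > 1
h = 1.0 / n

def solve(scheme):
    A = np.zeros((n - 1, n - 1))
    b = np.zeros(n - 1)
    for i in range(n - 1):
        if scheme == "central":
            lower, diag, upper = -eps/h**2 - a/(2*h), 2*eps/h**2, -eps/h**2 + a/(2*h)
        else:  # upwind (a > 0): backward difference for the convective term
            lower, diag, upper = -eps/h**2 - a/h, 2*eps/h**2 + a/h, -eps/h**2
        A[i, i] = diag
        if i > 0:
            A[i, i - 1] = lower
        if i < n - 2:
            A[i, i + 1] = upper
        else:
            b[i] -= upper * 1.0    # right boundary value u(1) = 1
    return np.concatenate(([0.0], np.linalg.solve(A, b), [1.0]))

for scheme in ("central", "upwind"):
    u = solve(scheme)
    print(f"{scheme:7s} min(u) = {u.min():+.3f}   (clearly negative = oscillation)")
```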
There are many such traps in mathematical problem solving.
In one of the coming posts, I will look into tasks…they are usually represented in languages and need various data with a wide range of types...
Innovation - Revolution of Heroes or Heroes of Revolution?
I like music from all directions (from John Adams to John Zorn). Who are the innovators in music?
Examples from Jazz.
What was it that made so many great musicians play, say, Bebop (Dizzy Gillespie, Charlie Parker, Bud Powell, Thelonious Monk, Max Roach...), Free Jazz (Ornette Coleman, John Coltrane, Charles Mingus, Archie Shepp, Cecil Taylor...), Loft Jazz (Anthony Braxton, Arthur Blythe, Julius Hemphill, David Murray, Sam Rivers…)…or the music around John Zorn (improvised music, hardcore, klezmer-oriented free jazz…)?
A coincidence of talented artists at a time? Or are artists motivated to join a revolution - and share their best work?
I believe the latter is true.
Technology revolutions make heroes
Paradigm shifts expose the work of those attracted by the change opportunities.
In quant fields, in particular, we need to "lift" methodological concerns to more abstraction and modeling and to a meta-level of domain engineering.
How to integrate requirements engineering and software design? How do we achieve abstraction and modeling, handle more general issues of documentation and the required pragmatics?
Through new programming paradigms, multi-model and multi-method approaches, advanced numerical engines, data frameworks for massive data treatment, computable documents…universal deployment services. Build solutions and development systems in one.
And don't forget the building blocks that induce the attractiveness (differential advantage) of your innovation: constructors, progressive problems, solutions in preprocessing, processing, postprocessing…Remember that your users may need to choose the best bad solution (in a crisis) and that a climax may be the answer to a crisis. Lead them to the solution, but leave them choices…
p.s. Wildflowers is my reference recording for Loft Jazz
Negative Innovation?
This is a term I found in the cover story of the January 2015 issue of Wilmott magazine. Its title is Beyond the Barricades and it's about the recruitment outlook for quants in 2015…advocating for the ability to work and communicate well across disciplines.
Compliance or innovation?
In its beginning hook, the writer of the article states:
Some might argue, convincingly, that finance as a business never felt the need to capitalize on the early access to quant skills…When an industry can be defined as one that develops on the basis of negative innovation, by which we mean innovation brought about by regulatory requirements alone…where compliance defines the deployment of resources...In this context quants have two choices: working through constrained tasks that are defined by somebody else or figure out what the new opportunities are.
A climax in progressive problems of quant work
As I've recommended in an earlier post, the best bad choice is surfing on the regulatory waves.
But quants may find ways to combine regulation-compliant programming with business-oriented responsibilities, creating new products for new clients…to become hybrid quants doing the industrial and lab work simultaneously.
It's a climax - a moment when a quant needs to act on her crisis choice. And the choices and actions will tell us a lot about the quant.
And the technologies quants use will influence that crisis choice. Do they support the industrial (regulation-compliant) part of the tasks and help develop new things simultaneously?