Today, a lot of analysis is still done interactively by individual engineers: they load recorded data into fat-client analysis tools and work their way through the analysis by hand. With a growing number of analyses, large data sets, and global enterprise-scale systems, this approach reaches its limits.
As a result, many engineers in the domain lose time and resources developing and executing their analyses manually. Results may be created redundantly and may not be comparable due to variations in algorithms. In the worst case, results are not reproducible and insights are lost.
Our Merlin Analysis Server 2G is a second-generation, technology-independent analysis framework that integrates server-side analysis of captured data and thus streamlines the generation of analysis results. It allows engineers to keep track of testing progress and derive analysis results quickly by bringing the algorithm to the data and integrating it into automated workflows.
While engineers may still require expert tools such as Matlab for individual, interactive analysis, there is great demand to standardize repetitive analyses as well as report generation to save engineers time and effort. Moreover, use cases that involve analyzing a vehicle fleet or a large number of tests may be unsuitable for desktop analysis.
This is where Merlin helps: Merlin integrates analysis execution into the processes of the Test Data Management system, moving it out of the hands of the individual engineer and into a server-side framework. While analysis packages can still be administered by the engineer, their execution is managed centrally and results become standardized: comparable and reproducible.
Merlin is a technology-independent framework that allows all types of algorithms. Merlin integrates algorithms and puts them into context with the data. The execution of an algorithm is still done by its native environment, e.g. a Java, Python, or Matlab runtime.
Merlin integrates algorithms and their runtimes into the context of the Test Data Management system. Under Merlin's control, tasks such as obtaining the correct data sets from the Test Data Management system are handled automatically. This also covers the orchestration of the execution, including queuing, scheduling, and event management.
Accordingly, automated triggers from the ModelMapper (the data importer) or the Notification Server (Avalon ODS Server, Ares Server) may initiate an analysis when new test data becomes available or when information is updated. Of course, an analysis can also be triggered manually from an end-user application such as the Test Data Management System (Manatee).
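As an illustration only, such event-driven triggering can be sketched as a small registry that maps incoming events to analyses. The event names, payload fields, and functions below are hypothetical; the actual ModelMapper / Notification Server integration is not shown here.

```python
# Minimal sketch of event-driven analysis triggering (illustrative names only).
triggers = {}

def on(event_name, analysis):
    """Register an analysis callback to run when a matching event arrives."""
    triggers.setdefault(event_name, []).append(analysis)

def fire(event_name, payload):
    """Dispatch an incoming event to all analyses registered for it."""
    return [analysis(payload) for analysis in triggers.get(event_name, [])]

# A data importer could fire "new_measurement" after loading a test run.
on("new_measurement", lambda p: f"run vibration check on measurement {p['id']}")
print(fire("new_measurement", {"id": 101}))
```

Events without registered analyses simply dispatch to nothing, so manual and automated triggers can share the same path.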
Merlin Analysis Server integrates user-developed algorithms and executes them with data from the Test Data Management system. As an engine, Merlin connects an algorithm with its parameterization and configuration and its input data (e.g. measurement IDs), and moves the resulting package to an executor. The executor might be a Java or Python runtime, a Matlab runtime, or Spark, depending on the job.
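A minimal sketch of such a job package and its routing to an executor might look as follows. All class, field, and function names here are illustrative assumptions, not Merlin's actual API.

```python
from dataclasses import dataclass, field

@dataclass
class AnalysisJob:
    """Illustrative job package: algorithm reference, parameters, input data."""
    algorithm: str                   # name of a registered analysis package
    executor: str                    # target runtime: "python", "java", "matlab", "spark"
    parameters: dict = field(default_factory=dict)
    measurement_ids: list = field(default_factory=list)

def dispatch(job: AnalysisJob) -> str:
    """Route the job package to its executor (placeholder logic)."""
    supported = {"python", "java", "matlab", "spark"}
    if job.executor not in supported:
        raise ValueError(f"unknown executor: {job.executor}")
    # A real framework would enqueue the job and hand it to the runtime;
    # here we only report the routing decision.
    return (f"queued '{job.algorithm}' for "
            f"{len(job.measurement_ids)} measurement(s) on {job.executor}")

job = AnalysisJob("vibration_fft", "python", {"window": "hann"}, [101, 102])
print(dispatch(job))  # -> queued 'vibration_fft' for 2 measurement(s) on python
```

Keeping the package declarative like this is what lets the same job description be handed to different runtimes.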
Merlin can be triggered by user input (web application), ad hoc via Postman, or by other services of the Test Data Management system such as the ModelMapper importer, the AReS Server, or the Avalon Server. Administrators may add, change, or remove analysis programs; end users may only trigger them or consume their results.
Merlin is a framework rather than a runtime and therefore outsources the execution of algorithms (see freedom of algorithms). This way, Merlin can manage the resources of multiple machines, allowing the system to scale with demand. Operating server-side, next to the Avalon ODS Server or Ares Server and the data, Merlin brings the algorithm close to the data and thus reduces network time and load. When integrated with Spark / Hadoop, algorithms are brought directly to the data.
Security remains handled by the Avalon ODS Server or Ares ODS 6 Server.
Result sets are manifold in content and do not necessarily remain part of the Test Data Management system. In general, results may be whatever the developer wants them to be.
Merlin can handle items such as new measurements or channel calculations, documents generated and attached to the right entity, calculations for graphical representations, single calculated values or statistics, emails, and more.
A specific kind of analysis that runs on measurements is the identification of Events and Key Performance Indicators (KPIs). An Event has a begin and an end marker, derived from described occurrences within the data. A KPI is usually a single-value result calculated between the begin and end markers of an Event; an Event can have multiple KPIs.
Definitions of Events and KPIs can be managed by the engineer, and in the same way algorithms for the markers or the KPIs can be added, e.g. by providing a Matlab script that calculates an aggregation of the data.
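To illustrate the concept (sketched in Python rather than Matlab): an Event is detected where a signal crosses a threshold, yielding begin and end markers, and a KPI is then computed within those markers. The threshold logic and function names are illustrative assumptions, not part of Merlin.

```python
# Illustrative sketch: derive Event markers from a signal, then a KPI inside them.
def detect_event(signal, threshold):
    """Return (begin, end) indices of the first span where signal exceeds threshold."""
    begin = end = None
    for i, value in enumerate(signal):
        if value > threshold and begin is None:
            begin = i                      # begin marker: first exceedance
        elif value <= threshold and begin is not None:
            end = i                        # end marker: signal drops back
            break
    if begin is not None and end is None:
        end = len(signal)                  # event runs to the end of the data
    return begin, end

def kpi_mean(signal, begin, end):
    """KPI: mean of the signal between the Event's begin and end markers."""
    span = signal[begin:end]
    return sum(span) / len(span)

speed = [0.0, 0.5, 2.1, 2.3, 2.2, 0.4, 0.1]
begin, end = detect_event(speed, threshold=1.0)
print(begin, end)                          # Event markers -> 2 5
print(round(kpi_mean(speed, begin, end), 2))   # single-value KPI -> 2.2
```

Multiple KPI functions could be applied to the same marker pair, matching the one-Event-to-many-KPIs relationship described above.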
Results of this type are usually provided in JSON format so they can be stored in Elasticsearch or another preferred indexing service.
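For illustration, such a result could be serialized as a JSON document like the one below; the field names are an assumed example layout, not a fixed schema.

```python
import json

# Hypothetical result document for one Event with its KPIs, as it might be
# indexed in Elasticsearch; field names are illustrative only.
result = {
    "measurement_id": 101,
    "event": {"name": "overspeed", "begin": 2.0, "end": 5.0},
    "kpis": [{"name": "mean_speed", "value": 2.2, "unit": "m/s"}],
}
doc = json.dumps(result, indent=2)
print(doc)
```

Flat, self-describing documents like this index well and make Events and KPIs searchable across measurements.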