Our ASAMCommander is a modular web application for ASAM ODS-based Test Data Management (TDM) systems. Out of the box, it provides basic features for end users, IT administrators, and analysts. It works with any data model and can therefore be installed for any domain and use case. ASAMCommander provides a good starting point for proofs of concept or customized enterprise solutions.
The basic ASAMCommander is an application for engineers and end users who want to explore measured data. An extended data navigator lets users browse and view data according to their preferences, and multiple types of search offer quick entry points into the data. An integrated HQL query console gives analysts a quick research option. Helpful features such as bookmarks, exports, and the integration of third-party tools round out the experience.
The ASAMCommander: System Administration module provides features to monitor the system, gain insights into system statistics, and stay informed about the system's health status. A logging module assists in troubleshooting the various parts of the system, including third-party tool integrations.
Because ASAMCommander is developed on our Manatee Platform, it can be flexibly customized and extended in scope. New modules, features, and workflows are easy to add or integrate.
To assist IT administrators in charge of larger enterprise Test Data Management systems, we added a monitoring module to our web application. It keeps all vital information about our AReS Libertas ODS Platform or our AReS ODS Server in one place: one or more instances can be managed, monitored, and their health status supervised.
The monitoring module comes with single-page views for each ODS server in the project environment. With our new Plankton NXMonitoring service, any customer service of interest can be added to this application, provided that service can deliver the requested information.
The server management gives a quick overview of all registered ODS servers in the project. For each server, the health status is displayed, and single instances can be started, stopped, or restarted. Server management is aimed at IT administrators and keeps them informed about the health status of the system.
Session management allows IT administrators to monitor sessions within the system: the development of the number of sessions over time, peak times of system utilization, or a load-balancing problem when too many sessions are allocated to one server. Individual sessions can also be terminated if, for example, an engineer's algorithm is stuck in the system or a session was not closed properly.
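To illustrate how a stuck or abandoned session might be spotted, here is a minimal sketch. The session structure, field names, and idle threshold are illustrative assumptions, not the actual ASAMCommander API:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical session record; field names are illustrative only.
@dataclass
class Session:
    session_id: str
    user: str
    last_activity: datetime

def find_stale_sessions(sessions, now, max_idle=timedelta(hours=2)):
    """Return sessions whose last activity exceeds the idle threshold."""
    return [s for s in sessions if now - s.last_activity > max_idle]

# Example: one active session and one that has been idle for four hours.
now = datetime(2024, 1, 1, 12, 0)
sessions = [
    Session("s1", "alice", datetime(2024, 1, 1, 11, 30)),
    Session("s2", "bob", datetime(2024, 1, 1, 8, 0)),
]
stale = find_stale_sessions(sessions, now)  # → only "s2"
```

An administrator could review such a list before deciding which sessions to terminate.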
Our servers and platforms usually run in a "mixed mode" with mounted drives for mass data storage. Recorded data tends to grow while available storage shrinks. This feature monitors mounted drives and sends warnings when thresholds are crossed.
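The threshold logic could look like the following minimal sketch; the threshold values and function names are illustrative, not part of the product:

```python
import shutil

def classify_fill(fill, warn_at=0.8, crit_at=0.95):
    """Map a fill ratio (0..1) of a drive to a health status."""
    if fill >= crit_at:
        return "critical"
    if fill >= warn_at:
        return "warning"
    return "ok"

def drive_status(path, warn_at=0.8, crit_at=0.95):
    """Check a mounted drive and return its status."""
    usage = shutil.disk_usage(path)
    return classify_fill(usage.used / usage.total, warn_at, crit_at)
```

In a monitoring loop, `drive_status` would be evaluated periodically for each mount point, and any result other than "ok" would trigger a warning.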
The computation monitor helps identify unexpected system behavior. It gives an indication when servers are overloaded with computation, which may result from too many sessions or from problems within an analysis. It also shows how much computing power the system is using and how much is still available.
The memory monitor is a very useful tool for identifying unexpected system behavior over time. If memory usage increases over time and this cannot be matched to user activity, something is wrong. This feature targets memory leaks caused, for example, by incorrect session handling.
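A simple way to detect such sustained growth is to fit a trend line to the sampled memory usage. The following sketch uses a least-squares slope; the sampling interval, units, and threshold are assumptions for illustration:

```python
def memory_trend(samples):
    """Least-squares slope of memory usage over equally spaced samples
    (e.g. MB per sampling interval). A sustained positive slope that does
    not match user activity hints at a leak."""
    n = len(samples)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(samples) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, samples))
    var = sum((x - mean_x) ** 2 for x in xs)
    return cov / var

def looks_like_leak(samples, slope_threshold=1.0):
    """Flag steady growth above the threshold (same units as the samples)."""
    return memory_trend(samples) > slope_threshold

# Memory usage in MB, sampled e.g. hourly: steady growth of ~50 MB per hour.
growing = [1000, 1052, 1103, 1149, 1201, 1255]
```

A flat, noisy series (e.g. `[1000, 1001, 1000, 1002, 1001, 1000]`) would not be flagged, while `growing` would.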
Looking for log files on Linux machines in restricted IT environments is a complex task, and parsing, extracting, and forwarding the information is cumbersome. Here, all information is in one place and can be accessed within a single page, making analysis much easier.
With the ASAMCommander Logger module, we integrated our system components with the Plankton NxLogger service. It collects all logging information from our products and provides a consolidated log viewer for the whole system. The log viewer can also be extended to include information from other systems and services, if needed.
The log viewer simplifies investigating errors, warnings, and information events. It allows browsing, filtering, and extracting the information needed for a registered service or session. This feature helps IT administrators resolve system issues and investigate IT tickets.
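Conceptually, such filtering combines several optional criteria. The sketch below shows the idea on simple dictionaries; the record fields and service names are illustrative, not the real log schema:

```python
# Hypothetical log records; field names and values are illustrative only.
logs = [
    {"level": "INFO",    "service": "ods-server", "session": "s1", "msg": "session opened"},
    {"level": "WARNING", "service": "gateway",    "session": "s2", "msg": "slow response"},
    {"level": "ERROR",   "service": "ods-server", "session": "s1", "msg": "query failed"},
]

def filter_logs(records, level=None, service=None, session=None):
    """Return records matching all given criteria (None means 'any')."""
    def matches(r):
        return ((level is None or r["level"] == level)
                and (service is None or r["service"] == service)
                and (session is None or r["session"] == session))
    return [r for r in records if matches(r)]
```

For example, filtering by `session="s1"` narrows the view to everything one user session produced across services.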
Trace analysis is a necessity when investigating user sessions in an enterprise system. Instead of combing through multiple log files and combining the information manually, it is easier to have everything at hand. The Logging module tracks the information from all our software, including the AReS ODS Server, AReS Libertas, HQL, and our gateways.
When log information is analyzed but the root cause cannot be identified or requires expert knowledge, help is needed. For this reason, we provide a simple tool to extract the required information as a file so it can be sent to us. This simplifies identifying the right information and reduces the time needed to assemble it.
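Such an export might, in essence, bundle the selected records into a single archive for handover. A minimal sketch under that assumption (the archive layout and record format are illustrative, not the product's actual export format):

```python
import io
import json
import zipfile

def export_logs(records, out_file):
    """Write selected log records into a zip archive for support handover.

    `out_file` may be a path or any writable binary file-like object.
    """
    with zipfile.ZipFile(out_file, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr("logs.json", json.dumps(records, indent=2))

# Example: export a single error record into an in-memory archive.
buf = io.BytesIO()
export_logs([{"level": "ERROR", "msg": "query failed"}], buf)
```

The resulting archive contains one `logs.json` with exactly the records the administrator selected.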
Data onboarding for Test Data Management systems is usually a fully automated process that, once established in process and implementation, runs in the background. This ASAMCommander module aims to make the importer process more transparent by providing further information and addressing two use cases: manual data uploads supported by a UI, and a separate process to validate new importer versions or investigate failed imports in a production system.
While the manual upload of test data packages is more of an end-user use case, validating imports is important but time-consuming. It is necessary because, over time, test data packages may change due to software updates in the acquisition systems, file format updates, or a change in the recorded data.
This Data Onboarding module therefore simplifies the overall handling of data imports and reduces recurring IT administration effort.
Test data packages can be uploaded to the system manually. After the right location for the data has been identified based on customized user input, the importer implementation and its configuration are executed with the uploaded test data package.
This process allows test data packages to be uploaded individually. Based on the generated logging information, errors can easily be isolated, identified, and processed, and error logs can be forwarded for bug fixing or importer enhancements.
Each importer can be connected to the monitoring and logging services of our Plankton platform to gather information about importers in production environments. In addition to this information, importer-specific statistics are collected, for example average runtimes.
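Aggregating such statistics from importer run records might look like the following sketch; the importer names, record fields, and numbers are invented for illustration:

```python
from statistics import mean

# Hypothetical importer run records (runtime in seconds); illustrative only.
runs = [
    {"importer": "mdf4", "runtime_s": 12.0, "ok": True},
    {"importer": "mdf4", "runtime_s": 14.0, "ok": True},
    {"importer": "csv",  "runtime_s": 3.0,  "ok": False},
    {"importer": "csv",  "runtime_s": 5.0,  "ok": True},
]

def importer_stats(records):
    """Aggregate per-importer average runtime and success rate."""
    stats = {}
    for name in {r["importer"] for r in records}:
        group = [r for r in records if r["importer"] == name]
        stats[name] = {
            "avg_runtime_s": mean(r["runtime_s"] for r in group),
            "success_rate": sum(r["ok"] for r in group) / len(group),
        }
    return stats
```

A sudden change in average runtime or success rate for one importer is exactly the kind of signal that makes a production import pipeline more transparent.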