Test Data Management for Larger Enterprises
While our AReS ODS 6 Server is a full-fledged standalone ASAM ODS server, there is growing demand for business APIs, integrated analysis functionality, alternative data repositories such as Elasticsearch or AWS, and the scalability of services through containerization.
Our Plankton Ecosystem is a microservices platform that brings this flexibility and performance to ASAM ODS-based Test Data Management systems. Its architecture is designed for a distributed test data management system that is accessible through single-node entry points.
Products within the Plankton Ecosystem, such as the HQL Gateway and the AReS Gateway, make it easy to scale the backend from a single-node system to a multi-node enterprise system with load balancing and fail-safe features.
Furthermore, additional services can be added to fulfill customized use cases such as reading from alternative data sources or providing analysis results. These services can be combined behind a Plankton Business Gateway, which serves as a global enterprise access point for Test Data Management.
The IIOP Gateway is a generic bridge component that allows “ASAM ODS 5”-based applications to access ASAM ODS 6 servers transparently. The gateway behaves as a full-fledged ASAM ODS 5 Server towards client applications while representing a regular ASAM ODS 6 client to the ASAM ODS 6 Server.
The HQL Gateway is designed as a scalable single-node entry point to enterprise Test Data repositories. All queries from different applications, locations, and projects can run through one node while being distributed to a horizontally scalable backend.
The AReS Gateway works similarly to the HQL Gateway. It operates as a standardized gateway to all testing data within an enterprise. Each part of the AReS Gateway runs in a separate process, enabling multiple ways to scale the solution. The AReS Gateway runs natively in Docker and Kubernetes environments.
First, the Plankton Ecosystem provides ASAM ODS microservices that register via Plankton NxRegister classes (e.g. Eureka, Kubernetes Service). These services are then accessible via Plankton NxGateway aggregations such as the IIOP Gateway, HQL Gateway, or AReS Gateway.
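Conceptually, this register-then-resolve flow follows the familiar service-registry pattern. The minimal in-process Python sketch below illustrates that pattern only; the class and method names are hypothetical and are not Plankton's actual API:

```python
# Minimal sketch of a service-registry pattern (Eureka-style):
# services register themselves under a name, and a gateway
# resolves the name to a concrete backend instance.
# All names here are illustrative, not the Plankton API.
import itertools

class ServiceRegistry:
    def __init__(self):
        self._instances = {}   # service name -> list of endpoints
        self._cursors = {}     # service name -> round-robin iterator

    def register(self, name, endpoint):
        self._instances.setdefault(name, []).append(endpoint)
        # Rebuild the round-robin cursor over the updated instance list.
        self._cursors[name] = itertools.cycle(self._instances[name])

    def resolve(self, name):
        # A gateway would call this to pick the next backend instance.
        return next(self._cursors[name])

registry = ServiceRegistry()
registry.register("ods-query", "http://node-a:8080")
registry.register("ods-query", "http://node-b:8080")

# Requests are spread across the registered instances:
print(registry.resolve("ods-query"))  # http://node-a:8080
print(registry.resolve("ods-query"))  # http://node-b:8080
```

In a real deployment, a registry such as Eureka or a Kubernetes Service performs this bookkeeping across hosts, and the gateway queries it instead of an in-process dictionary.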
In contrast to standard microservices, Plankton microservices can run both as web services and as OSGi services. This enables horizontal scalability by running the services as standalone processes on physical, virtual, or container-based hosts. Vertical scalability is achieved by running the web services within an OSGi engine (for example Felix, Karaf, or Equinox).
The example below shows a distribution of web services across standalone applications on high-performance compute machines (local system) and across multiple clusters and clouds (for example, a Kubernetes cluster and the AWS Cloud). Plankton allows any combination of distribution, project setup with various data sources, and scaled microservices. The architecture chosen ultimately depends on the project requirements.
Plankton is the Ecosystem that incorporates all services, registries, and gateways. Further information can be found on the AReS Gateway page.
Benefits of a microservices architecture
Microservices play an important role in business settings where flexibility and performance are needed. Compared to a monolithic application, a microservices architecture such as the Plankton Ecosystem requires a more sophisticated planning and development phase. In return, an ASAM ODS-based Test Data Management system has never been this flexible and scalable.
Independence of microservices
One of the key advantages of microservices is their independence from multiple standpoints. Microservices are independent of the programming language their clients are written in, so that language can be chosen freely. Hypothetically, each microservice could even be implemented in a different programming language. In addition, within a microservices architecture, each microservice may call other microservices.
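This language independence rests on services communicating through neutral interfaces such as HTTP and JSON: any client that speaks the contract can use the service, regardless of implementation language. The Python sketch below illustrates that idea with a hypothetical query handler; the message fields are invented for illustration and are not a Plankton interface:

```python
import json

# A microservice exposes a language-neutral contract: JSON in, JSON out.
# Any client - Java, Python, or a Vue.js frontend - only needs the contract,
# never the service's implementation language.
def handle_query(request_body: str) -> str:
    request = json.loads(request_body)
    # Hypothetical handler: echo the project name and report a match count.
    result = {"project": request["project"], "matches": 0}
    return json.dumps(result)

# A client written in any language would send the same bytes:
response = handle_query('{"project": "crash-test", "limit": 10}')
print(json.loads(response)["project"])  # crash-test
```

Because only the JSON contract is shared, either side can be rewritten in another language without the other noticing.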
At HighQSoft, we use an OSGi- and Java-based backend and a Vue.js frontend.
Scalability with Docker and Kubernetes
When microservices are packaged in Docker containers, rapid deployment of applications becomes possible. In recent years, the toolchain around Kubernetes and Docker has evolved to support this process.
Kubernetes (K8s) provides a comparatively simple and well-supported way to deploy microservices in today's IT environments. K8s also offers many convenient tools and applications for managing even hundreds of containers or services: load balancing, scaling, and monitoring.
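For example, scaling a containerized gateway in Kubernetes can be as simple as declaring a replica count. The generic Deployment fragment below is illustrative only; the names and image are placeholders, not an actual HighQSoft artifact:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: hql-gateway              # illustrative name
spec:
  replicas: 3                    # K8s keeps three instances running
  selector:
    matchLabels:
      app: hql-gateway
  template:
    metadata:
      labels:
        app: hql-gateway
    spec:
      containers:
        - name: hql-gateway
          image: example/hql-gateway:latest   # placeholder image
          ports:
            - containerPort: 8080
```

Raising `replicas` (or attaching a HorizontalPodAutoscaler) scales the service horizontally, while a Kubernetes Service in front of the pods provides the load balancing mentioned above.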