Test Data PoC

Can't imagine what it looks like? Evaluate With Your Data

Choosing a test data management platform is a significant decision. You've seen the slides, perhaps watched a demo with sample data. But how do you know it will work with your measurement files, your metadata conventions, your engineering workflows?


Generic demonstrations can only show so much. Your data has its own complexity. File formats from different test benches. Metadata scattered across systems. Naming conventions that evolved over years. You need to see the platform handle your reality, not a polished example.


A Proof of Concept puts our platform to work with your actual data, your metadata structure, and your analysis workflows. We build importers for your file formats. We design an application model that reflects how your engineers think about tests. You evaluate on your infrastructure, with your team, before making any commitment.


At the end, you have concrete evidence for your decision. You know what works, what needs refinement, and what the path to production looks like. No surprises, no assumptions, just practical experience with your own data.

Clarify what you need with a test data PoC

What a PoC Delivers

Working System

Your data imported into a configured AReS or Janus instance. Not sample data. Your measurements, your metadata, your naming conventions. A real system you can explore and test against your requirements.

Real Workflows

Key use cases demonstrated end to end. Search and retrieval. Analysis tool integration. Automated processing with Merlin. Whatever matters most for your evaluation, working with your actual data.

Informed Decision

Hands-on experience for your stakeholders. Clear understanding of fit, gaps, and implementation path. When you move to production, you know exactly what you are getting.

What's Included

Typical PoC Scope

A PoC validates that our platform works for your specific situation. We focus on enough scope to answer your key questions without building a full production system.

Typical inclusions:

  • Data import for two to three priority sources, for example MDF files, ATFx archives, test bench connections, or legacy databases.
  • Application model designed for your domain, including test campaigns, configurations, and measurement metadata.
  • ASAMCommander configured for your workflows, including navigation, search configurations, and visualization aligned to your use cases.
  • Basic HQL training so your team can query data independently.

Optional Additions

Depending on your priorities, we can extend the PoC scope:

  • Merlin automation with one or two example pipelines to show that your Python or MATLAB scripts can run automatically, triggered by data imports.
  • Analysis tool integration, including connection to DIAdem, Concerto, or other tools your engineers use daily.
  • Custom importer logic for complex data assembly, validation rules, or metadata enrichment from external APIs.
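To make the automation idea concrete, here is a minimal sketch of the kind of Python post-processing step such a pipeline could run after a data import. Everything here is hypothetical and for illustration only: the function name, the channel dictionary shape, and the channel names are assumptions, not Merlin's actual trigger mechanism or API.

```python
# Illustrative sketch only: a generic post-processing step of the kind an
# automated pipeline might trigger after a data import. The function and
# data shapes are hypothetical, not Merlin's actual API.

from statistics import mean

def summarize_channels(channels: dict[str, list[float]]) -> dict[str, dict[str, float]]:
    """Compute simple per-channel statistics for an imported measurement."""
    summary = {}
    for name, samples in channels.items():
        if not samples:
            continue  # skip empty channels rather than failing
        summary[name] = {
            "min": min(samples),
            "max": max(samples),
            "mean": mean(samples),
        }
    return summary

# Example input, shaped as channel data might arrive from an importer
channels = {
    "engine_speed": [800.0, 2500.0, 3200.0],
    "oil_temp": [85.0, 90.5, 92.0],
}
print(summarize_channels(channels))
```

In a real pipeline, a script like this would typically receive the imported measurement as its input and write its results back as metadata, so the summary becomes searchable alongside the raw data.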

What is typically not included:

  • Security and authentication integration.
  • Full UI customization.
  • Performance optimization and multi-site deployment.

How It Works

Discovery Workshop

We start with a half-day session to understand your situation: pain points, priority data sources, and what success looks like. You prepare sample data packages and metadata requirements. We align on scope and success criteria.

Build Phase

We develop the application model, build importers for your data sources, and configure ASAMCommander for your workflows. Typical build time is four to six weeks depending on complexity. You receive progress updates and can adjust priorities as we learn more about your data.

Evaluation Period

We deploy a complete working environment on your infrastructure, including server, web application, importers, and documentation. Your team evaluates for one to three months with our support. At the end, you have concrete experience to inform your decision.

What We Need From You

Sample Data

Provide representative test data packages from each source you want to include. Typically ten packages per source type is enough to validate the import approach. A test data package includes the measurement file itself plus any associated metadata. Data can be anonymized if needed. The structure, format, and file sizes should reflect real conditions.
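As a rough sketch of what "complete package" means here, the check below validates that a package directory contains both a measurement file and a metadata sidecar. The layout (one measurement file plus a JSON sidecar) and the file extensions are assumed conventions for illustration, not a requirement of the platform.

```python
# Illustrative sketch only: checking that a sample test data package is
# complete before handing it over. The package layout (measurement file
# plus JSON metadata sidecar) is a hypothetical convention.

from pathlib import Path

# Example measurement formats; your sources may use others
MEASUREMENT_SUFFIXES = {".mf4", ".dat", ".atfx"}

def check_package(package_dir: Path) -> list[str]:
    """Return a list of problems found in a test data package directory."""
    problems = []
    files = list(package_dir.iterdir()) if package_dir.is_dir() else []
    measurements = [f for f in files if f.suffix.lower() in MEASUREMENT_SUFFIXES]
    metadata = [f for f in files if f.suffix.lower() == ".json"]
    if not measurements:
        problems.append("no measurement file found")
    if not metadata:
        problems.append("no metadata sidecar found")
    return problems
```

Even a simple check like this, agreed on before the Discovery Workshop, avoids back-and-forth over incomplete sample packages during the build phase.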

Metadata and Catalogs

Help us understand how you describe your test data: which attributes matter for search and organization, and how you categorize tests, equipment, and results. If you have existing catalogs for units, quantities, and channel names, share them. If not, we can work from your measurement files and build catalogs together. Also think about navigation: how do your engineers find data today, by project, by date, or by test type? This shapes how we configure the system.

After the PoC

If It Fits

Move to production implementation with confidence. The application model, importers, and configurations from the PoC become the foundation. No starting over. Your team already knows the system.

If It Needs Work

We identify gaps clearly. Certain data sources may need more complex handling. Specific workflows may require custom development. You know exactly what additional effort is needed.

If It Doesn't Fit

You learned that early, before a major investment. Sometimes requirements do not align with our platform strengths. Better to know now than after a full deployment.


HighQSoft GmbH

Black-und-Decker-Straße 17b
D-65510 Idstein