
Cloud-Based Ground Truth Generation for Sensor Blockage Detection in AD and ADAS

Marcin Grzybek, Senior Embedded Software Engineer
Uploaded: August 17, 2024

Autonomous Driving (AD) systems and Advanced Driver Assistance Systems (ADAS) rely on a network of cameras, lidar (Light Detection and Ranging), and radar sensors to perceive the surrounding environment and traffic conditions. Any disruption or defect in these sensors can result in unintended system behavior. For instance, road dust, heavy rain, or snow accumulation can obstruct a sensor’s field of view or degrade its performance. Such blockages may prevent accurate object detection or even cause the system to misidentify non-existent obstacles.

To ensure reliability and safety, these sensors must incorporate self-monitoring mechanisms capable of identifying any drop in perception performance due to occlusion. By continuously monitoring themselves, sensors can detect environmental disturbances and trigger corrective measures. In critical cases, this could involve initiating automatic cleaning processes or temporarily disabling specific system functionalities until optimal operation is restored.

Monitoring the Monitors

To validate that a sensor’s blockage-detection mechanism works correctly, its detection output must be compared against a verified reference, known as the ground truth. But how can we reliably obtain this ground truth?

One approach is to observe the sensor using an external camera during real-world operation. Both the sensor data and the camera footage are recorded simultaneously. Later, engineers analyze the sensor’s blockage detection by comparing its output to the video evidence.

Take the example of a front-mounted lidar sensor. This lidar emits and receives laser beams through its frontal surface, which is internally monitored for occlusions. Engineers attach an additional camera to capture continuous footage of the sensor’s surface. Advanced computer vision algorithms process the recorded videos, automatically detecting and marking areas affected by dirt, ice, or debris. These identified occlusion regions are stored as a dataset, forming the ground truth reference for validating the lidar’s own blockage detection capabilities.
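
The production algorithms are part of the Yoter Up toolchain, but the core idea can be illustrated with a short Python sketch. Treat it as a simplified, hypothetical example: it assumes a fixed camera view of the sensor surface, a reference image of the clean surface, OpenCV 4, and hand-tuned thresholds; the function name and parameters are illustrative rather than taken from the actual implementation.

```python
import cv2
import numpy as np


def mark_occlusions(frame: np.ndarray, clean_reference: np.ndarray,
                    diff_threshold: int = 40, min_area: int = 500):
    """Return bounding boxes of regions where the observed sensor surface
    deviates from a known clean reference image (dirt, ice, debris)."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    ref = cv2.cvtColor(clean_reference, cv2.COLOR_BGR2GRAY)

    # Pixels that differ strongly from the clean surface are occlusion candidates.
    diff = cv2.absdiff(gray, ref)
    _, mask = cv2.threshold(diff, diff_threshold, 255, cv2.THRESH_BINARY)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    # Keep only connected regions large enough to affect perception.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours
            if cv2.contourArea(c) >= min_area]  # (x, y, width, height) per region
```

Each detected region, together with the frame timestamp, becomes one ground-truth record against which the lidar's own blockage reports can later be scored.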

Managing and Processing the Data

Capturing a comprehensive range of scenarios requires hundreds or even thousands of driving hours, generating vast amounts of video data that must be automatically processed, stored, and analyzed. This demand calls for scalable compute resources, high-capacity storage, and flexible automation pipelines.
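
To put that volume in perspective: at an assumed 8 Mbit/s per reference camera stream, 1,000 hours of footage already amounts to roughly 3.6 TB of raw video, before counting extracted frames or intermediate results.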

Yoter Up delivers this through our Robotic Drive-based solution, fully integrated with AWS cloud services for elastic computing and storage. The implementation leverages a carefully optimized subset of Yoter Up Robotic Drive software components, running in the cloud on top of AWS-managed services.

Technical Deep Dive

  1. Data Ingestion. The client provides video recordings in MDF4 format (the automotive industry standard) to a client-owned Amazon S3 bucket, which is automatically synchronized to a Yoter Up S3 bucket.
  2. Workflow Orchestration. The arrival of new files triggers a job in the Robotic Drive Workflow Manager (WFM), built on Amazon Managed Workflows for Apache Airflow (MWAA) to keep orchestration manageable; a minimal DAG sketch follows this list.
  3. Automated Processing. The Robotic Drive Analyzer (RDA) runs in an Amazon EKS (Kubernetes) cluster, processing multiple files in parallel. The cluster scales dynamically with the workload and uses Amazon EC2 Spot Instances to reduce operational costs.
  4. Ground Truth Generation. RDA extracts image frames from the video streams and passes them to a Yoter Up computer vision algorithm that identifies sensor blockages. The resulting ground truth datasets are stored in the Yoter Up S3 bucket and automatically synchronized to the client’s S3 bucket.
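
To make step 2 concrete, here is a minimal DAG sketch, assuming a recent Airflow 2.x release on MWAA. Bucket names, prefixes, and the two task stubs are illustrative placeholders rather than Robotic Drive WFM code, and the sketch polls on a schedule instead of reacting to S3 events as the production workflow does.

```python
from datetime import datetime

import boto3
from airflow.decorators import dag, task


@dag(schedule="@hourly", start_date=datetime(2024, 8, 1), catchup=False)
def sensor_blockage_ground_truth():

    @task
    def list_new_recordings() -> list[str]:
        """Find MDF4 recordings in the ingest bucket that still need processing."""
        s3 = boto3.client("s3")
        response = s3.list_objects_v2(Bucket="example-ingest-bucket",
                                      Prefix="recordings/")
        return [obj["Key"] for obj in response.get("Contents", [])
                if obj["Key"].endswith(".mf4")]

    @task
    def process_recording(key: str) -> str:
        """Stand-in for the containerized RDA step.

        In the described pipeline this work runs in parallel on an Amazon EKS
        cluster; here it only records where the ground-truth result would go.
        """
        result_key = key.replace("recordings/", "ground-truth/") + ".json"
        # ... extract frames, run blockage detection, upload the result ...
        return result_key

    # Dynamic task mapping fans out one processing task per new recording.
    process_recording.expand(key=list_new_recordings())


sensor_blockage_ground_truth()
```

In the described deployment the fan-out happens on Kubernetes rather than inside Airflow workers, which is what lets the cluster grow with the backlog and run on Spot capacity.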

Benefits for Autonomous Driving Projects

This end-to-end automated pipeline transforms the way automotive clients manage and validate their AD and ADAS data. Teams can simply upload new video recordings to S3 and automatically receive ground-truth results in the same bucket after processing. The solution significantly accelerates AD workflows, reduces manual effort, and ensures faster sensor validation cycles.

A Scalable and Versatile Cloud-Native Solution

This implementation demonstrates how Yoter Up Robotic Drive components, combined with AWS cloud services, enable lightweight, scalable, and business-ready autonomous driving solutions. By leveraging AWS-managed services wherever possible, the solution reduces total cost of ownership, increases operational agility, and allows rapid adaptation to emerging automotive data challenges.

In today’s competitive market, speed and flexibility are strategic advantages. Yoter Up accelerates innovation cycles by integrating fully automated infrastructure provisioning, code-driven deployments, and AI-powered data processing—ensuring your autonomous driving projects remain ahead of the curve.

Yoter Up is ready to help automotive innovators harness their data to deliver safer, smarter, and future-proof autonomous driving solutions.
