How do you teach software systems to sense their surroundings? This is the big question that drives the development of perception systems. Engineers who set out to build powerful perception solutions need a firm grasp of state-of-the-art engineering practices. We have a handle on these methods. Count on us to help you develop a perception system that gets the job done, whether for rapid manufacturing, for autonomous robots and vehicles that have to interpret complex situations, or for whatever else your use case may entail.
Everything we do revolves around understanding your unique needs and delivering tailored solutions to satisfy those demands. Environment perception, machine vision, sensor fusion, sensor services – we are here to help you with any and all of this.
Environment Perception
Environment perception is all about using sensors and algorithms to accurately sense and interpret surroundings. With our 3D mapping, localization, object recognition, and tracking tools, we enable excellent planning and navigation for autonomous systems. On top of that, we offer 3DTM, a framework for automated responses to obstacles and surfaces that also analyzes and categorizes objects. This makes autonomous systems safer and lets engineers add innovative features.
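To make the idea concrete, here is a minimal sketch of one building block of environment perception – accumulating range detections into a 2D occupancy grid. It is purely illustrative (it is not the 3DTM framework, and all names and parameters are hypothetical):

```python
import numpy as np

def update_occupancy_grid(grid, sensor_xy, hits_xy, resolution=0.1):
    """Mark grid cells containing range-sensor hits as occupied.

    grid       -- 2D numpy array of occupancy counts, origin at (0, 0)
    sensor_xy  -- (x, y) sensor position in meters (unused here; kept for
                  extensions such as ray-traced free-space updates)
    hits_xy    -- iterable of (x, y) obstacle detections in meters
    resolution -- cell edge length in meters
    """
    for x, y in hits_xy:
        i, j = int(y / resolution), int(x / resolution)
        if 0 <= i < grid.shape[0] and 0 <= j < grid.shape[1]:
            grid[i, j] += 1  # accumulate evidence of an obstacle
    return grid

grid = np.zeros((100, 100))  # 10 m x 10 m map at 0.1 m resolution
grid = update_occupancy_grid(grid, (0.0, 0.0), [(2.5, 3.1), (2.6, 3.1)])
print(grid.sum())            # 2 detections registered
```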
Machine Vision
Advanced machine vision systems can boost efficiency in manufacturing. We deliver custom solutions built to meet your needs. Factory virtualization, object recognition, object localization, pick-and-place robot control, 3D reconstruction, fully automated 2D and 3D visual inspection – anything goes. Whatever the job may entail, from executing standard tasks with utmost efficiency to designing, implementing, and testing innovative solutions for highly complex challenges, count on us to get the job done. Take advantage of our painstaking attention to detail. Let us team up to analyze and automate production processes to boost their reliability, speed, and cost-efficiency.
Sensor Fusion
Sensor fusion is all about merging data from multiple sensors to obtain an accurate picture of the environment. Cameras, radar, LiDAR, ultrasound, GPS/IMU, and other instruments serve to take diverse readings. One sensor's strengths compensate for another's weaknesses, so the fused data sourced from these disparate sensors improves accuracy, reliability, and coverage. It is possible to fuse raw data as well as preprocessed information, for example, about detected objects. Sensor fusion figures prominently in autonomous driving, robotics, surveillance, and medical device use cases.
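As a simplified illustration of the principle, the sketch below fuses two noisy readings of the same quantity with inverse-variance weighting, so the more reliable sensor gets the larger weight. The sensor values and variances are hypothetical:

```python
def fuse_measurements(z1, var1, z2, var2):
    """Inverse-variance (minimum-variance) fusion of two noisy readings
    of the same quantity. The less noisy sensor gets the larger weight,
    and the fused variance is smaller than either input variance.
    """
    w1 = var2 / (var1 + var2)
    w2 = var1 / (var1 + var2)
    fused = w1 * z1 + w2 * z2
    fused_var = (var1 * var2) / (var1 + var2)
    return fused, fused_var

# Hypothetical example: a radar range reading (accurate in depth) fused
# with a camera-based range estimate (noisier in depth).
fused, fused_var = fuse_measurements(z1=25.3, var1=0.04, z2=24.8, var2=0.25)
print(f"fused range: {fused:.2f} m, variance: {fused_var:.3f}")
```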
Sensor Services
We provide a wide range of sensor-related services. Come to us if you need help selecting, sourcing, and calibrating the right sensors for the given task. Our engineers know how to integrate these assets into legacy systems as well as collect and analyze data. We are also your go-to source for data processing algorithms for cameras, radar, ultrasound, infrared, LiDAR, and other sensors. And we will be happy to provide custom sensor and environment simulations (iVESS) to support your design efforts.
ITK Engineering – your go-to partner for perception systems
Implementation skills
Drawing on a deep well of knowledge and experience with computer vision and deep learning, we combine tried-and-true approaches with cutting-edge methods to put your solution into practice. Our team is here to help you implement your new perception system or improve the one you have now. Let us put the power of our data-driven development practices and state-of-the-art software engineering methods to work for you. Rest assured, our extensive experience with embedded, edge, and cloud technologies will benefit your business.
Simulation expertise
Making the most of advanced simulation methods, we design, develop, and test systems to fit your needs. Industry-leading simulation tools and our proprietary iVESS framework serve to simulate sensors and their surroundings. We have the assets needed to effectively resolve every issue. Call on us if you need assistance selecting sensors, sourcing labeled synthetic training data, or verifying and validating critical scenarios.
Tailored solutions
Our engineering team understands the challenges, technologies, and standards our customers have to contend with in their industries. And we can port this knowledge and our best practices from one line of business to another to benefit your company. Count on us to tailor our collaboration model to provide precisely the support you need. Let us team up to brainstorm a concept, develop the solution, and deploy your perception system to your best benefit.
A look at our reference projects:
Boosting efficiency in end-of-line quality assurance with automated visual inspections
The challenge: Tasked to deliver a visual end-of-line inspection system, our engineers faced some towering challenges: fast cycle times, rigorous requirements for accuracy, and components that come in many variants. They also had to integrate the visual inspection system into the legacy production line. On top of that, the harsh conditions of an industrial environment made it very difficult to conduct robust inspections.
Our solution: First we joined forces with the customer to specify the objectives and requirements. To this end, we described the inspection features, component variants, and the requirements for integrating the system with the legacy equipment. Taking a data-driven, iterative approach, we developed evaluation algorithms for the component variants and then built a retrofittable vision system. This enabled us to respond flexibly and collaboratively to changing conditions and emerging uncertainties. To support the production process, we designed a user interface to visualize anomalous features and added a data storage system to the legacy IT to archive the inspection history. The cycle time requirements were met by scanning the rotating component and processing the data on the fly with streaming algorithms. Our engineers assessed and implemented individual algorithms for each feature to meet the rigorous demands for precision. With the benefit of our powerful data preprocessing, the system operates robustly despite vibrations, adverse light conditions, and other interference factors.
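The sketch below illustrates the streaming idea in a heavily simplified form: each line scan is checked against a reference profile as it arrives, so no full-revolution image has to be buffered. It is not the project's actual code, and all data, names, and thresholds are hypothetical:

```python
import numpy as np

def stream_inspect(line_scans, reference, tol=3.0):
    """Process line scans of a rotating component on the fly.

    Each scan is compared against a per-pixel reference profile; lines
    whose maximum deviation exceeds `tol` (in sensor units) are flagged
    immediately instead of after the full image has been assembled.
    """
    for idx, scan in enumerate(line_scans):
        deviation = np.abs(scan - reference)
        if deviation.max() > tol:
            yield idx, float(deviation.max())  # anomalous line + severity

# Hypothetical data: 1000 line scans of 512 pixels, one injected defect.
rng = np.random.default_rng(0)
reference = np.zeros(512)
scans = rng.normal(0.0, 0.5, size=(1000, 512))
scans[421, 100:110] += 8.0  # simulated surface defect
for line_idx, severity in stream_inspect(scans, reference):
    print(f"anomaly at line {line_idx}, deviation {severity:.1f}")
```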
Added value for the customer: The new visual inspection system enabled the customer to automate end-of-line quality assurance. It also boosts efficiency and improves accuracy to give the company a competitive edge. This versatile, reusable system has since been retrofitted to 25 legacy machines in various working environments and countries.
Development support for ADAS perception
The challenge: Provide enough computing power for real-time data processing. Cope with limited storage capacity. Adapt the system to the target architecture. Develop the software in compliance with ASPICE. Our engineering team certainly faced no shortage of challenges when it set out to develop a real-time sensor system. This complex system would have to be sophisticated enough to process inputs from sensors and components of varying quality while satisfying a host of very diverse requirements.
Our solution: We developed a perception system to sense static objects such as lane markings. It can also reproject 2D image information into a 3D lane model. We also delivered algorithms to enable the vehicle to detect, classify, and track dynamic objects. The tracking algorithm is able to precisely follow movements and predict objects' future positions. For the finishing touch, we optimized the deep learning models for the target platform, tested and fine-tuned the various versions of the model on the target hardware, and integrated multi-task models.
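To illustrate the reprojection step in a simplified form: assuming a pinhole camera with known intrinsics, zero pitch and roll, mounted at a fixed height above a flat road, a lane-marking pixel can be cast onto the ground plane. This sketch is not the delivered algorithm, and all parameter values are hypothetical:

```python
def pixel_to_ground(u, v, fx, fy, cx, cy, cam_height):
    """Reproject an image pixel onto the road surface.

    Assumes a pinhole camera with zero pitch/roll mounted `cam_height`
    meters above a flat ground plane (camera frame: x right, y down,
    z forward). Returns (lateral_x, forward_z) in meters, or None for
    pixels at or above the horizon, which never intersect the ground.
    """
    dx = (u - cx) / fx
    dy = (v - cy) / fy
    if dy <= 0:
        return None      # ray points at or above the horizon
    t = cam_height / dy  # scale that brings the ray down to the ground
    return dx * t, t     # lateral offset and forward distance

# Hypothetical intrinsics and mounting height.
point = pixel_to_ground(u=700, v=520, fx=1000.0, fy=1000.0,
                        cx=640.0, cy=360.0, cam_height=1.4)
print(point)  # (0.525, 8.75): 0.53 m to the right, 8.75 m ahead
```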
Added value for the customer: Pairing proven computer vision algorithms with state-of-the-art AI, we delivered an embedded system that meets the project’s complex requirements. Our extensive experience developing production-ready, real-time perception systems benefits the customer today and for many tomorrows to come: This versatile, future-proof solution features fusion algorithms that can be used for different sensor technologies.
Consulting on a perception system
The challenge: Human-machine interaction has its benefits, but some situations entail a measure of risk. Our engineers teamed up with an employers’ liability insurance association to develop and implement a sensor-based safety concept to prevent accidents in a star batcher. The top priorities were to identify and define critical situations, incorporate standards, and tailor a sensor and alert/warning system to fit.
Our solution: The team started at the top of the to-do list by defining critical situations and identifying the various safety zones. With this information in mind, it then designed a bespoke camera-based sensor system and selected the right components for it. To support these initial steps, it developed a simulation to efficiently test different scenarios, sensor positions, and sensor characteristics. The team first captured the necessary data and then built a deep learning system for detecting people and objects. It added an MLOps pipeline featuring various tools and automated processes to ensure reproducibility and traceability all the way back to the source data. Our proprietary tool tracks detections to check if they are within the defined safety zone. Visual and audible alerts are issued automatically depending on the position within this detection zone.
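As a simplified illustration of the zone check (this is not our proprietary tool; zone shapes, names, and alert levels are hypothetical), a tracked position can be tested against polygonal safety zones with a standard ray-casting test:

```python
def point_in_zone(point, zone):
    """Ray-casting test: is a detected person or object inside a
    polygonal safety zone? `zone` is a list of (x, y) vertices,
    `point` an (x, y) position reported by the tracker.
    """
    x, y = point
    inside = False
    n = len(zone)
    for i in range(n):
        x1, y1 = zone[i]
        x2, y2 = zone[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge crosses the horizontal ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Hypothetical zones: alerts escalate from visual warning to alarm.
WARNING_ZONE = [(0, 0), (10, 0), (10, 6), (0, 6)]
DANGER_ZONE = [(3, 1), (7, 1), (7, 4), (3, 4)]

def alert_level(detection):
    if point_in_zone(detection, DANGER_ZONE):
        return "alarm"    # audible alert
    if point_in_zone(detection, WARNING_ZONE):
        return "warning"  # visual alert
    return "clear"

print(alert_level((5.0, 2.0)))  # alarm
print(alert_level((1.0, 5.0)))  # warning
```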
Added value for the customer: The team designed and delivered a safety concept to automatically detect people and objects in the danger zone. Visual and audible alerts call attention to hazardous situations. This reduces the risk of accidents in the safety zone, minimizing the potential for personal injury and property damage. Tasks and processes have been largely automated in the course of this development effort. Now detection models can be improved on the fly while components remain easy to scale and reuse.
Key Takeaways
Data-driven development and IP transfer
Innovative AI solutions & algorithms
Industry-specific norms & standards
Unsolved challenges? We look forward to your inquiry.