Ar-Control (AKINROBOTICS Control Software)

Ar-Control gives Ar-Core access to the subunits located on the robot (embedded software, image processing, speech, and detection). By bridging the embedded software with Ar-Core, it manages and controls all limbs synchronously and simultaneously.

Its image processing algorithms run on the graphics card and deliver results in real time, faster than human reaction. They perform human detection, recognition, and tracking, as well as object detection, recognition, and tracking. In this way the robot recognizes people and keeps them in a database that it shares with other robots. By measuring the light intensity of the environment, the robot forms the concepts of dark and light, and by processing the map data of the environment with its image processing algorithms it determines direction in real time. Marking restricted areas and walking paths on the map, which is laborious for humans to do by hand, can thus be done in seconds.
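The person recognition and shared-database behavior described above can be sketched as a nearest-neighbor match over face embeddings. This is a minimal illustration, not Ar-Control's actual implementation: the embedding vectors, the Euclidean distance metric, and the `0.6` threshold are all assumptions.

```python
import math

def euclidean(a, b):
    """Euclidean distance between two embedding vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def recognize(embedding, db, threshold=0.6):
    """Return the closest known person, or None if no one is near enough.

    `db` maps person names to stored embeddings; the threshold value
    is illustrative, not the system's real tuning.
    """
    best_name, best_dist = None, float("inf")
    for name, known in db.items():
        d = euclidean(embedding, known)
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name if best_dist <= threshold else None

def enroll(name, embedding, db):
    """Store a newly seen person so the record can be shared with other robots."""
    db[name] = embedding
```

A match below the threshold returns the stored name; an unfamiliar face returns `None` and could then be enrolled for later encounters by any robot sharing the database.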


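The dark/light classification from ambient light intensity mentioned above could, under simple assumptions, reduce to thresholding the mean brightness of a camera frame. The grayscale representation and the `128` cutoff here are illustrative choices, not details of the actual system.

```python
def classify_light(pixels):
    """Classify ambient light as 'dark' or 'light' from mean brightness.

    `pixels` is a flat list of grayscale values in 0..255; the midpoint
    cutoff of 128 is an assumed, illustrative threshold.
    """
    mean = sum(pixels) / len(pixels)
    return "light" if mean >= 128 else "dark"
```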
Thanks to speech detection and voice synthesis algorithms, the robot can hold a dialogue with people. Sounds perceived from the external environment are first passed through the speech detection algorithm and converted into text. This text is then processed to identify concepts such as commands, questions, subjects, objects, and directions in the sentence, which are forwarded to Ar-Core. To make the robot speak, the system passes the text produced by Ar-Core through the voice synthesis algorithms and transmits the result to the sound card.
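The speech pipeline above (transcribed text, concept extraction, hand-off to Ar-Core) can be sketched with a keyword-based parser. The keyword sets, the parsed-message shape, and the `forward_to_core` transport are all hypothetical placeholders, not Ar-Control's real interfaces.

```python
# Illustrative concept vocabularies; the real system's categories
# (commands, questions, subjects, objects, directions) would be richer.
COMMANDS = {"go", "stop", "bring", "look"}
DIRECTIONS = {"left", "right", "forward", "back"}

def parse_utterance(text):
    """Extract command and direction concepts from transcribed speech."""
    words = text.lower().split()
    return {
        "command": next((w for w in words if w in COMMANDS), None),
        "direction": next((w for w in words if w in DIRECTIONS), None),
        "text": text,
    }

def forward_to_core(parsed, send):
    """Hand the parsed concepts to Ar-Core via an injected transport function."""
    send(parsed)
```

For example, `parse_utterance("Please go to the left side")` yields a command of `"go"` and a direction of `"left"`, which `forward_to_core` then passes to whatever transport connects to Ar-Core.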