Data Scientist
Data Adquisition Systems, S.A.
Adam Leno
Rangel Isaías Alvarado Walles is a multidisciplinary Robotics and AI Engineer with deep expertise in embedded systems, autonomous navigation, computer vision, reinforcement learning, and cloud-to-edge AI applications. He holds a Master’s degree (in progress) in Digital Electronics and Automation from the Universidad Tecnológica de Panamá, with specialization in robotics modeling, machine learning, and computer vision.
Currently at Grupo LAFISE, he leads AI and data science initiatives, having developed and deployed advanced models for credit risk, customer reactivation, and resume analysis using GenAI and RAG strategies. He has also built real-time dashboards and full-stack ML pipelines using AWS SageMaker and Amazon QuickSight.
With over 15 years of experience, Rangel has contributed to projects ranging from ROS-based robotic navigation systems to autonomous vehicles, FPGA design, and IIoT deployments. He’s proficient in tools like OpenCV, PyTorch, TensorFlow, FreeRTOS, and microcontroller programming, and has published multiple robotics projects, including full-stack robot simulations and perception pipelines.
He actively explores cutting-edge fields like Embodied AI, VLMs, Multi-Agent Systems, and RL for humanoid and mobile platforms, maintaining a research-oriented mindset backed by hands-on deployment experience across industry and academia.
Session: Visual Perception for Autonomous Navigation
Visual perception for autonomous navigation enables vehicles, drones, and mobile robots to understand their environment using cameras and depth sensors. Key tasks include object detection, semantic segmentation, depth estimation, and visual SLAM for localizing the robot and mapping its surroundings. These visual cues support planning safe paths, avoiding obstacles, and making navigation decisions. Techniques such as convolutional neural networks (CNNs) and sensor fusion improve robustness in dynamic or uncertain environments. Visual perception must operate in real time, adapting to changes in lighting, terrain, and scene complexity, and it is essential for enabling autonomy in applications ranging from self-driving cars to drones and mobile robots, both indoors and outdoors.
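As a small illustration of the depth-estimation and obstacle-avoidance steps mentioned above, the sketch below converts a stereo disparity map into metric depth and flags points closer than a stopping distance. The focal length, baseline, and stopping distance are hypothetical calibration values chosen for the example, not parameters of any specific platform.

```python
import numpy as np

def disparity_to_depth(disparity_px, focal_px=700.0, baseline_m=0.12):
    """Depth from stereo disparity via the pinhole model: Z = f * B / d.

    focal_px and baseline_m are assumed calibration values for this sketch.
    Zero or negative disparities (no stereo match) map to infinite depth.
    """
    d = np.asarray(disparity_px, dtype=float)
    depth = np.full_like(d, np.inf)
    valid = d > 0
    depth[valid] = focal_px * baseline_m / d[valid]
    return depth

def obstacle_mask(depth_m, stop_distance_m=1.0):
    """True wherever a point lies inside the (assumed) stopping distance."""
    return np.asarray(depth_m) < stop_distance_m

# Toy 2x2 disparity map in pixels: larger disparity means a closer point.
disparity = np.array([[0.0, 2.0],
                      [70.0, 140.0]])
depth = disparity_to_depth(disparity)   # f*B = 84 -> depths [inf, 42, 1.2, 0.6] m
mask = obstacle_mask(depth)             # only the 0.6 m point is within 1 m
```

In a real pipeline the disparity map would come from a stereo matcher (e.g. semi-global matching) and the mask would feed the path planner; the geometry above is the same regardless of how disparity is obtained.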
