Projects per year
Project information
Description
The goal is to develop a robotic system that can accurately hand over objects to humans, ensuring precise placement without re-gripping. The project addresses three main challenges: achieving effective scene understanding and situation awareness, accurately recognizing user intentions and high-level reasoning, and teaching and executing collaborative tasks. By focusing on these areas, the project seeks to improve intuitive robot instruction and responsive collaboration in both medical and industrial contexts, ultimately contributing to more efficient workflows and addressing labor shortages in sectors like healthcare and manufacturing.
The project plan focuses on developing methods to determine which objects to hand over, how to place them in the user’s hand, and how to execute these tasks effectively. Initially, the project will use tools like a scalpel and a screwdriver. Hand tracking enables the robot to understand and respond to human gestures, ensuring precise object handovers. This involves the robot interpreting the surgeon’s or worker’s gestures and hand poses to accurately place the tool in their hand without the need for re-gripping. Tool tracking, on the other hand, allows the robot to accurately identify and manipulate these specific tools during collaborative tasks, ensuring that each tool is handed over in the correct orientation and position for immediate use.
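As an illustration of the hand-tracking step, the placement target can be derived directly from the tracked palm pose. The sketch below (hypothetical function and parameter names, not the project's actual code) computes a handover position and handle orientation from a palm position, palm normal, and grasp axis:

```python
from math import sqrt

def handover_pose(palm_pos, palm_normal, grasp_axis, offset=0.05):
    """Sketch: derive a tool placement target from a tracked palm pose."""
    # Offset the tool handle along the palm normal so it rests in the open hand
    target_pos = tuple(p + offset * n for p, n in zip(palm_pos, palm_normal))
    # Orient the handle along the hand's grasp axis (normalized)
    norm = sqrt(sum(a * a for a in grasp_axis))
    target_dir = tuple(a / norm for a in grasp_axis)
    return target_pos, target_dir

# Palm at (0.4, 0.1, 0.9) m, palm facing up, fingers along the x-axis
pos, axis = handover_pose((0.4, 0.1, 0.9), (0.0, 0.0, 1.0), (2.0, 0.0, 0.0))
```

A real system would of course use full 6-DoF poses and a calibrated hand-tracking pipeline; the sketch only shows how tracked hand geometry constrains the placement target.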
By integrating high-level knowledge and explicit commands, the system can predict what tool to hand over and when, based on the context of the task and the workflow. This involves using symbolic domain knowledge and sensor information to understand the user’s intentions and make informed decisions. The system aims to balance personalization with general solutions, learning individual user preferences for gestures, speed, and force parameters, while also maintaining a level of generality that allows it to adapt to different users and scenarios.
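The interplay between symbolic domain knowledge and explicit commands can be pictured as a simple precedence rule: an explicit request overrides the prediction inferred from the workflow step. A minimal sketch, with hypothetical task steps and tool names:

```python
# Hypothetical workflow model: each task step implies a default tool
WORKFLOW = {"incision": "scalpel", "fastening": "screwdriver"}

def predict_tool(current_step, explicit_command=None):
    """Sketch: explicit user commands take priority over workflow context."""
    if explicit_command is not None:
        return explicit_command
    return WORKFLOW.get(current_step)

predicted = predict_tool("incision")                  # inferred from context
overridden = predict_tool("incision", "screwdriver")  # explicit request wins
```

In practice the symbolic knowledge base and intention recognition would be far richer, but the precedence of explicit commands over inferred context is the key design point.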
Further steps in the project include evaluating different motion models to enhance the robot’s performance in various scenarios. This will involve testing and refining the robot’s ability to execute smooth and precise movements, ensuring that it can handle unexpected deviations in the workflow and still perform accurate handovers. By focusing on these aspects, the project aims to improve intuitive robot instruction and responsive collaboration in both medical and industrial contexts, ultimately contributing to more efficient workflows.
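One well-known candidate among such motion models is the minimum-jerk profile, often used as a baseline for smooth point-to-point motion. The sketch below shows its position trajectory; it is offered as an example of a model that could be compared, not as the project's chosen method:

```python
def min_jerk(t, T, x0, x1):
    """Minimum-jerk position at time t for a move from x0 to x1 over duration T."""
    s = t / T  # normalized time in [0, 1]
    return x0 + (x1 - x0) * (10 * s**3 - 15 * s**4 + 6 * s**5)

# The profile starts and ends at rest and passes the midpoint at half-time
start = min_jerk(0.0, 1.0, 0.0, 1.0)
mid = min_jerk(0.5, 1.0, 0.0, 1.0)
end = min_jerk(1.0, 1.0, 0.0, 1.0)
```

Evaluating models like this against recorded human handover motions is one way to test whether the robot's movements read as smooth and predictable to the user.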
Status | Ongoing
---|---
Effective start/end date | 2024/04/01 → …
Subject classification (UKÄ)
- Computer Science
- Computer Vision and Robotics (Autonomous Systems)
- Human-Computer Interaction (Interaction Design)
- Software Engineering
- Computer Engineering
Projects
- 4 Active
- CAISA: CAISA - Collaborative Artificial Intelligent Surgical Assistant
Stenmark, M. (Researcher), Nilsson, K. (Researcher), Phan, K. T. (Researcher), Omerbašić, E. (Researcher) & Johnsson, C. (Researcher)
Swedish Government Agency for Innovation Systems (Vinnova)
2024/09/01 → 2027/08/30
Project: Research
- Get a Grip - Accurate robot–human handovers
Topp, E. A. (PI) & Stenmark, M. (CoI)
2024/01/01 → …
Project: Research
- WASP: Wallenberg AI, Autonomous Systems and Software Program at Lund University
Årzén, K.-E. (Researcher)
2015/10/01 → 2029/12/31
Project: Research