

New blog posts

Completion of Workshop on Water Recycling Simulation and Modelling: Unlocking the Future of Water Management

19 March, 2024 by Charlotte Lee

We are thrilled to announce the successful...

IJITIS Journal Meeting and SWOT Analysis at TULTECH

15 January, 2024 by Charlotte Lee

Greetings, TULTECH community! In our...

A Milestone Meeting for EIL: Shaping the Future of Environmental Industry Letters

15 December, 2023 by Charlotte Lee

Dear TULTECH Community, We are delighted to...



Enhanced 3D vision helped a four-legged robot navigate bumpy terrain

Posted on 3 July, 2023 by benyamin chahkandi


Summary: Researchers at the University of California San Diego have created a novel model that teaches four-legged robots to see in three dimensions more clearly. The development made it possible for a robot to autonomously and easily navigate difficult terrain, including rocky ground, stairways, and paths with gaps.

The scientists will present their findings at the Conference on Computer Vision and Pattern Recognition (CVPR), which will be held in Vancouver, Canada, from June 18 to June 22, 2023.

According to the study's senior author Xiaolong Wang, a professor of electrical and computer engineering at the UC San Diego Jacobs School of Engineering, "by giving the robot a better understanding of its surroundings in 3D, it can be deployed in more complex environments in the real world."


A forward-facing depth camera is mounted on the robot's head and tilted downward, giving it a clear view of both the area in front of it and the ground below it.

To enhance the robot's 3D vision, the researchers created a model that converts 2D images from the camera into 3D space. It does this by examining a short video sequence made up of the current frame and a few prior frames, then extracting 3D information from each 2D frame. This is combined with details of the robot's leg movements, including joint angle, joint velocity, and height above the ground. To work out the 3D transformation between the past and the present, the model compares the data from the prior frames with the data from the current frame.
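
To make the idea above concrete, here is a minimal PyTorch-style sketch of how a relative 3D transformation between a past frame and the current frame could be estimated from depth images plus proprioception. It is an illustration only, not the authors' code: the module names, network sizes, and the 14-dimensional proprioception vector are assumptions made here.

    # Minimal sketch (not the authors' code): estimate the 3D transformation
    # between a past depth frame and the current one, conditioned on
    # proprioception (joint angles, joint velocities, height above ground).
    import torch
    import torch.nn as nn

    class RelativePoseEstimator(nn.Module):
        def __init__(self, proprio_dim=14, feat_dim=128):
            super().__init__()
            # Shared encoder for single-channel depth frames.
            self.encoder = nn.Sequential(
                nn.Conv2d(1, 32, kernel_size=5, stride=2), nn.ReLU(),
                nn.Conv2d(32, 64, kernel_size=3, stride=2), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                nn.Linear(64, feat_dim), nn.ReLU(),
            )
            # Two frame embeddings + proprioception -> 6-DoF relative pose
            # (3 translation + 3 rotation parameters).
            self.pose_head = nn.Sequential(
                nn.Linear(2 * feat_dim + proprio_dim, 128), nn.ReLU(),
                nn.Linear(128, 6),
            )

        def forward(self, past_depth, current_depth, proprio):
            # past_depth, current_depth: (B, 1, H, W); proprio: (B, proprio_dim)
            f_past = self.encoder(past_depth)
            f_curr = self.encoder(current_depth)
            return self.pose_head(torch.cat([f_past, f_curr, proprio], dim=-1))

    # Example: a clip of 4 past frames plus the current frame gives 4 relative poses.
    model = RelativePoseEstimator()
    clip = torch.rand(5, 1, 64, 64)              # 4 past frames + current frame
    proprio = torch.rand(5, 14)
    past = clip[:-1]
    curr = clip[-1:].expand(4, -1, -1, -1)
    poses = model(past, curr, proprio[:-1])      # shape (4, 6)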


The model combines all of this data, enabling it to synthesise earlier frames from the current frame. As the robot moves, the model compares the synthesised frames with the frames the camera actually recorded. If they are a good match, the model knows it has learned the right representation of the 3D scene; if not, it keeps making adjustments until it gets it right.
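
The matching step described above amounts to a self-supervised reconstruction loss: the model synthesises what an earlier frame should look like given the current frame and the estimated transform, then is penalised by how far that guess is from what the camera recorded. The sketch below is a deliberately simplified stand-in (a flat decoder rather than a true geometric re-projection) with hypothetical module names; it is not the paper's implementation.

    # Simplified stand-in (not the paper's implementation): synthesise a past
    # frame from current-frame features plus the estimated relative pose, then
    # compare it with the frame the camera actually recorded.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class FrameSynthesizer(nn.Module):
        def __init__(self, feat_dim=128, pose_dim=6, out_hw=(64, 64)):
            super().__init__()
            self.out_hw = out_hw
            self.decoder = nn.Sequential(
                nn.Linear(feat_dim + pose_dim, 256), nn.ReLU(),
                nn.Linear(256, out_hw[0] * out_hw[1]),
            )

        def forward(self, current_feat, rel_pose):
            # current_feat: (B, feat_dim), rel_pose: (B, 6) -> depth image (B, 1, H, W)
            x = self.decoder(torch.cat([current_feat, rel_pose], dim=-1))
            return x.view(-1, 1, *self.out_hw)

    def consistency_loss(synthesised_past, recorded_past):
        # A good match means the model has learned a useful 3D representation;
        # otherwise the gradient of this loss keeps adjusting it.
        return F.l1_loss(synthesised_past, recorded_past)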

The 3D representation is used to control the robot's movement. By synthesising visual information from the past, the robot can remember what it has seen, as well as the previous motions of its legs, and use that knowledge to guide its current actions.
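
Conceptually, the learned 3D memory then becomes one more input to the walking controller. The sketch below shows one hypothetical way to wire that up, assuming a 128-dimensional memory embedding and a 12-joint quadruped; these details are assumptions made here, not taken from the study.

    # Hypothetical wiring (not the authors' controller): the fused memory
    # embedding and current proprioception drive a small policy network that
    # outputs target joint positions for the legs.
    import torch
    import torch.nn as nn

    class LocomotionPolicy(nn.Module):
        def __init__(self, memory_dim=128, proprio_dim=14, num_joints=12):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(memory_dim + proprio_dim, 256), nn.ELU(),
                nn.Linear(256, 128), nn.ELU(),
                nn.Linear(128, num_joints),   # target joint positions
            )

        def forward(self, memory_embedding, proprio):
            return self.net(torch.cat([memory_embedding, proprio], dim=-1))

    # Example call with random inputs.
    policy = LocomotionPolicy()
    action = policy(torch.rand(1, 128), torch.rand(1, 14))   # shape (1, 12)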

According to Wang, "Our method enables the robot to create a short-term memory of its 3D surroundings so that it can act better."

The new study expands on the team's earlier research, which created algorithms to enable a four-legged robot to walk and run on uneven ground while avoiding obstacles. The innovation here is that the researchers demonstrate the robot can now navigate more difficult terrain than before by enhancing its 3D perception and integrating it with proprioception, a sensory system that includes the senses of movement, direction, speed, location, and touch.

What's exciting, according to Wang, is that the team has created a single model that can manage several demanding conditions: because of its improved grasp of the 3D environment, the robot is now more adaptable to a variety of settings.

The strategy does have certain drawbacks, though. Wang points out that their current design does not direct the robot towards a particular objective or location. When deployed, the robot simply walks in a straight line and, if it encounters an obstacle, dodges it by moving onto another straight line. "The robot does not control exactly where it goes," he said. "We would like to add more planning strategies and finish the navigation pipeline in future work."

The title of the study is "Neural Volumetric Memory for Visual Locomotion Control." Ruihan Yang, from UC San Diego, and Ge Yang, from Massachusetts Institute of Technology, are co-authors.

This work was funded by the National Science Foundation (CCF-2112665, IIS-2240014, 1730158, and ACI-1541349), an Amazon Research Award, and contributions from Qualcomm.

source: www.sciencedaily.com/releases/2023/06/230612200423.htm


Today In History

Here are some interesting events in history that happened on 26 January.

  1. Isaac Newton receives Jean Bernoulli's challenge with a 6-month time limit and solves the problem before going to bed that same night
  2. First settlement established by the English in Australia (Sydney)
  3. Michigan admitted as the 26th US state
  4. Hong Kong proclaimed a sovereign territory of Britain
  5. Louisiana becomes the 6th state to secede from the US
  6. American income tax repealed
  7. General Gordon and his troops slain by Sudanese forces in Khartoum
  8. World's largest diamond, the Cullinan diamond, is found
  9. Indian Republic Day
  10. India becomes a republic, ceasing to be a British dominion
  11. Groundbreaking begins on Disneyland
  12. First woman to serve as personal physician to a US President: J. G. Travell
  13. Israel opens the "Good Fence" to Lebanon
  14. Nelson Rockefeller, former US Vice President and four-time Governor of New York, dies
  15. Islanders and Whalers play a penalty-free game
  16. Islanders score 4 goals within 1:38 and 5 within 2:37 vs the Penguins
  17. Nordiques' Michel Goulet scores on the 9th penalty shot against the Islanders
  18. Chicago Bears defeat the Patriots 46-10 in Super Bowl XX