September 21st, 2021
World’s largest open dataset for the development of self-driving vehicles launched

Self-driving vehicles such as cars, ships and drones offer the potential for reduced costs, lower environmental impact and fewer accidents. Now, a new open dataset from researchers at Chalmers University of Technology, Sweden, sets a new standard for evaluating the algorithms of such vehicles and for developing autonomous transport systems on roads, on water and in the air.

For self-driving vehicles to work, they need to interpret and understand their surroundings. To achieve this, they use cameras, sensors, radar and other equipment to ‘see’ their environment. This form of artificial perception allows them to adapt their speed and steering in a way similar to how human drivers react to changing conditions in their surroundings. In recent years, researchers and companies around the world have competed over which software algorithms provide the best artificial perception. To compare them, they use huge datasets containing recorded sequences from traffic environments and other situations. These datasets are used to verify that the algorithms work as well as possible and interpret situations correctly.
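
As a purely illustrative sketch of this sense-interpret-act idea (the functions, field names and thresholds below are invented for the example and are not taken from Reeds or any real system), the loop amounts to: read sensor data, interpret the scene, then adapt speed and steering.

```python
# Minimal, hypothetical sketch of the artificial-perception loop described
# above. All names and thresholds are illustrative assumptions.

def interpret(camera_frame, radar_returns):
    """Turn raw sensor readings into a simple scene description.
    The camera frame is unused in this toy example; a real system would
    fuse it with the radar data."""
    nearest_obstacle_m = min(radar_returns, default=float("inf"))
    return {"nearest_obstacle_m": nearest_obstacle_m}

def decide(scene, current_speed):
    """Adapt speed the way a human driver slows down for a nearby obstacle."""
    if scene["nearest_obstacle_m"] < 20.0:
        return max(current_speed - 2.0, 0.0)   # ease off
    return current_speed                        # keep going

scene = interpret(camera_frame="frame_0001.png", radar_returns=[55.0, 18.0])
print(decide(scene, current_speed=10.0))        # -> 8.0 (an obstacle is within 20 m)
```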

Open data for researchers and specialists

Now, Chalmers University of Technology, Sweden, in collaboration with the University of Gothenburg, Rise (Research Institutes of Sweden) and the Swedish Maritime Administration, is launching a new open dataset called Reeds, available to researchers and industry worldwide.

The Reeds dataset and more information are openly available online.

The dataset provides recordings of the test vehicle’s surroundings of the highest quality and accuracy. To create the most challenging conditions possible, and thus place greater demands on the software algorithms, the researchers chose to use a boat, whose movements relative to the surroundings are more complex than those of vehicles on land. This makes Reeds the first marine dataset of its kind.

Ola Benderius, Associate Professor at the Department of Mechanics and Maritime Sciences at Chalmers University of Technology, is leading the project. He hopes the dataset will enable a breakthrough in more accurate verification and thereby raise the quality of artificial perception.

“The goal is to set a standard for the development and evaluation of tomorrow’s fully autonomous systems. With Reeds, we are creating a dataset of the highest possible quality, one that offers great social benefit and safer systems.”

The dataset has been developed using an advanced research boat that travels predetermined routes around western Sweden, under different weather and light conditions. The tours will continue for another three years and the dataset will thus grow over time. The boat is equipped with highly advanced cameras, laser scanners, radar, motion sensors and positioning systems, to create a comprehensive picture of the environment around the craft.
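
Purely as an illustration of what such a multi-sensor recording can contain (the schema below is an assumption made for this sketch, not Reeds’ actual data format), one time-synchronized sample might be represented along these lines:

```python
# Hypothetical sketch of one time-synchronized sample from a multi-sensor
# logging platform like the one described above. Field names, units and the
# example values are illustrative assumptions, not the Reeds schema.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class SensorSample:
    timestamp_ns: int                               # common capture time for all sensors
    camera_images: List[str]                        # file paths to the camera frames
    lidar_points: List[Tuple[float, float, float]]  # laser-scanner points (x, y, z) in metres
    radar_targets: List[Tuple[float, float]]        # detected range (m) and bearing (rad)
    imu_angular_rate: Tuple[float, float, float]    # roll/pitch/yaw rates from the motion sensors
    gnss_position: Tuple[float, float]              # latitude, longitude from the positioning system
    gnss_heading_deg: float                         # heading from the multi-antenna GNSS setup

sample = SensorSample(
    timestamp_ns=1_632_218_400_000_000_000,
    camera_images=["cam_front_0001.png"],
    lidar_points=[(12.4, -3.1, 0.8)],
    radar_targets=[(145.0, 0.35)],
    imu_angular_rate=(0.01, -0.02, 0.15),
    gnss_position=(57.687, 11.979),                 # roughly Gothenburg, for illustration
    gnss_heading_deg=212.5,
)
```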

The highest technical standards to open doors to advanced AI

The camera system on the boat contains the latest in camera technology, generating 6 gigabytes of image data per second. A 1.5-hour trip provides around 16 terabytes of image data – significantly more than has been presented so far in competing datasets. It also provides far better conditions for verifying artificial perception in the future.
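
As a rough back-of-the-envelope sketch of what such data rates imply for storage (treating the quoted per-second rate as a peak figure; the volume actually logged on a trip depends on compression and on how continuously each camera is recorded):

```python
# Back-of-the-envelope storage estimate, for illustration only. The peak rate
# and trip duration are the figures quoted in the article; the volume actually
# written to disk depends on compression and on the logging duty cycle.
peak_rate_gb_per_s = 6.0         # camera system output at peak
trip_duration_s = 1.5 * 3600     # a typical 1.5-hour trip

upper_bound_tb = peak_rate_gb_per_s * trip_duration_s / 1000
print(f"Upper bound if logged continuously at peak rate: {upper_bound_tb:.1f} TB")  # ~32 TB
```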

“Our system is of a very high technical standard. It allows for a more detailed verification and comparison between different software algorithms for artificial perception – a crucial foundation for AI,” says Ola Benderius.

During the project, Reeds has been tested and further developed by other researchers at Chalmers, as well as by specially invited international researchers. They have worked on automatic recognition and classification of other vessels, estimation of the boat’s own motion from camera data, 3D modeling of the environment, and AI-based removal of water droplets from camera lenses.

Reeds contributes to both cooperation and competition

Reeds also provides the conditions for fair comparisons between different researchers’ software algorithms. Researchers upload their software to Reeds’ cloud service, where the evaluation of data and the comparison with other groups’ software take place completely automatically. The results are published openly, so anyone can see which researchers around the world have developed the best methods of artificial perception in different areas. Large amounts of raw data will gradually accumulate and be analysed continuously and automatically in the cloud service. Reeds’ cloud service thus provides the conditions for both collaboration and competition between research groups, meaning that artificial perception will improve over time for all types of self-driving systems.
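
Purely as a hypothetical sketch of such an automated submit-evaluate-publish workflow (Reeds’ actual cloud interface is not described here, so every name below is an assumption), the core loop could look roughly like this:

```python
# Hypothetical sketch of an automated evaluation service for submitted
# perception algorithms. All names are illustrative assumptions, not the
# actual Reeds cloud API.
from typing import Callable, Dict, List

leaderboard: List[Dict] = []    # openly published results

def submit(team: str, algorithm: Callable, benchmark_frames: List[Dict]) -> Dict:
    """Run a submitted algorithm on the benchmark data and publish its score."""
    correct = sum(
        1 for frame in benchmark_frames
        if algorithm(frame["camera_image"]) == frame["label"]
    )
    result = {"team": team, "accuracy": correct / len(benchmark_frames)}
    leaderboard.append(result)
    leaderboard.sort(key=lambda r: r["accuracy"], reverse=True)
    return result

# Example: two toy submissions scored on a two-frame benchmark.
benchmark = [
    {"camera_image": "frame_a.png", "label": "vessel"},
    {"camera_image": "frame_b.png", "label": "buoy"},
]
submit("group_alpha", lambda img: "vessel", benchmark)
submit("group_beta", lambda img: "buoy", benchmark)
print(leaderboard)              # both toy submissions score 0.5 here
```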

Image: The three GNSS antennas that measure the boat’s position and direction with very high accuracy. Also visible is the radar, which provides a complete radar view around the entire boat.

More about the research project

The project began in 2020 and has been run by Chalmers University of Technology in collaboration with the University of Gothenburg, Rise and the Swedish Maritime Administration. The Swedish Transport Administration is funding the project.

In 2019, the researchers in the Reeds project published an article describing how a high-quality dataset and a platform for evaluating algorithms could be realised.

A preprint of a forthcoming scientific article describes the technical logging platform and how data from Reeds can be used.
For more information, contact:

Ola Benderius, Associate Professor at Mechanics and Maritime Sciences, Division of Vehicle Technology and Autonomous Systems, Chalmers University of Technology

+46 31-7722086
[email protected]

Joshua Worth
Press officer
+46-31-772 6379
[email protected]

________________

Chalmers University of Technology in Gothenburg, Sweden, conducts research and education in technology and natural sciences at a high international level. The university has 3,100 employees and 10,000 students, and offers education in engineering, science, shipping and architecture.

With scientific excellence as a basis, Chalmers promotes knowledge and technical solutions for a sustainable world. Through global commitment and entrepreneurship, we foster an innovative spirit, in close collaboration with wider society. The EU’s biggest research initiative – the Graphene Flagship – is coordinated by Chalmers. We are also leading the development of a Swedish quantum computer.

Chalmers was founded in 1829 and has the same motto today as it did then: Avancez – forward.