We have applied to take part in this year's RoboCup world championship in Sydney. For our application we wrote a Team Description Paper presenting our research from the last year, and we also produced a video. We are very happy to have been accepted, and we are looking forward to flying to Sydney in July and to the exciting games we will get to play.
We are already busy preparing for the world championship. Over the last few months we have run multiple integration tests in which we tested all of our hardware and software, and the next integration test is already being planned.
Switching from our Minibots to our new Wolfgang robot platform brought some new problems to fight with. On the other hand, it also boosted our motivation, which is why we made a lot of progress both in our software and in the continued development of the Wolfgang hardware platform.
As part of working on our software, we have overhauled our vision pipeline. It is now better able to distribute the work between the CPU and the graphics card and process the results in parallel.
At the moment, multiple bachelor theses are in progress that aim to use Fully Convolutional Neural Networks to detect additional objects on the field. With this method we also want to detect robots and goalposts.
The localization of our robot is currently being completely reworked and improved. For this we use AMCL, a particle filter, which matches the line points found by our vision against the known field layout. A live demonstration video is available at: https://www.instagram.com/p/BtVlmNuFCGd/
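For readers curious what a particle filter actually does: below is a very small toy sketch in Python. This is not our real code (we use the existing AMCL implementation from the ROS navigation stack) and all names and numbers here are made up for illustration; it just shows the basic measure, weight, resample loop on a one-dimensional robot that senses a single landmark.

```python
import math
import random

# Toy 1D particle filter: a robot sits somewhere on a line and noisily
# measures the signed distance to one known landmark (think: a field line).
LANDMARK = 5.0

def measure(true_pos, noise=0.1):
    # Simulated noisy sensor reading
    return (LANDMARK - true_pos) + random.gauss(0, noise)

def weight(particle, z, sigma=0.3):
    # Likelihood of measurement z if the robot were at this particle's position
    expected = LANDMARK - particle
    return math.exp(-((z - expected) ** 2) / (2 * sigma ** 2))

def resample(particles, weights):
    # Draw a new particle set, favoring particles that explain the measurement
    return random.choices(particles, weights=weights, k=len(particles))

def particle_filter(true_pos, steps=50, n=500):
    particles = [random.uniform(0.0, 10.0) for _ in range(n)]
    for _ in range(steps):
        z = measure(true_pos)
        w = [weight(p, z) for p in particles]
        particles = resample(particles, w)
        # Small jitter keeps particle diversity after resampling
        particles = [p + random.gauss(0, 0.05) for p in particles]
    # Estimate: mean of the particle cloud
    return sum(particles) / len(particles)
```

After a few dozen iterations the particle cloud collapses around the true position; the real AMCL does the same in 2D plus orientation, with the line points as measurements.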
In addition to our localization, we are also working on a world model. We aim to fuse the sensor data of multiple robots and track how the ball has moved over time, which allows us to filter out detections, for example when we detect the ball in a place where it couldn't possibly be based on our previous measurements.
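As a rough illustration of this kind of filtering, here is a tiny sketch of a plausibility check that rejects a ball detection if the ball would have had to move impossibly fast since the last measurement. The speed limit and function names are invented for this example and are not our actual world model code.

```python
import math

MAX_BALL_SPEED = 8.0  # m/s, an assumed upper bound for this sketch

def plausible(prev_pos, prev_time, new_pos, new_time, max_speed=MAX_BALL_SPEED):
    """Return True if the new ball detection is physically possible,
    i.e. the implied ball speed since the last measurement is sane."""
    dt = new_time - prev_time
    if dt <= 0:
        return False  # out-of-order or duplicate timestamp
    dist = math.hypot(new_pos[0] - prev_pos[0], new_pos[1] - prev_pos[1])
    return dist / dt <= max_speed
```

A detection 0.5 m away after 0.1 s passes the check; one 6 m away after 0.1 s would imply 60 m/s and gets rejected as a false positive.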
We have completely rewritten our behavior. We developed a new description language that makes further development of the behavior much easier. For this we built our Dynamic Stack Decider and are currently working on publishing a paper about it.
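To give a feeling for the stack idea behind the Dynamic Stack Decider: decision elements can push further decisions or actions onto a stack, and the topmost element is executed each tick. The following toy sketch uses invented class names and does not match the real DSD interface.

```python
# Toy stack-based behavior: a decision element (HasBall) pushes either an
# action (Kick) or another element (SearchBall) onto the stack; each tick
# executes whatever is currently on top.

class HasBall:
    def perform(self, stack, world):
        # Decision: choose a follow-up element based on the world state
        stack.append(Kick() if world["sees_ball_close"] else SearchBall())

class SearchBall:
    def perform(self, stack, world):
        world["log"].append("turning head to search the ball")

class Kick:
    def perform(self, stack, world):
        world["log"].append("kicking the ball")

def tick(stack, world):
    stack[-1].perform(stack, world)

world = {"sees_ball_close": True, "log": []}
stack = [HasBall()]
tick(stack, world)  # HasBall decides and pushes Kick on top
tick(stack, world)  # Kick is now on top and runs
```

The nice property is that popping the stack naturally falls back to the earlier, more general decision when a sub-behavior finishes.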
For our path planning, that is, calculating which route our robot should take, we have switched to move_base.
For our animations and our walking we have switched to splines. Splines let us compute many small intermediate steps between the start and the end of a motion, which makes the movements much less jerky.
We also managed to send our motors significantly more commands per second. This lets us specify in finer detail how each motor should move, and thus achieve more precise movement overall.
We have built new foot sensors into our robots. The additional data they provide lets us stabilise our walking and our animations.
As part of a bachelor thesis we are currently working on a system to calibrate our motors. Small inaccuracies between a measurement and reality lead to large differences in how the robot is actually oriented. With this method, large deviations have already been found and fixed.
For the Symposium of the RoboCup world championship we are planning a few more scientific papers. We will say more about these at a later point in time and upload them to our publications page.
Our code is open source and can be viewed on GitHub:
Before we fly to Sydney, we will take part in the German Open. There we will thoroughly test our hardware and software, and we are excited for the games we will get to play in Magdeburg.
In addition to this blog, Facebook, and Twitter, we will also publish images on Instagram. Our new Instagram page can be found here:
In the last few days, interested teenagers got to try out what it is like to study computer science at our university. As in previous years, we offered one of the three projects the pupils could choose from. During this week, they did not just get a look at our daily lives, but got to experience hands-on what it takes to make a robot play soccer.
Our new friend Cozmo, a robot produced by Anki, joined us for this year's project. Thanks to its intuitive programming environment, it was rather easy for the students to solve various problems and to program and play games with or against Cozmo. Step by step they helped Cozmo reach his dream of becoming a soccer star.
In contrast to previous years, this time we tried a different learning concept, developed by a master's student at our university. In various tasks, the 13 children first learned how to work with the robot and its graphical programming environment before later moving on to a "real" programming language. The main goal was to make Cozmo score a goal on Friday.
Even though this was a challenging task, the students quickly figured out how to use Cozmo's sensors to find the ball in the camera image, calculate its position in the real world, and then decide where the robot should move.
Besides working on the project, the pupils were also keen to see our campus, so we showed them around the labs of TAMS. They got to control a robotic arm with a HoloLens and watch Trixie search for interestingly colored objects.