October 30, 2018 at 1:12 am #29120
When I look at the submission videos, the images seem distorted!
Are we supposed to be getting distorted images? When we drive manually in the gym, the images are not distorted. Are we meant to “undistort” the images ourselves?
Thanks
October 30, 2018 at 1:42 am #29126
Hi Jon, thanks for posting.
That is correct: the evaluator will give you distorted images, whereas using something like manual_control.py inside the gym-duckietown repository will give you undistorted images. This is because we use the gym-duckietown simulator for AIDO, but also want to support outside research efforts. For AIDO, the images from the simulator will always be distorted, since the images from the real Duckiebot are also distorted.
Yes – you will need to undistort them if you want “rectified” images, but we do provide wrappers that do this for you. That said, rectification is an expensive process, so here are some things to think about (depending on whether you use a learning or classical-robotics approach):
1. Do you really need to rectify? A learning system may not care, as long as its input images are consistent.
2. Rectification is expensive and slow (especially on the full image), so point-wise rectification may be what you are looking for if you go the classical route. This is how the original Duckietown stack was written.
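To illustrate the point-wise option: instead of warping the whole image, you can undistort just the few pixel coordinates you care about (e.g. detected lane markings). A minimal sketch of that idea is below, using a plumb-bob-style radial/tangential model with made-up coefficients (the real ones come from your robot's camera calibration file, and the actual Duckietown wrappers handle this for you):

```python
# Hypothetical distortion coefficients for illustration only --
# real values come from the camera calibration (intrinsics) file.
K1, K2, P1, P2 = -0.25, 0.05, 0.001, -0.001

def distort(x, y):
    """Forward plumb-bob model: map an undistorted normalized image
    coordinate (x, y) to its distorted position."""
    r2 = x * x + y * y
    radial = 1 + K1 * r2 + K2 * r2 * r2
    xd = x * radial + 2 * P1 * x * y + P2 * (r2 + 2 * x * x)
    yd = y * radial + P1 * (r2 + 2 * y * y) + 2 * P2 * x * y
    return xd, yd

def undistort_point(xd, yd, iters=20):
    """Point-wise rectification: invert the forward model for a single
    point by fixed-point iteration (cheap compared to warping a frame)."""
    x, y = xd, yd
    for _ in range(iters):
        r2 = x * x + y * y
        radial = 1 + K1 * r2 + K2 * r2 * r2
        dx = 2 * P1 * x * y + P2 * (r2 + 2 * x * x)
        dy = P1 * (r2 + 2 * y * y) + 2 * P2 * x * y
        # Solve xd = x*radial + dx for x, using the current estimate.
        x = (xd - dx) / radial
        y = (yd - dy) / radial
    return x, y
```

Round-tripping a point through `distort` and `undistort_point` recovers the original coordinate to high precision, which is all a classical lane-detection pipeline usually needs per feature point.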
There are ways to make the simulator produce distorted images when you drive around in it (distortion=True), but to support some other ongoing research (separate from AIDO, which doesn’t necessarily need to mimic the real robot), we also allow undistorted images.
Hope this helps!