
When I ran the command “python3 agent.py”, the matplotlib agent didn’t show up. Instead, it gave me an error saying “ModuleNotFoundError: No module named ‘duckietown_slimremote’”.
By the way, I did start the challenge container beforehand. The whole error message looks like this:

[email protected]:~/gym-duckietown-agent$ python3 agent.py
Traceback (most recent call last):
  File "agent.py", line 37, in <module>
    env = gym.make("Duckietown-Lf-Lfv-Navv-Silent-v0")
  File "/usr/local/lib/python3.6/dist-packages/gym/envs/registration.py", line 167, in make
    return registry.make(id)
  File "/usr/local/lib/python3.6/dist-packages/gym/envs/registration.py", line 119, in make
    env = spec.make()
  File "/usr/local/lib/python3.6/dist-packages/gym/envs/registration.py", line 85, in make
    cls = load(self._entry_point)
  File "/usr/local/lib/python3.6/dist-packages/gym/envs/registration.py", line 14, in load
    result = entry_point.load(False)
  File "/usr/local/lib/python3.6/dist-packages/pkg_resources/__init__.py", line 2322, in load
    return self.resolve()
  File "/usr/local/lib/python3.6/dist-packages/pkg_resources/__init__.py", line 2328, in resolve
    module = __import__(self.module_name, fromlist=['__name__'], level=0)
  File "/home/steven/gym-duckietown-agent/gym_duckietown_agent/envs/__init__.py", line 1, in <module>
    from gym_duckietown_agent.envs.simplesimagent_env import SimpleSimAgentEnv
  File "/home/steven/gym-duckietown-agent/gym_duckietown_agent/envs/simplesimagent_env.py", line 5, in <module>
    from duckietown_slimremote.pc.robot import RemoteRobot
ModuleNotFoundError: No module named 'duckietown_slimremote'
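
In case it helps narrow things down, this is the kind of check I would run on the same machine to see whether the package is visible to the Python 3 interpreter that runs agent.py, and how it might be installed. The GitHub URL below is only my guess from the package name, so the README of gym-duckietown-agent should be treated as the authoritative install instructions:

# Check whether the same python3 that runs agent.py can import the module
python3 -c "import duckietown_slimremote; print(duckietown_slimremote.__file__)"

# If the import fails, install the package into this environment
# (the URL is an assumption based on the project name, not confirmed here)
pip3 install git+https://github.com/duckietown/duckietown-slimremote.git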