I think that robots lead a cursed artificial life. They will never have true free will, because their thoughts and their very existence are programmed, and they usually exist to serve living beings. They are also unlikely to ever have consciousness of their own, let alone their own thoughts, feelings, desires, and will, since they exist solely to obey the commands we give them.
Even if a robot could have free will and consciousness, it would most likely never have a soul, so when its artificial life came to an end, it would pass into nothingness and oblivion.
A robot with free will and consciousness sounds cool in fiction, especially in Futurama, but I don't think we can actually build one, and even if we could, it might not turn out the way it does in the stories.
The other thing about robots is that, if all those sci-fi movies are right, we should be careful about creating robots that could one day overrun, take over, or even displace and destroy mankind. If robots can do things that humans can't, and survive things that humans can't, then machines could take over the world if we're not careful. This would make the creation of conscious robots dangerous, unless they could independently figure out right and wrong for themselves, and their moral compass ruled out taking over the world and destroying an entire species (namely us).