I didn't say it wants to do anything. It also doesn't want to build software. But why would it be the case that an AI told to build air traffic control software could successfully do that, while an AI told to make sure planes arrive where they're supposed to, safely and on time, won't be able to figure out the rest?
Now, I'm not saying it's impossible for there to be something that makes the first job significantly easier than the second, but it seems strange to assume that an AI will definitely be able to do the former soon, yet not the latter. I think it's reasonable to believe it will soon be able to do neither, or both; what I don't understand is how we can expect the ability line to just happen to fall between software and pretty much everything else.