It will shift a lot of human effort from generation to review. For example, the core role of an engineer in many ways is already validation of a plan. Well, that will become nearly the only role.
That assumes that the classes of problems that AIs can solve remain stagnant. I don't think that's a good assumption, especially given that GPT-4 can already self-review and refine its output.
It will take a very long time for people to believe and trust AI. That's just the nature of trust. It may well surpass humans in all ways soon, but trust will take much more time. What would be required for an AI-designed bridge to be accepted without review by a human engineer?