Sarah O'Connor (FT) warns that students leaning on AI to get them through their studies may be doing something counter-productive;
Good careers depend on employees being able to demonstrate they add value & if they depend on AI to deliver what they claim to know, why would employers not just 'employ' artificial intelligence directly?
While clinging to AI might be understandable in a time of student stress & anxiety, educators (universities) should help them resist the siren call of AI!
I completely concur!
#AI
Unus Nemo
in reply to Emeritus Prof Christopher May
@Emeritus Prof Christopher May
This reminds me of a discussion with a supervisor I once had, and very much appreciated, where I explained that the internet is not an equalizer. He had drawn a comparison to the Colt (the firearm), advertised as "the great equalizer." I brought up the following point: the Colt is not an equalizer. If one is not trained and disciplined with the weapon, one has no advantage. The same is true of the internet. You can search for answers, but with no prior knowledge of the field you have no way to verify the quality of the knowledge you find. It is not vetted. The same issue comes to a head with AI.
Those of us who work with AI know that it is not always right. Quite often it is very wrong, just as any expert can be. The only way to know you are getting quality output from your AI tool is if you already know something about what it is producing. For example, if I work with Stable Diffusion but know nothing of art or drawing, how do I know whether the AI got the shadowing correct? (I mention this because it typically will not, unless you are very careful with your prompt.) If you are using generative AI for code, you will only know that the code it produces is not inferior if you know how to code yourself. The list goes on.
In conclusion, AI is a great tool for an expert who works in the field. It is not a substitute for learning and becoming an expert yourself.