Artificial intelligence is a buzzword you can find in the daily news, whether the story is about an unsettling AI-generated image or the risks of creating potentially sentient machines.
Sci-fi fans sometimes imagine highly developed AI as an evil overlord, and machines driven by such AI could indeed be exceptionally capable and powerful. But if that day comes, will humans be able to keep a highly intelligent AI under control?
Unfortunately, experts fear that controlling a super-intelligent AI could prove extremely difficult. The reason is simple: once an AI becomes more intelligent than humans, our own cognitive abilities no longer measure up. Unless we can fully understand what such an AI is capable of, we may never be able to rein it in.
One might object that AI is, after all, designed to get along with humans. However, a new paper argues that we cannot build in that kind of regard for humans without a thorough grasp of every scenario a superintelligence could come up with.
Because a superintelligence is multi-faceted, it “may be able to mobilise a diversity of resources to achieve objectives that may be unintelligible to humans, let alone controllable,” according to the researchers.
The team’s conclusion rests on the “halting problem,” which Alan Turing formulated in 1936. It asks whether a computer programme will eventually finish (and halt) or run on forever. Turing demonstrated that while the answer can be worked out for certain programmes, it is logically impossible to obtain it for every possible programme.
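To make that logic concrete, here is a minimal sketch of Turing’s argument, assuming a hypothetical checker called halts; the names are illustrative only, not code from the paper.

```python
# Hypothetical oracle: returns True if running `program` on `data` would
# eventually halt. Turing's 1936 result says no such general test can exist.
def halts(program, data):
    raise NotImplementedError("no general halting test can exist")

# A programme built to contradict the oracle: it loops forever exactly
# when the oracle predicts it would halt on its own source.
def paradox(program):
    if halts(program, program):
        while True:
            pass
    return "halted"

# Asking halts(paradox, paradox) is contradictory either way:
# if it answers True, paradox loops forever; if it answers False, paradox halts.
```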
The same reasoning applies to a programme meant to guarantee that an AI never harms people: the analysis may or may not reach that verdict (and halt), so the guarantee lies beyond human calculation and control. As a fallback for containing artificial intelligence, the researchers propose isolating it from the internet or from certain networks, restricting what an extremely sophisticated system could do.
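The containment argument mentioned above follows the same pattern. Below is a rough sketch in that spirit rather than the paper’s exact construction, with is_harmful and misbehave as purely hypothetical names.

```python
# Hypothetical containment check: simulate `ai_program` acting on `world`
# and return True if it would ever harm humans. By reduction to the
# halting problem, no such fully general check can exist.
def is_harmful(ai_program, world):
    raise NotImplementedError("reduces to the halting problem")

# Stand-in for whatever behaviour the check is supposed to rule out.
def misbehave(world):
    world["harmed"] = True

# An AI constructed to defeat the checker: it misbehaves exactly when
# the checker certifies it as safe.
def adversarial_ai(world):
    if not is_harmful(adversarial_ai, world):
        misbehave(world)

# Whichever answer is_harmful(adversarial_ai, world) gives is wrong, which is
# why a guaranteed "never harms humans" verdict is not computable in general.
```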