Elon Musk, Bill Gates, and Stephen Hawking have all warned of the dangers of Artificial Superintelligence — should we worry too?
Artificial Superintelligence (ASI) is any AI that exceeds human levels of intelligence, even slightly. Crucially, a self-improving superintelligence would be expected to improve itself rapidly: an AI that reaches this threshold could soon be far beyond us. This raises the question: how can we ensure that an ASI's goals remain aligned with our own, even after we lose control of it?
ASI is a hypothetical AI that does not merely mimic or recognize human intellect and behavior; it describes computers that become self-aware and transcend the capabilities of human intelligence. An ASI would potentially be far stronger than us at everything we do, in addition to replicating the multi-faceted intellect of human beings.
ASI would have greater memory and a faster ability to process and analyse data and stimuli. The decision-making and problem-solving capabilities of superintelligent systems would also be far superior to those of human beings.
However, human emotion and creativity are not easy to encode. They remain something incredibly special and unique to us.