Results 1 to 12 of 12

Thread: Future AI Arms Race

  1. #1
    Senior Contributor
    Join Date
    09 Oct 10
    Posts
    1,097

    Future AI Arms Race

    This is a general theory floating around that in the coming years the exponential growth of various technologies will converge to create the fourth industrial revolution. This will lead to changes more dramatic and more rapid than most can imagine, and certainly greater than any we have ever witnessed. You can study trends and history, draw delineations, and argue about starting points, but I am happy to suggest that when historians look back, they won't label last year, or any of the preceding years, as the explosion; I expect that year is very close, though.

    The key technologies probably centre on artificial intelligence and quantum computing. Soon these technologies will allow a select few companies and countries ahead of the trend to handle vast data and solve problems in ways that effectively remove humans from the process. The advantages in the market and in military capabilities are surely vast. It's rational for all parties to assume that this is a zero-sum game, and that the winner can wipe out the majority of the gains historically made by all other parties. This makes me think the USA has the most to lose: the current advantages it holds, even in the relevant future fields, could be lost to other parties piggybacking on US progress, making the key breakthroughs first, and then advancing at an exponential rate off into the distance.

    The extension of the fourth revolution into a zero-sum, winner-takes-all game is probably an exaggeration, although I maintain it's rational for all parties to operate on that assumption. If so, can a strong case not be made for massive and overwhelming investment of defense budgets in this direction?

    I think we can expect the fourth revolution to hurtle us towards the fifth, and surely the country that gets ahead will most likely get there first. I label the fifth Artificial Super Intelligence (ASI) (https://en.wikipedia.org/wiki/Technological_singularity): an AI system billions of times smarter than the sum of all humanity, potentially dangerous to humanity, and obviously revolutionary. Obviously nobody agrees on likely dates, but the median prediction/outright guess by experts in the field is 2040-2060, depending on the survey (https://medium.com/ai-revolution/whe...t-ae5a6f128503). If one nation gets full control of this over others, we should assume it would rule the world. Far more likely is the threat of instability and even war in the years preceding its development, if nations were getting close and one nation felt another was going to get there before it. It's likely there will need to be major international agreements and a global Manhattan Project to allow cooperation and control, democratize this tech, and make it safe before it's developed.

    Looking at the short term and the developing arms race, are countries like China and Russia (especially China) better positioned in the coming years because they are effectively dictatorships of some form, able to direct and control investment and push it down specific corridors? The Chinese are building the world's largest quantum research facility at the moment and hope to build a quantum computer by 2020 millions of times more powerful than all the computers in the world combined. They also have far more control in directing their "private" industry. Private industry also has the potential to make these kinds of breakthroughs first, and that potential grows every year as less capital is required to make tech advances (software, not hardware), companies like Google move towards being trillion-dollar companies, and private industry can easily compete for the world's best minds; soon it will be about the world's best self-learning AI software advancing things further.
    Last edited by tantalus; 02 Mar 18, at 12:46.

  2. #2
    Former Staff Senior Contributor Ironduke's Avatar
    Join Date
    02 Aug 03
    Location
    Minneapolis
    Posts
    11,506
    Call me a luddite, but I'm of Musk's opinion regarding AI - I believe it could eventually pose an existential threat to humanity, whether physically or in terms of taking control of human society.

    Relativistic kill vehicles though, happens to be my favorite doomsday scenario. :-)

    The short-term gains rival countries will see from developing it mean there won't be any measures preventing its progress, so I think its coming creation will no doubt come to pass. Hopefully not in my lifetime. The idea of an artificial intelligence orders of magnitude more intelligent than a human is a pretty frightening scenario, and one of those things, once created, is a genie that can't be put back in the bottle.

  3. #3
    Senior Contributor Monash's Avatar
    Join Date
    01 Mar 10
    Location
    Sydney
    Posts
    1,592
    What happens as/when/if humans physically integrate with A.I. systems? I don't necessarily mean turn into 'Borgs' but rather what happens when human minds initially get real time linkages to the internet then start physically upgrading with hardware implants?
    Last edited by Monash; 04 Mar 18, at 11:25.

  4. #4
    Senior Contributor
    Join Date
    09 Oct 10
    Posts
    1,097
    Quote Originally Posted by Ironduke View Post
    Call me a luddite, but I'm of Musk's opinion regarding AI - I believe it could eventually pose an existential threat to humanity, whether physically or in terms of taking control of human society.
    The message being delivered by people like Musk is that even if the risks are extremely low we need to plan ahead of time and have the safety measures ready. And as we don't know the time frame or how long it will take to solve the problem, we should get on with this soon.

    Quote Originally Posted by Ironduke View Post

    The short-term gains rival countries will see from developing it mean there won't be any measures preventing its progress, so I think its coming creation will no doubt come to pass. Hopefully not in my lifetime. The idea of an artificial intelligence orders of magnitude more intelligent than a human is a pretty frightening scenario, and one of those things, once created, is a genie that can't be put back in the bottle.
    The upside is fairly attractive too, mind you.

  5. #5
    Senior Contributor
    Join Date
    09 Oct 10
    Posts
    1,097
    Quote Originally Posted by Monash View Post
    What happens as/when/if humans physically integrate with A.I. systems? I don't necessarily mean turn into 'Borgs' but rather what happens when human minds initially get real time linkages to the internet then start physically upgrading with hardware implants?
    Interestingly, as we know very little about how to make artificial super intelligence safe, an early theory is that integrating our minds may be the logical pathway in an attempt to ensure that the AI has our values.

    The first step expected by some on becoming human and "machine" will be to have an extension of our brains in the cloud that we can connect to at will, greatly increasing our brain capacity in the same way evolution greatly increased our capacity over our nearest primate cousins hundreds of thousands of years ago, allowing us to develop humour, music and so on. Similar leaps might be expected; consider that you might get funnier, or take up the piano and match Mozart or Beethoven...

    There is no particular reason to expect human beings have arrived at some ceiling (via evolution), be it an emotional, moral or intelligence ceiling. It seems clear that greater intelligence outside of biology will be achieved, but many things should be possible in the other spheres even within our biological bodies.

  6. #6
    Former Staff Senior Contributor Ironduke's Avatar
    Join Date
    02 Aug 03
    Location
    Minneapolis
    Posts
    11,506
    Quote Originally Posted by tantalus View Post
    The message being delivered by people like Musk is that even if the risks are extremely low we need to plan ahead of time and have the safety measures ready. And as we don't know the time frame or how long it will take to solve the problem, we should get on with this soon.

    The upside is fairly attractive too, mind you.
    What is the upside in your book?

    The first step expected by some on becoming human and "machine" will be to have an extension of our brains in the cloud that we can connect to at will, greatly increasing our brain capacity in the same way evolution greatly increased our capacity over our nearest primate cousins hundreds of thousands of years ago, allowing us to develop humour, music and so on. Similar leaps might be expected; consider that you might get funnier, or take up the piano and match Mozart or Beethoven...
    I'd rather keep my mind where it is. I don't particularly care for the idea of a backup of my mind that could be hacked in the cloud. :-) The overexposure of our personal lives attendant with the arrival and dominance of social media as a means of interaction is bad enough. I subscribe to the idea that no stranger should know anything about you, unless you choose to tell them.

    You been watching Altered Carbon?

  7. #7
    Senior Contributor
    Join Date
    09 Oct 10
    Posts
    1,097
    Quote Originally Posted by Ironduke View Post
    What is the upside in your book?

    Everything that has been achieved since we came down out of the trees and straightened our backs, on through hunter-gatherer times and the Stone Age to the present moment, has been built on the bedrock of intelligence. More intelligence facilitates more progress and the opportunity to make a better world, on any metric: medical, environmental, economic, moral, etc. To defeat threats like disease, natural and cosmic disasters, or ageing, and even the darker side of our own natures, we need to think more clearly and to develop better technologies, social and political institutions, and moral frameworks.

    Of course there is a further step: cognitive and emotional enhancement, so that we can be better than what we currently are. Assuming you subscribe to the theory that we are the product of a blind process of Darwinian selection, and not made in the image of God, then there is some room for improvement. Greater intelligence can also help unlock further secrets of life and the universe, and bigger brains might help us understand those answers too, but I would settle for the extinction of cancer and Alzheimer's.
    Quote Originally Posted by Ironduke View Post
    I'd rather keep my mind where it is. I don't particularly care for the idea of a backup of my mind that could be hacked in the cloud. :-) The overexposure of our personal lives attendant with the arrival and dominance of social media as a means of interaction is bad enough. I subscribe to the idea that no stranger should know anything about you, unless you choose to tell them.

    You been watching Altered Carbon?
    I haven't heard of Altered Carbon. I can't say I know enough about it to say whether hacking could be a problem, but I would caution against the assumption that because the mind you have now is the only mind you can have now, it will be worth keeping in the future.
    I subscribe to the idea that no stranger should know anything about you, unless you choose to tell them.
    For now, I would subscribe to your newsletter.

  8. #8
    Former Staff Senior Contributor Ironduke's Avatar
    Join Date
    02 Aug 03
    Location
    Minneapolis
    Posts
    11,506
    Quote Originally Posted by tantalus View Post
    Everything that has been achieved since we came down out of the trees and straightened our backs, on through hunter-gatherer times and the Stone Age to the present moment, has been built on the bedrock of intelligence. More intelligence facilitates more progress and the opportunity to make a better world, on any metric: medical, environmental, economic, moral, etc. To defeat threats like disease, natural and cosmic disasters, or ageing, and even the darker side of our own natures, we need to think more clearly and to develop better technologies, social and political institutions, and moral frameworks.

    Of course there is a further step: cognitive and emotional enhancement, so that we can be better than what we currently are. Assuming you subscribe to the theory that we are the product of a blind process of Darwinian selection, and not made in the image of God, then there is some room for improvement. Greater intelligence can also help unlock further secrets of life and the universe, and bigger brains might help us understand those answers too, but I would settle for the extinction of cancer and Alzheimer's.
    I'm a non-believer, atheist, and a secularist, and yes, I do subscribe to the blind process of Darwinian selection. That the amphibian is but an air-breathing evolution of a fish, the reptile is but an amphibian evolved for dry, arid environments, and the mammal is but a reptile that evolved for arctic/subarctic environments. And that we're just a particularly intelligent type of monkey, which in turn is an arboreal evolution of a rodent.

    Still though, even in spite of my secularist outlook, I do have my reservations about advancements in certain fields. New and different doesn't necessarily mean better. I don't mean to sound vague and banal, but there are things we lose, as far as what it means to be human, the further we develop. The question is, are they worth losing? And we are, after all, only as far away as the reach of a madman in power in the wrong country from destroying the world many times over. If we ever were to develop relativistic space travel, we're a single pilot error, vehicle malfunction, or terrorist attack away from complete planetary annihilation; it would be a one-hit kill for an entire planet.

    I haven't heard of Altered Carbon. I can't say I know enough about it to say whether hacking could be a problem, but I would caution against the assumption that because the mind you have now is the only mind you can have now, it will be worth keeping in the future.
    You need to watch Altered Carbon, if you're a fan of the subject. It's the quintessential show to watch. 10 episodes, it's on Netflix. I prefer The Expanse.

  9. #9
    Senior Contributor
    Join Date
    09 Oct 10
    Posts
    1,097
    Quote Originally Posted by Ironduke View Post
    Still though, even in spite of my secularist outlook, I do have my reservations about advancements in certain fields. New and different doesn't necessarily mean better. I don't mean to sound vague and banal, but there are things we lose, as far as what it means to be human, the further we develop. The question is, are they worth losing? And we are, after all, only as far away as the reach of a madman in power in the wrong country from destroying the world many times over. If we ever were to develop relativistic space travel, we're a single pilot error, vehicle malfunction, or terrorist attack away from complete planetary annihilation; it would be a one-hit kill for an entire planet.
    You should have reservations; to advance quickly could be to risk all of humanity, and all future humanity. Even the remotest risk should weigh very heavily on our minds if what you value is the sum total of human well-being, present and future. There is a lot of potential human experience to be had in the sum of all the future, set against the next 50 years.

    But I mention evolution and its blindness to aid fruitful speculation. Evolution saw fit to arm us with tendencies towards deceit, violence, rape, etc. It gave us no tendency to adapt our immune system to recognise cancer cells as a danger, and no protection against Alzheimer's, because for the majority of our history on earth we did not live long enough to experience them. And even if we had, it's unclear there would have been value in offering such protection, because we would still have had ample time to breed and pass on our knowledge without consuming essential food supplies. So there is no reason to expect we have reached some optimal emotional or conscious state. There is, in theory, a long list worth losing.

    In a few decades it may be as illegal and immoral to drive unimpaired on a public road as it is considered today to do so impaired by alcohol or recreational drugs, because a human driver will be dangerous compared to autonomous AI. Morality has shifted in regard to views of ethnicity, gender and homosexuality. We should expect dramatic shifts as technology and the amount of intelligence on earth grow exponentially in the coming decades.


    Quote Originally Posted by Ironduke View Post

    You need to watch Altered Carbon, if you're a fan of the subject. It's the quintessential show to watch. 10 episodes, it's on Netflix. I prefer The Expanse.
    I will check it out, I am a big fan of The Expanse.

  10. #10
    Senior Contributor
    Join Date
    09 Oct 10
    Posts
    1,097
    The thing is, we are risk averse: to falsely assume there is a lion in the jungle growth and go home is to lose a single meal; to falsely assume there isn't is to lose one's life. On the other hand, we also struggle to assess long-term risk. I think there is a lot of individual variation here too, with some people more prepared to take risks, like many billionaire entrepreneurs and the sailors of the age of exploration. Who would we prefer to be in charge of developing AI? Clearly we will need the dreaded government regulation.

    I think you would like this video.


  11. #11
    Former Staff Senior Contributor Ironduke's Avatar
    Join Date
    02 Aug 03
    Location
    Minneapolis
    Posts
    11,506
    An insightful video, thank you. I wish my thoughts on the subject were well-formed enough that I could discuss it in more depth, but it's one of those things where I have a long way to go before I can really say anything remotely intelligent on it.

    Altered Carbon trailer for anyone who's interested in the subject matter of AI, transhumanism, etc.



    Unrelated to the subject matter in this thread, but since I already mentioned it, a trailer for The Expanse:


