
Thread: The Singularity

  1. - Top - End - #91
    Ogre in the Playground
     
    RPGuru1331's Avatar

    Join Date
    Oct 2008

    Default Re: The Singularity

    Quote Originally Posted by Dragon Raptor View Post
    OK, I apologize for being overly dismissive of climate change. I know it's not that simple, and if nothing is done about it there will be a lot of hardship for people going through the adjustment period. But in the long term, it isn't a civilization ending threat. There's still going to be some kind of technological human society at the end. A singularity could potentially wipe us out permanently.
    Would you prefer "The end of civilization as we know it"? It would destroy the global economy, ruin vast amounts of infrastructure, and kill potentially billions, and certainly millions. Rising sea levels would destroy a great deal of drinkable water, which is already a resource that society - even western society - is running low on. It would destroy everything we know about raising crops by drastically altering weather patterns, and ruin considerably more farmland than it creates.

    Humans would almost certainly survive (provided it doesn't create a nigh permanent self-sustaining cycle and eventually drive the planet to become Venus-like in temperature, I suppose, but I sincerely hope that's a fantasy), and probably create a civilization again. It would be a far lesser one, with a great deal of its capabilities lost, however.

    Granted, we almost certainly need to change, massively, as is, and it will likely not be positive, but not in so drastic a way.

    At any rate, there's a false dichotomy here. We can work towards both problems.
    I actually don't say this to suggest that we shouldn't discuss problems that may very well come up later. However, treating them as a dire incoming emergency, whilst ignoring extant ones, is depressingly common amongst Singularity Enthusiasts; see the above for an example.
    Again: The problem is not that the issue is discussed. The problem is the rhetoric that paints potential problems for technologies we have no idea how to implement as immediate threats, while ignoring the actual immediate problems. And again: The singularitarian wiki is a great example of this, with all sorts of grand, far-future science fiction scenarios for how humanity might end, while just ignoring the elephant in the room (well, not the possibility of MAD; that one they actually did cover).


    I have no idea what that has to do with anything. By "we" I was referring to humanity as a whole.
    Then it is even more wrong. I can see the mythical underpinnings behind the idea of the USA surviving intact, as is, alone, but not one for the entire world.
    Last edited by RPGuru1331; 2012-11-24 at 01:35 AM.
    Asok: Shouldn't we actually be working?
    And then Asok was thrown out of the car.

  2. - Top - End - #92
    Titan in the Playground
    Join Date
    May 2007
    Location
    Tail of the Bellcurve
    Gender
    Male

    Default Re: The Singularity

    And in the unlikely event that global warming doesn't screw up the food supply horribly, there's fairly good evidence that neither the intensity nor the quantity of modern agriculture is sustainable over the long term either. Between soil exhaustion and erosion, aquifer depletion, the exhaustion of phosphate and other natural resources necessary for artificial fertilizer, the continued loss of farmland to urban expansion, and the ever-increasing demand for food imposed by a growing population, the agricultural future is not looking sunny.

    And compromised food supply is not good news for the world.
    Blood-red were his spurs i' the golden noon; wine-red was his velvet coat,
    When they shot him down on the highway,
    Down like a dog on the highway,
    And he lay in his blood on the highway, with the bunch of lace at his throat.


    Alfred Noyes, The Highwayman, 1906.

  3. - Top - End - #93
    Ogre in the Playground
     
    Poison_Fish's Avatar

    Join Date
    Feb 2006
    Gender
    Male

    Default Re: The Singularity

    Quote Originally Posted by warty goblin View Post
    And in the unlikely event that global warming doesn't screw up the food supply horribly, there's fairly good evidence that neither the intensity nor the quantity of modern agriculture is sustainable over the long term either. Between soil exhaustion and erosion, aquifer depletion, the exhaustion of phosphate and other natural resources necessary for artificial fertilizer, the continued loss of farmland to urban expansion, and the ever-increasing demand for food imposed by a growing population, the agricultural future is not looking sunny.

    And compromised food supply is not good news for the world.
    And let's not forget how it also gets compounded by issues in distribution, infrastructure, even the market economy that drives it. It's still just one factor among many if we are even trying to look at carrying capacity on Earth too.
    Last edited by Poison_Fish; 2012-11-24 at 02:06 AM.

  4. - Top - End - #94
    Titan in the Playground
    Join Date
    May 2007
    Location
    Tail of the Bellcurve
    Gender
    Male

    Default Re: The Singularity

    Quote Originally Posted by Poison_Fish View Post
    And let's not forget how it also gets compounded by issues in distribution, infrastructure, even the market economy that drives it. It's still just one factor among many if we are even trying to look at carrying capacity on Earth too.
    Distribution, economy and infrastructure issues are ones that it's not unreasonable to think could be solved by humans. We probably won't do a good job of it, but those worry me less than permanently damaging the world's quite finite supply of arable land - which we can't just go make more of on our own.

    Put slightly differently, one way to look at this is that the entire world economy is propped up by, and has only arisen because of, massive environmental externalities. These resources are not unlimited, however, and their depletion is one of these days going to bite us in the ass. Problems within the economy worry me significantly less than what happens when the ecology that supports that economy can no longer do so. The first is a very hard problem within the system; the other is the foundation of that system breaking, which makes the first problem sort of moot anyway.

  5. - Top - End - #95
    Banned
    Join Date
    Oct 2008

    Default Re: The Singularity

    Quote Originally Posted by RPGuru1331 View Post
    You're really out of your depth if you think your paper indicates massive understanding that is close.
    I think it indicates that we have already introduced very non-standard genes into complex organisms from conception, and the designer-baby industry doesn't even require that- it merely requires the identification of which genes encode which characteristics, coupled with embryonic screening. That technology already exists, already has customers, and is very likely to have serious implications for our society.

    Is there some point to that paper you linked, or were you expecting me to sift it for clues?
    That's because the popular/singularity use calls for artificial, sapient intelligences. And don't try to shrug that off; you were talking about ethics as regards AI. Those things will perhaps be important, but not for a very long time.
    I'm sorry, but you seem to be repeatedly confusing me with someone making different arguments. I am not making definite predictions of post-human AI by the year 2050, or even 2150. I am saying it is likely that sentient AI will be developed eventually, and those ethical problems will not have gone away in the meantime.
    It's not really a chicken and egg argument. If you concede that human population is what permitted that, then you concede that we're basically done with that rapid expansion...
    Excuse me? Over here? Hello? Hi! I'm the guy who's arguing that *current* rates of technological research will probably be more than adequate to bring about large-scale adoption of genetics, nanotech and AI. I think we met at that party yesterday. Oh, you wanted to talk to Kurzweil? Naw, sorry man, haven't seen him.
    It assumes a stagnant past and minimizes advances made in the past.
    I don't believe it is reasonable to assert that rates of technological progress during the Paleolithic were comparable to those during the Classical period, or those again to rates during the Victorian era. Let me say this a third time: Growing populations could make progress accelerate, but I'm not assuming acceleration. I'm just assuming it won't suddenly stop dead.

  6. - Top - End - #96
    Banned
    Join Date
    Oct 2008

    Default Re: The Singularity

    Quote Originally Posted by RPGuru1331 View Post
    AI may very well be such a problem in the future, but I'm referring to something more concrete and current than that. Specifically, Global Climate Change and Peak Oil (or more generally, the need for sustainable energy in quantities that will power our civilization). The latter is almost certainly solvable, provided we act on the solutions. The former, I don't know...
    I was under the impression these were the same problem. The general consensus is that global climate change is being caused by anthropogenic CO2 emissions. Ergo, once we hit peak supply for the various fossil fuels, CO2 emissions will definitionally go down.

    Sure, there are technically vast untapped reserves of these fuels left, but as the expense of extraction increases, nuclear and renewable energy, plus simple conservation/efficiency measures, will become more competitive on purely economic grounds. (This is one area where nanotech is already paying dividends, by improving solar cell fabrication.) The only real doomsday scenario I see here is if we manage to set off trapped undersea methane deposits, and boost global temperatures by several degrees before that transition can happen. I won't say it's impossible, but it doesn't strike me as immediately likely.

    This isn't to say that governments shouldn't sponsor research into alternate energy sources, hybrid cars, hydrogen infrastructure, etc. We absolutely should. A smooth transition away from oil/coal/gas would be far less traumatic than one induced by a series of energy recessions, and there may be significant first-mover advantages to getting into these markets early. But I'm skeptical that our species would be doomed either way, or that further technological research would crawl to a sudden halt.
    Quote Originally Posted by warty goblin View Post
    And in the unlikely event that global warming doesn't screw up the food supply horribly, there's fairly good evidence that neither the intensity nor the quantity of modern agriculture is sustainable over the long term either. Between soil exhaustion and erosion, aquifer depletion, the exhaustion of phosphate and other natural resources necessary for artificial fertilizer, the continued loss of farmland to urban expansion, and the ever-increasing demand for food imposed by a growing population, the agricultural future is not looking sunny.
    Even in the worst-case scenario, you're probably going to be looking at massive greenhouses practicing indoor hydroponics, and sewage treatment to recover phosphates and other minerals. Would that have high initial setup costs? Yes. Would this have a disproportionate impact on the world's poor? Yes. Should we invest in social policies to improve living standards and promote sustainable use of natural resources, and seek to avoid this scenario if possible? Absolutely. But in and of itself, I don't think difficulties with growing food will cause our civilisation to grind to a halt.

    What could happen is that political instability induced by these economic stresses, coupled with major population pressures, could lead to some kind of large-scale nuclear exchange. By and large, though, I think we're over the worst of that. Most countries with access to nukes either have modern economies or are rapidly modernising, so they *should* be able to transition to a post-oil world relatively painlessly. Fingers crossed.

    Hell, the environmental and economic stresses induced by such a crisis could even fuel research into singularity-associated technologies- GM crops for food supply, microfilters for water treatment, AI expert systems for healthcare in poorer nations... oh, right. That's already happening.

  7. - Top - End - #97
    Ogre in the Playground
     
    Poison_Fish's Avatar

    Join Date
    Feb 2006
    Gender
    Male

    Default Re: The Singularity

    Quote Originally Posted by Carry2 View Post
    I was under the impression these were the same problem. The general consensus is that global climate change is being caused by anthropogenic CO2 emissions. Ergo, once we hit peak supply for the various fossil fuels, CO2 emissions will definitionally go down.
    Human emissions will fall. You are forgetting that the ocean contains a huge amount of CO2; as temperature increases, its ability to carry that much decreases. You also get more water being released by polar cap melting. At some point you set off a vicious cycle, so even with reductions, you still have issues. All the CO2 has to be absorbed by something, and with continuous land clearing, you are losing those scrubbers. It's not just a game of balancing human emissions, because the carbon cycle functions on a system beyond that.
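    The vicious cycle described above can be sketched as a toy feedback loop. To be clear, this is a made-up illustration, not a climate model: the coefficients, the 14-degree baseline, and the linear forms below are all invented, purely to show how an ocean-outgassing term can keep atmospheric CO2 rising even after human emissions ramp down to zero.

    ```python
    # Toy sketch of the feedback described above: even if human emissions
    # fall, warming reduces how much CO2 the ocean can hold, so dissolved
    # CO2 returns to the atmosphere. Every number here is invented for
    # illustration; this is NOT a climate model.

    def step(atm_co2, temp, human_emission):
        # Warming scales (crudely, linearly) with CO2 above a 280 ppm baseline.
        temp = temp + 0.001 * (atm_co2 - 280)
        # Ocean outgassing: solubility drops as temperature rises past a
        # (made-up) 14-degree threshold.
        outgassing = 0.5 * max(temp - 14.0, 0.0)
        atm_co2 = atm_co2 + human_emission + outgassing
        return atm_co2, temp

    atm_co2, temp = 400.0, 15.0
    for year in range(50):
        # Human emissions ramp linearly down to zero by year 40.
        human_emission = max(2.0 - 0.05 * year, 0.0)
        atm_co2, temp = step(atm_co2, temp, human_emission)

    print(round(atm_co2, 1), round(temp, 2))
    # CO2 keeps climbing even after human emissions hit zero, because the
    # ocean term alone stays positive once temp exceeds the threshold.
    ```

    The point of the sketch is only the structure: once the feedback term dominates, zeroing the human term no longer stops the rise.
    
    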

  8. - Top - End - #98
    Titan in the Playground
     
    Ravens_cry's Avatar

    Join Date
    Sep 2008

    Default Re: The Singularity

    Indeed. A liquid's ability to hold dissolved gases *decreases* with an increase in temperature, which exacerbates the situation, creating a cycle.
    I don't think it would destroy civilisation, though it would wreak a lot of havoc. We are already seeing effects like the pine beetle infestation in BC, where warmer winters allowed the beetles to increase their range. Climate changes could turn bread baskets into deserts. Shifting ocean currents could make the UK a *lot* colder.
    I doubt it would mean the end of civilisation, it's too gradual for that, but it would do a lot of damage, both to us and the environment, and cause a lot of suffering.
    Last edited by Ravens_cry; 2012-11-24 at 02:39 PM.
    Quote Originally Posted by Calanon View Post
    Raven_Cry's comments often have the effects of a +5 Tome of Understanding

  9. - Top - End - #99
    Banned
    Join Date
    Oct 2008

    Default Re: The Singularity

    I don't mean to dismiss or belittle these concerns, or suggest that we shouldn't avail ourselves of every means possible to offset those dangers. We should. But short of an actual human extinction event, I'm skeptical that such a crisis will prevent, or even seriously slow down, the wide-scale adoption of singularity-associated technologies in the relatively near future. Such technologies might even be part of the solution.


    Last edited by Carry2; 2012-11-24 at 02:55 PM.

  10. - Top - End - #100
    Ogre in the Playground
     
    RPGuru1331's Avatar

    Join Date
    Oct 2008

    Default Re: The Singularity

    I was under the impression these were the same problem. The general consensus is that global climate change is being caused by anthropogenic CO2 emissions. Ergo, once we hit peak supply for the various fossil fuels, CO2 emissions will definitionally go down.
    In short, no. We'd need new power sources even if emissions weren't as huge an issue. That's on top of the issues others raised.

    But I'm skeptical that our species would be doomed either way
    Which is why I don't say that. Humanity will survive, and likely recreate a civilization, even if Peak Oil isn't solved. But it will be extraordinarily painful.

    Even in the worst-case scenario, you're probably going to be looking at massive greenhouses practicing indoor hydroponics, and sewage treatment to recover phosphates and other minerals.
    ...the latter is extraordinarily unlikely (due to the necessary scale of development it would entail, not technologically, IIRC), and the former has the small problem of 'needs fertilizer to continue'. The scale almost certainly needs to drop, which is a problem for things as they are. And both are extremely doubtful if people ignore existing solutions that are less painful to begin with.

    But I'm skeptical that ... further technological research would crawl to a sudden halt.
    or even seriously slow down, the wide-scale adoption of singularity-associated technologies in the relatively near future
    Then you don't really understand how dependent the world is on its globalized economy, nor do you understand how crippled basic research into new fields will be while we try to work out our new niches. Seriously, do you have any idea where the stuff you use on a daily basis comes from? Or where lab materials come from?

    Basic research is *already* under assault as a concept; you don't think it'll be cannibalized entirely in the event of a massive and actual need for results this instant?

    Hell, the environmental and economic stresses induced by such a crisis could even fuel research into singularity-associated technologies- GM crops for food supply, microfilters for water treatment, AI expert systems for healthcare in poorer nations... oh, right. That's already happening.
    If the lattermost is like your AI Scientist, I think we're safe from a robot rebellion for quite some time. The other two aren't threats to survival (although GM foods have other issues), and I suspect desalinization will trump filters for concern; further, you're again ignoring that research into pie in the sky ideas will drop considerably while we find our feet again.

    singularity-associated
    Reminder: The singularity isn't remotely evidence-based until population can be ruled out as a cause of increased technological growth. The fall of birth rates and population growth as quality of life increases is well documented.
    Last edited by RPGuru1331; 2012-11-24 at 04:33 PM.

  11. - Top - End - #101
    Banned
    Join Date
    Oct 2008

    Default Re: The Singularity

    Quote Originally Posted by RPGuru1331 View Post
    In short, no. We'd need new power sources...
    We already have 'new' power sources, in the sense of geothermal, solar and next-gen nuclear power (including, eventually, fusion plants.) They're just not being widely adopted at the moment because of pricing differences. We can also substantially reduce our energy consumption through more efficient infrastructure, reduced consumption, and domestic conservation measures.
    ...the latter is fantasy, the former has the small problem of 'need fertilizer to continue'. The scale almost certainly needs to drop. Which is a problem for things as they are.
    Why is the latter fantasy? We already have sewage treatment plants, and phosphorus reclamation seems to be technically viable.
    Then you don't really understand how dependent the world is on its globalized economy, nor do you understand how crippled basic research into new fields will be while we try to work out our new niches.
    And I am saying that those 'new niches' might well be dependent on nanotech, genetics and AI. Earlier in this thread you argued that the post-Roman Dark Ages, the most famous collapse of civilisation in history, were, contrary to popular belief, not technologically stagnant (despite showing signs of major demographic collapse, abandonment of urban infrastructure and decline of the literate elite). Do you think the next hundred years will somehow do substantially worse?

  12. - Top - End - #102
    Barbarian in the Playground
     
    NecromancerGuy

    Join Date
    Apr 2010
    Location
    Night Vale
    Gender
    Male

    Default Re: The Singularity

    Quote Originally Posted by GenericGuy View Post
    Unfortunately, on forums that don't normally deal with futurology and Artificial Intelligence, arguments against the Singularity go "it's scary and bizarre, I don't like it, and therefore it won't happen," and those for it basically say "it's cool, I want it to happen, and therefore it will happen." I recommend going to sites that deal with this topic more exclusively to get better-thought-out information, but if you really just want random playgrounders' opinions: The Singularity will most probably happen by the end of the 21st century. The rate computer science is progressing, and the reverse engineering of the human brain, mean we will be able to create an artificial mind of human level. Since this mind is artificial, with functions inherently "superior" to an organic one (being able to think at the speed of light, for one, and an eidetic memory), unless you believe human-level intelligence is the limit of possible intelligence (which is extremely doubtful), this new mind can improve itself.
    Unless those who fear change take legal action to prevent the curious ones from researching, it will happen regardless of what a doomsayer says.

    Edit: and totally missed the fact that there were multiple pages already
    Last edited by Astral Avenger; 2012-11-24 at 04:43 PM.
    Avatar by TheGiant
    Long-form Sig

  13. - Top - End - #103
    Ogre in the Playground
     
    RPGuru1331's Avatar

    Join Date
    Oct 2008

    Default Re: The Singularity

    We already have 'new' power sources, in the sense of geothermal, solar and next-gen nuclear power (including, eventually, fusion plants.) They're just not being widely adopted at the moment because of pricing differences. We can also substantially reduce our energy consumption through more efficient infrastructure, reduced consumption, and domestic conservation measures.
    Yes, I'm aware of all that. I'm also aware of the fact that adoption of these things is being actively opposed, or hampered significantly, in a lot of the planet. I did say immediately that this was almost certainly solvable, after all.

    Also, I suspect you sincerely underestimate the problems with nuclear power, given extant freshwater shortages (and the growing number of them in the future), the need for uranium, and active opposition to nuclear proliferation; the latter being well-founded, the former being a problem that will increase as global supply and infrastructure are disrupted.

    Why is the latter fantasy? We already have sewage treatment plants, and phosphorus reclamation seems to be technically viable.
    I edited that because I decided to distinguish between 'impossible for considerable lengths of time' and 'you won't see that happen because of very real human factors'. I'm more than a little skeptical of our will to enact such a solution when very real, much cheaper solutions exist right now.

    And I am saying that those 'new niches' might well be dependant on nanotech, genetics and AI.
    If I'm not being sufficiently clear, let me fix that for you now: this is actual fantasy. The solution to our problems will not lie in 'just around the corner' technology that is a century or more in the future when there are real solutions, right now, that we won't act on.

    And it almost certainly won't be that kind of delayed result when we legitimately need these things yesterday.

    Do you think the next hundred years will somehow do substantially worse?
    By your reckoning, likely they will. I don't argue these times will be stagnant; I said we'll be focused on results now, not on basic research that MIGHT yield things we need after 50 years of time and effort. That's not kind to 'singularity' nonsense at all. New and better applications of existing technology are not stagnancy, but they aren't marching bravely towards the grand new world you imagine to be inevitable.
    Last edited by RPGuru1331; 2012-11-24 at 05:04 PM.

  14. - Top - End - #104
    Banned
    Join Date
    Oct 2008

    Default Re: The Singularity

    Quote Originally Posted by RPGuru1331 View Post
    If the lattermost is like your AI Scientist, I think we're safe from a robot rebellion for quite some time. The other two aren't threats to survival (although GM foods have other issues)...
    You're claiming that genetics, nanotech and AI are pie-in-the-sky concepts, and that research into them will be ruled out by a demand for practical applications - even when I show you that some of the most promising techniques for dealing with the problems of the 21st century are based on these technologies.
    Reminder: The singularity isn't remotely evidence-based until population can be ruled out as a cause of increased technological growth. The fall of birth rates and population growth as quality of life increases is well documented.
    Reminder: I'm not arguing for a singularity in the sense of the overnight appearance of godlike cybernetic intelligences. (Though, if AIs did become the planet's dominant life-form, it seems likely that the earth could sustain a far larger population of their minds than ours, to say nothing of how much more powerful such minds might be. I don't know. Clearly, this has never happened in our experience, in the same way that a meteor impact has never wiped London off the map. But I can't entirely rule out the possibility.)

    I am arguing (in the 'this will probably happen' rather than 'this is inherently a great idea' sense) that the wide-scale adoption of technologies associated with the concept- genetics/nanotech/AI- will have major social implications in the relatively near future.

    I apologise if my phrasing was misleading on this point. But I really need to hammer this point home if this conversation is to continue.

  15. - Top - End - #105
    Banned
    Join Date
    Oct 2008

    Default Re: The Singularity

    Quote Originally Posted by RPGuru1331 View Post
    Yes, I'm aware of all that. I'm also aware of the fact that adoption of these things is being actively opposed, or hampered significantly, in a lot of the planet.
    And I agree that this is a bad thing. I agree that we should take action on that front. We are not disagreeing here. Isn't it wonderful that we don't disagree!

    But even the painful, agonising version of this transition isn't going to suddenly see people throw away their best tools for ultimately solving these problems. I don't want to see that version! But there is no version of this future that involves people voluntarily abdicating from the use of genetics, nanotech and AI, any more than there is a version of this scenario that sees people turning to wooden windmills and charcoal braziers to satisfy their energy demands. If it's a choice between going nuclear or not having central heating, I know which way the world will turn, and the choice between that or nano-fabricated solar arrays is even easier.
    If it I'm not being sufficiently clear, let me fix that for you now; this is actual fantasy. The solution to our problems will not lie in 'just around the corner' technology that is a century or more in the future... I don't argue these times will be stagnant; I said we'll be focused on results now, not on basic research that MIGHT yield things we need after 50 years of time and effort.
    Let me fix this for you now: Nanotech is already building better solar cells and water filters, genetics is boosting crop yields and preventing debilitating diseases, and AI is making medical treatment cheaper and accelerating commercial research. Today. Right now. As we speak. Contemporaneously, even. This is not 100 years in the future. This is news at nine. Yesterday.

  16. - Top - End - #106
    Banned
     
    Terraoblivion's Avatar

    Join Date
    Mar 2008
    Location
    Århus, Denmark
    Gender
    Female

    Default Re: The Singularity

    Uh, I'm not really sure it's terribly accurate to think of genetics, nanotechnology and AI as technologies. They're fields of research that a lot of individual technologies can be and have been derived from, but there's still a lot of very basic research to be done before they even approach something like a technological singularity. Basic research that might not be done when more immediate problems press, except where it might present solutions to them.

    Look at genetics, for example: mostly it comes down to discovering that a gene correlates with something, without any real understanding of why, or why it only does so sometimes. And that's not even getting into how few things are governed by simple, easily identifiable gene sequences, or puzzling questions like why simpler lifeforms like worms have so many more genes than mammals. Basically, we're just starting to dip our toes into genetics, and haven't got a much deeper understanding of the issue than early metallurgists working on perfecting exactly the right mix for bronze, without knowing much of anything about why it worked that way. And it took us 143 years since the discovery of DNA to get that far.

    AI also has the problem that we essentially don't even know what we're trying to make, given that we don't know what intelligence even is or how it works. So I kinda fail to see how research into general fields will lead to some kind of rapture for nerds if we focus on practical applications of those fields.
    Last edited by Terraoblivion; 2012-11-24 at 05:27 PM.

  17. - Top - End - #107
    Banned
    Join Date
    Oct 2008

    Default Re: The Singularity

    Quote Originally Posted by Terraoblivion View Post
    Basically, we're just starting to dip our toes in genetics and haven't gotten a much deeper understanding of the issue than early metallurgists working on perfecting exactly the right mix for bronze, without knowing much of anything about why it worked that way.
    Yeah, but the argument I'm hearing is analogous to early farmers, in response to a need for more forest clearance, abandoning the use of bronze as 'pie in the sky' technology, rather than, say, using bronze axes for the job. Bronze, by itself - just bronze - had a rather significant impact on the society of the time. And embryonic screening, by itself - just embryonic screening - is also likely to be pretty big. I would imagine.
    AI also has the problem that we essentially don't even know what we're trying to make, given that we don't know what intelligence even is or how it works. So I kinda fail to see how research into general fields will lead to some kind of rapture for nerds if we focus on practical applications of those fields.
    Biological evolution also didn't know what it was trying to make, and you could say it was pretty exclusively focused on 'practical applications'. It still got there eventually, and while by all indications we're still pretty low down on the evolutionary ladder of intelligence in terms of creating synthetic imitations, we appear to be catching up orders of magnitude faster.
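    The "blind search still gets there" point can be sketched with a Dawkins-style 'weasel' toy: a (1+1) hill-climber whose "designer" only ever sees a black-box fitness score, never the target itself. One caveat, noted in the comments: unlike real evolution, the fitness function here secretly encodes the goal, so this illustrates cumulative selection rather than truly open-ended search. The target string and parameters are arbitrary.

    ```python
    # Toy (1+1) hill-climber: the caller never inspects TARGET directly,
    # only a black-box fitness score - loosely analogous to selection
    # optimizing for "practical" fitness without a blueprint of the result.
    # Caveat: the fitness function secretly encodes the goal, so this is
    # cumulative selection, not open-ended evolution.
    import random

    random.seed(0)
    TARGET = "intelligence"  # hidden inside the black box
    ALPHABET = "abcdefghijklmnopqrstuvwxyz"

    def fitness(candidate):
        # Black box: reports only how many positions match the hidden target.
        return sum(a == b for a, b in zip(candidate, TARGET))

    current = [random.choice(ALPHABET) for _ in range(len(TARGET))]
    steps = 0
    while fitness(current) < len(TARGET):
        child = current[:]
        # Mutate one random position.
        child[random.randrange(len(child))] = random.choice(ALPHABET)
        if fitness(child) >= fitness(current):  # keep neutral/better mutations
            current = child
        steps += 1

    print("".join(current), steps)
    ```

    Single random mutations plus a keep-if-no-worse rule reach the full string in a few thousand steps, where blind guessing of whole strings would essentially never finish; that gap is the whole argument for cumulative selection.
    
    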

  18. - Top - End - #108
    Titan in the Playground
    Join Date
    May 2007
    Location
    Tail of the Bellcurve
    Gender
    Male

    Default Re: The Singularity

    That genetics and nanotechnology will be important tools for dealing with the mess we've gotten ourselves into I don't think is particularly arguable. If civilization doesn't fold up like a wet towel, these things will be pursued.

    What is arguable is that the pursuit of these fields in that application will somehow run parallel to development of cyber-nano superhumans or singularity type AIs or anything like that.

    If food prices go up, the market for designer babies goes down, because just keeping food on the table eats more and more household income. Having children at all is expensive, and getting worse. Spending a lot of money before getting your child is probably not that attractive a proposition for most people, particularly if their budgets are already stressed. $10,000 to make sure Tina has baby blues and doesn't have genetic disabilities she probably wouldn't have anyway is a hell of a way to blow the downpayment on a house.

    Particularly when you still need the house.

    Ditto nanotech. If the demand is for cheap, clean energy, that's where research will be focused. A 20%-efficient nanotech solar panel doesn't tell you how to install a gun in my middle finger*, or make nanites swim around my bloodstream and fight cancer or regrow my liver or whatever.


    And if AI develops to help with these tasks, AI develops to do that. Not to be general sapient systems.

    I don't think anybody is saying we should abandon these areas of research. Just that the directions we should push them aren't towards singularities or transhumanism. There are better problems to solve than intelligence on a hard drive or keeping me alive and consuming for another hundred years. Our research funding and brain power are finite. We should focus them on the problems that matter.


    *Come on, of course you put your finger-gun in the middle finger.
    Blood-red were his spurs i' the golden noon; wine-red was his velvet coat,
    When they shot him down on the highway,
    Down like a dog on the highway,
    And he lay in his blood on the highway, with the bunch of lace at his throat.


    Alfred Noyes, The Highwayman, 1906.

  19. - Top - End - #109
    Ogre in the Playground
     
    RPGuru1331's Avatar

    Join Date
    Oct 2008

    Default Re: The Singularity

    Let me fix this for you now
    Do you not get tired of aping my expressions?

    But even the painful, agonising version of this transition isn't going to suddenly see people throw away their best tools for ultimately solving these problems.
    The 'best tools' are the ones that actually exist.

    Nanotech is already building better solar cells and water filters, genetics is boosting crop yields and preventing debilitating diseases,
    Nanotech, not nanite von neumann machines. Minor genetic alterations that are only somewhat more effective than crop husbandry, not gene tailoring. Dedicated, extremely limited AI, not sapient systems.

    Yeah, but the argument I'm hearing is analogous to early farmers, in response to a need for more forest clearance, abandoning the use of bronze as 'pie in the sky' technology, rather than, say, using bronze axes for the job.
    And what I'm hearing from you is more like "Guys, we need to figure out this running with scissors problem" to people who don't have steel.

    Biological evolution also didn't know what it was trying to make and you could say was pretty exclusively focused on 'practical applications'
    And it took more than 3 billion years to get there. If you want to claim that kind of timescale for science fiction AI, I won't argue with you. But that's well beyond 'not an imminent problem'.

    Also, it's not strictly true that biological evolution focuses exclusively on practical applications. Most mutations are neutral; i.e., they neither help nor hinder the organism. A lot of selection does happen by chance, without an actual adaptive use; for instance, panda fur has no evolutionary meaning. I suspect, but as I'm not a biologist, can't confirm, that some immediately meaningless adaptations form a stepping stone for later useful ones.

    *Come on, of course you put your finger-gun in the middle finger.
    I'd think the pointer, m'self.
    Last edited by RPGuru1331; 2012-11-24 at 06:50 PM.
    Asok: Shouldn't we actually be working?
    And then Asok was thrown out of the car.

  20. - Top - End - #110
    Ogre in the Playground
     
    Seharvepernfan's Avatar

    Join Date
    Mar 2011
    Location
    Cydonia

    Default Re: The Singularity

    The rich and powerful have always spent their time and money on things that don't benefit the rest of the world at all (or don't benefit it at the moment, anyway). I would be very surprised if there weren't rich people spending a lot of money/brains/time on creating AI/nanotech/genetic engineering/what-have-you right now (if not for years by now).

    Using current technology and practices to fix current problems just isn't going to work. It's not working well enough right now, and it's not going to work well enough in the future when the problems are worse (probably exponentially so). I personally think that unless we get the singularity to happen, we're either going to wipe ourselves out, or we're going to suffer some big problem that sets us back a few hundred or thousand years (which makes us more vulnerable to mother nature's timebomb disasters like yellowstone).

    Trying to guess if the singularity will or won't happen is a moot exercise. We just don't know enough to say. I personally think it will, if it hasn't already, and I think it will within my lifetime - though I'm not too optimistic about it, for myself anyway. It may very well not happen, but I think it will. I think we're much closer than many people realize. I'm not so sure if it's going to be a singularity that we like, however.

    Lordseth said it best in post #45.

    However, I find myself agreeing with Carry2 on the things he's said. (and you're very funny!)

    I can't help but mention that the whole global-warming/climate change stuff is very contentious right now. Many people say that the planet is just naturally going back to an opposite-iceage as it has many times before (actually, earth has been warmer and wetter than it is today for most of its history). I do think that we need to stop polluting and all that, right now.

    I also can't help but mention that the technology for cars to run 1000 miles on a tank of gas exists. Not to mention cars that run on water. Not to mention all the stuff tesla was working on that mysteriously disappeared after his death. The rich and powerful already have technology greater and more energy-efficient than what is in use today, they just don't want us to use it yet. They will wait for oil to run out before they release the better stuff - they're still making money on oil.
    Spoiler: Ironcage Keep
    Show
    Initiative:

    - Leo
    - Enemies
    - Frith (Light, 92 rounds), Obergrym (rage 5 rounds, 14/17 hp), Melrik - CURRENT
    - Enemies
    - Jade
    - Enemies

  21. - Top - End - #111
    Banned
    Join Date
    Oct 2008

    Default Re: The Singularity

    Quote Originally Posted by warty goblin View Post
    If food prices go up, the market for designer babies goes down, because just keeping food on the table eats more and more household income. Having children at all is expensive, and getting worse. Spending a lot of money before getting your child is probably not that attractive a proposition for most people, particularly if their budgets are already stressed. $10,000 to make sure Tina has baby blues and doesn't have genetic disabilities she probably wouldn't have anyway is a hell of a way to blow the downpayment on a house.
    Yeah, but it's a bargain if it allows you to select a subset of her parents' genes statistically associated with a 20-point boost to IQ. (Hopefully without serious side-effects.)

    It's certainly fair to point out these technologies will initially be out of the reach of the most economically vulnerable, which is particularly unfair, and should perhaps be addressed through deliberate government policies to ensure equal access, maybe even on an international basis. But over time, I reckon improvements in automation and economies of scale would make the process pretty affordable. (Getting your genome sequenced used to cost millions- nowadays you can get it done for a few thousand dollars, and that will probably drop at least another order of magnitude by the end of the decade.)
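    As a minimal sketch of the order-of-magnitude argument above (the yearly decline factor is an illustrative assumption, not real sequencing data):

    ```python
    import math

    def years_to_drop(orders_of_magnitude, yearly_factor):
        # Years for a cost to fall by the given number of orders of
        # magnitude, assuming it shrinks by a constant factor each year.
        return orders_of_magnitude * math.log(10) / math.log(yearly_factor)

    # If costs halved every year (a hypothetical rate), one further
    # order-of-magnitude drop would take a bit over three years:
    print(round(years_to_drop(1, 2), 1))
    ```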
    I don't think anybody is saying we should abandon these areas of research. Just that the directions we should push them isn't towards singularities or transhumanism. There are better problems to solve than intelligence on a hard drive or keeping me alive and consuming for another hundred years. Our research funding and brain power is finite. We should focus them on the problems that matter.
    *spreads hands* I don't have any fundamental objection to that idea. I'm just not entirely confident that the fringe-applications of such technologies can be so neatly cordoned off, or that market forces won't push strongly for GNR technologies. A friend of mine was saying recently that electric cars are now becoming market-viable because of advances in battery technology for the mobile phone market. If you can figure out how to use gene therapies to cure asthma, there's gotta be an awful lot of rich old people happy to blow their savings on repairing telomeres. I'm not saying it's right, but that's how the wind blows.
    Last edited by Carry2; 2012-11-24 at 07:08 PM.

  22. - Top - End - #112
    Ogre in the Playground
     
    RPGuru1331's Avatar

    Join Date
    Oct 2008

    Default Re: The Singularity

    It's not working well enough right now,
    True. It's also not being acted on right now, in general; the money is lacking in huge parts of the globe, and the will in less huge ones.

    I can't help but mention that the whole global-warming/climate change stuff is very contentious right now
    Not amongst the relevant experts. It's probably only slightly less entrenched than Germ Theory or Evolution.

    I also can't help but mention that the technology for cars to run 1000 miles on a tank of gas exists. Not to mention cars that run on water.
    These urban myths have been around since before any of us were born. In that time, nobody has presented evidence that they are true.

    Yeah, but it's a bargain if it allows you to select a subset of her parents' genes statistically associated with a 20-point boost to IQ. (Hopefully without serious side-effects.)
    Um, no, that isn't a bargain at all. Your kid could still turn out lazy and waste it, and even if they're hard working could have terrible school systems. 120 IQ is nowhere near as efficient an investment as moving somewhere with good schools first and foremost.

    It's certainly fair to point out these technologies will initially be out of the reach of the most economically vulnerable,
    College is breaking the bank of the majority of americans right now, not just the ones beneath the poverty line, and college gets the benefit of savings funds and scholarships.


    Is *ANYTHING* not a problem of technology to you? You suggested that we just need better advances in agricultural technology to increase the carrying capacity of humans so that population growth could continue exponentially, ignoring everything we know about what drove the population explosion of the 20th century, where it's ending, and why, for instance.
    Last edited by RPGuru1331; 2012-11-24 at 07:18 PM.

  23. - Top - End - #113
    Titan in the Playground
     
    Ravens_cry's Avatar

    Join Date
    Sep 2008

    Default Re: The Singularity

    You can't run a car on water, not without some form of extra energy. Oh, you can break down water and burn the resulting hydrogen, but somewhere you have to pay the laws of thermodynamics, and there is no way to get more energy out of burning that hydrogen than it took to take it apart.

    As for cars that get 1000 miles to the gallon, you could get something with four wheels and a motor that technically carried a person, but it wouldn't be much use as a car. It couldn't carry enough, provide enough protection, or go fast enough to be of much use.
    Tesla was a great inventor, but people give him way too much credit sometimes.
    He still had to obey the laws of physics the rest of us do.
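    For what it's worth, the energy-balance argument above can be sketched with rough numbers (286 kJ/mol is the standard enthalpy of formation of liquid water; the efficiency values are illustrative assumptions):

    ```python
    # Rough energy balance for a "water-fueled" car. Splitting a mole of
    # water costs at least as much energy as burning the hydrogen returns.
    SPLIT_COST_KJ_PER_MOL = 286.0  # minimum energy to electrolyze 1 mol H2O
    BURN_YIELD_KJ_PER_MOL = 286.0  # maximum energy from burning the H2 back

    def net_energy(mols_water, electrolyzer_eff=0.7, engine_eff=0.3):
        # Net energy (kJ) after splitting water and burning the hydrogen.
        # The efficiency defaults are assumed values; even at 100% on both
        # ends, the best possible result is exactly zero.
        energy_in = SPLIT_COST_KJ_PER_MOL * mols_water / electrolyzer_eff
        energy_out = BURN_YIELD_KJ_PER_MOL * mols_water * engine_eff
        return energy_out - energy_in

    print(net_energy(1.0))  # negative: the cycle always loses energy
    ```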
    Quote Originally Posted by Calanon View Post
    Raven_Cry's comments often have the effects of a +5 Tome of Understanding

  24. - Top - End - #114
    Banned
    Join Date
    Oct 2008

    Default Re: The Singularity

    Quote Originally Posted by RPGuru1331 View Post
    Do you not get tired of aping my expressions?
    But they're just so quotable!
    Nanotech, not nanite von neumann machines. Minor genetic alterations that are only somewhat more effective than crop husbandry, not gene tailoring. Dedicated, extremely limited AI, not sapient systems... ...what I'm hearing from you is more like "Guys, we need to figure out this running with scissors problem" to people who don't have steel.
    If I had reason to suspect that steel might be invented before my grandkids' time... yeah, I might be a little worried. (Also, I'd be inclined to give health warnings about substituting arsenic for tin.)
    And it took more than 3 billion years to get there. If you want to claim that kind of timescale for science fiction AI, I won't argue with you. But that's well beyond 'not an imminent problem'...
    ...Also, it's not strictly true that biological evolution focuses exclusively on practical applications. Most mutations are neutral; IE they neither help nor hinder the organism.
    Which is why our progress in these areas has proceeded on the scale of decades, not aeons. We don't have to rely on random mutations once per generation, but conscious experiment by full-time researchers. Will it take time? Certainly. But the timescale is short-term enough that I don't think tentative precautions are entirely out of place.
    I suspect, but as I'm not a biologist, can't confirm, that some immediately meaningless adaptations form a stepping stone for later useful ones.
    Umm... no. It's reasonably well-accepted that only biological adaptations with an immediate value to the organism are going to be selected for over time. It's one of the major weaknesses of biological 'design', as species sometimes get trapped in local minima that seriously inhibit long-term improvement of a given function. But, given enough niches, and enough time, it got there eventually.

    I'm not saying that pure/basic/fundamental research wouldn't speed up the process considerably. Goodness knows I'd vastly prefer an AI designed like LISP to one grown like C++. But the slow accretion of piecemeal features over time based on short-term demands, butt-ugly as the results can be, often proves surprisingly successful.

  25. - Top - End - #115
    Banned
    Join Date
    Oct 2008

    Default Re: The Singularity

    Quote Originally Posted by RPGuru1331 View Post
    Um, no, that isn't a bargain at all. Your kid could still turn out lazy and waste it, and even if they're hard working could have terrible school systems. 120 IQ is nowhere near as efficient an investment as moving somewhere with good schools first and foremost.
    Nonsense. Even a relatively modest correlation between IQ and income, averaged over the probability of lifetime earnings, pays back a 10K investment many times over (particularly if the parents had an average IQ of, say, 80.) I'm not saying that motivation and personality don't contribute substantially to success and well-being, or that environment doesn't play a formative role. But motivation and personality may also have genetic components, and a healthy environment kind of depends on people being good at their jobs.
    College is breaking the bank of the majority of americans right now...
    Actually... I'm of the opinion that much of the backing behind our educational system can only be traced to some kind of en-masse Stockholm syndrome.
    Is *ANYTHING* not a problem of technology to you? You suggested that we just need better advances in agricultural technology to increase the carrying capacity of humans so that population growth could continue exponentially...
    Listen to me very carefully.
    * I am not arguing in moral favour of, or suggesting the probability of, exponential human population growth, nor do my approximate projections for technological development, such as they are, hinge in any way upon this assumption. This may be the fifth or sixth time I have had to make this clear to you, and it is seriously beginning to tax my limited patience.
    * I did suggest that even in the worst-case scenario of the earth's total ecological collapse, population figures comparable to today's or beyond could probably be maintained with existing technologies. I did not suggest that such a scenario was a socially desirable outcome, and indeed I would love to see it avoided. I like trees and frogs and kittens! Honest!
    * Please drive this into your skull: Technological progress can, and very likely will, continue to take place even with a constrained population, even if the fraction engaged in active research drops, and even if they are primarily focused on practical applications. It will not be as fast as in other scenarios, but knowledge will steadily accumulate.

  26. - Top - End - #116
    Ogre in the Playground
     
    RPGuru1331's Avatar

    Join Date
    Oct 2008

    Default Re: The Singularity

    Which is why our progress in these areas has proceeded on the scale of decades, not aeons.
    Given that we've been studying intelligence for more than a century, and still know very little about it, I think you're rather overselling humanity there, as well as our ability to quickly engineer solutions to problems we don't actually understand.

    If I had reason to suspect that steel might be invented before my grandkids' time...
    And I again don't disagree that concern for far-future problems is a positive trait. Treating them as immediately dire in the face of far more pressing problems, though, is another matter.

    Umm... no. It's reasonably well-accepted that only biological adaptations with an immediate value to the organism are going to be selected for over time.
    No, that isn't even a little true. A lot of things persist simply because they're not harmful. Whales still have hips, long after those were outmoded. And Climbing Mt. Improbable isn't really about that. Even if it were, showing that some common complex organs did in fact offer immediate iterative improvements at each step doesn't demonstrate that every adaptation arose from such improvements. Falsification: Doesn't work that way.

    But the slow accretion of piecemeal features over time based on short-term demands, butt-ugly as the results can be, often proves surprisingly successful.
    Unsurprisingly, that wasn't a demonstration of how pure practical research actually generates an amazing outgrowth from prior work.

  27. - Top - End - #117
    Banned
    Join Date
    Oct 2008

    Default Re: The Singularity

    Quote Originally Posted by Seharvepernfan View Post
    The rich and powerful have always spent their time and money on things that don't benefit the rest of the world at all (or doesn't benefit it at the moment, anyway). I would be very surprised if there weren't rich people spending a lot of money/brains/time on creating AI/nanotech/genetic engineering/what-have-you right now (if not for years by now).

    Using current technology and practices to fix current problems just isn't going to work.
    While I agree that GNR are as likely to be playthings of the rich as they are to be applied to the problems of the poor, I would agree with RPGuru1331 that these problems could probably be technically solved with existing technologies and practices, if the political will were there. My (fairly conservative) projections for the adoption of genetics/nanotech/AI aren't prescriptive, but descriptive- I think this is what probably will happen, not that it ideally ought to. However, I would make the prescriptive assertion that any policy for handling G/N/R research, however well-intentioned, needs to be cognizant of the economic/psychological forces which make their adoption likely.
    However, I find myself agreeing with Carry2 on the things he's said. (and you're very funny!)
    Thank you! Not sure about climate denial and illuminati stuff! But thank you!

  28. - Top - End - #118
    Ogre in the Playground
     
    RPGuru1331's Avatar

    Join Date
    Oct 2008

    Default Re: The Singularity

    Even a relatively modest correlation between IQ and income,
    When you're relying on The Bell Curve to make your argument, u r doin it wrong. And that doesn't establish that it's a bargain compared to proper schooling, which has a much stronger, and considerably better established, correlation; further, moving somewhere with a good school system is a benefit to all children, not just one.

    Less flippantly, most of the findings you cite don't show a very large correlation, and don't change the fact that, since it's done prior to birth, the expense comes at the point in a couple's life when they have the least to spend. There's also the problem that this will magnify structural inequalities, but that's another matter.

    Actually... I'm of the opinion that much of the backing behind our educational system can only be traceable to some kind of en-masse stockholm syndrome.
    Oh lord. One is hokum. The other is a comic that lists a real problem, in exaggerated form*. It's irrelevant, however, to the point I made. Regardless of its quality, college is a very real barrier that helps keep people who can't afford it out of jobs that pay decently. And it's already expensive enough without paying 10k for a 1/16 or less chance at a relevant income increase.

    nor do my approximate projections for technological development, such as they are, hinge in any way upon this assumption.
    120 years to tepidly making possible phenotypic adjustments in yeast
    100 years from there to Gattaca babies
    Really. No assumption of exponential growth whatsoever.

    * Please drive this into your skull: Technological progress can, and very likely will, continue to take place even with a constrained population, even if the fraction engaged in active research drops, and even if they are primarily focused on practical applications. It will not be as fast as in other scenarios, but knowledge will steadily accumulate.
    Congratulations on repeating after me, I guess. I don't recall implying that these things would never, ever, ever be issues.

    And you've again missed that the point is, you suggested technological solutions to a human problem, and that you consistently show you will do so.

    *Though in classic SMBC fashion, continues denigrating the humanities. A rather large problem of nerds in general.
    Last edited by RPGuru1331; 2012-11-24 at 08:55 PM.

  29. - Top - End - #119
    Banned
    Join Date
    Oct 2008

    Default Re: The Singularity

    Quote Originally Posted by RPGuru1331 View Post
    No, that isn't even a little true. A lot of things continue because they're not harmful.
    If the body produces it, it consumes resources, and that which consumes resources without utility is harmful to the organism. But I'm straining to see what this has to do with the question, since human researchers surely aren't less capable of speculative experiment.
    Unsurprisingly, that wasn't a demonstration of how pure practical research actually generates an amazing outgrowth from prior work.
    It doesn't. But my point is that even inelegant approaches tend to get there eventually. Remember: I'm giving a rough timescale of anywhere between a decade (if some peerless genius has a sudden epiphany under ideal research conditions) and a millennium (through brute force and ignorance, trial-and-error in some post-doomsday bunker) for sentient AI. I can't really rule out either possibility, but I'd prefer to cover my bets.

    Major financial corporations already use electronic stock trading to maximise revenues, and there's evidence this is contributing to long-term market instability. The algorithms involved aren't even all that sophisticated, and they're already capable of doing damage, given the parameters imposed by their makers. I'm scared by what might happen when and if some wingnut crams a Bayesian-hybrid logic engine with the precepts of Hayek and tells it to fill the national coffers 20 years from now.

  30. - Top - End - #120
    Banned
    Join Date
    Oct 2008

    Default Re: The Singularity

    Quote Originally Posted by RPGuru1331 View Post
    Oh lord. One is hokum. The other is a comic that lists a real problem, in exaggerated form*. It's irrelevant, however, to the point I made. Regardless of its quality, college is a very real barrier that helps keep people who can't afford it out of jobs that pay decently. And it's already expensive enough without paying 10k for a 1/16 or less chance at a relevant income increase.
    Let us, very conservatively, assume that the average person makes 32K per year, works for 30 years, and that a 20 point IQ boost correlates to, on average, a 2K increase in salary. That works out to 60K extra in earnings, for a one-time cost of 10K at birth, for a kid that will likely cost you well upwards of 10K per year for 20 years regardless. And this is assuming that costs don't come down drastically by this point, or that robots haven't taken most of the jobs where IQ doesn't matter.
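    The arithmetic above checks out; a quick sketch, with an illustrative discount rate added for comparison, since the post's totals are undiscounted:

    ```python
    # Back-of-envelope check of the figures in the post above.
    salary_boost = 2_000   # assumed yearly gain from a 20-point IQ increase
    years = 30
    upfront_cost = 10_000

    undiscounted = salary_boost * years
    print(undiscounted)  # 60000, matching the post's total

    # Discounting future earnings (rate chosen arbitrarily here) shrinks
    # the gain, but it still comfortably exceeds the 10K upfront cost.
    rate = 0.05
    present_value = sum(salary_boost / (1 + rate) ** t
                        for t in range(1, years + 1))
    print(round(present_value))  # roughly 30-31K
    ```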

    Again, I realise that these financial costs aren't nothing at the moment, which is why I advocate state intervention to make these services universally available. Which you would know, if you weren't apparently reading somebody else's posts.
    120 years to tepidly making possible phenotypic adjustments in yeast...
    The human genome project wasn't even finished 10 years ago, and there's already a serious prospect, for better or worse, of eliminating Down's Syndrome. You need to extract your head from your ass.
    Last edited by Carry2; 2012-11-24 at 09:40 PM.
