HN Theater @HNTheaterMonth

The best talks and videos of Hacker News.

Hacker News Comments on
Welcome to Life: the singularity, ruined by lawyers

Tom Scott · YouTube · 16 HN points · 27 HN comments
HN Theater has aggregated all Hacker News stories and comments that mention Tom Scott's video "Welcome to Life: the singularity, ruined by lawyers".
YouTube Summary - Or: what you see when you die.

If you liked this, you may also enjoy two novels that provided inspiration for it: Jim Munroe's Everyone in Silico, where I first found the idea of a corporate-sponsored afterlife; and Rudy Rucker's trippy Postsingular, which introduced me to the horrifying idea of consciousness slums.

Hacker News Stories and Comments

All the comments and stories posted to Hacker News that reference this video.
> One day a robot is going to be walking around the world. Will it have to pay someone every time it glimpses a video / book / logo / car design / etc?

Looking at all of the people happily deciding to interpret copyright as widely as possible is kinda horrifying in this context:

> Your stored mind contains sections from 124,564 copyrighted works. In order to continue remembering these copyrighted works, a licensing fee of $18,000 per month is required.

> Would you like to continue remembering these works?

> [you have insufficient funds to pay this licensing fee]

> Thank you. Please stand by.

> [Copyrighted works are being deleted]

> Welcome to Life. Do you wish to continue?

Yes exactly :)

Somehow I trust the law is going to work out ok though, despite all the hot takes from people who don’t really understand ML or copyright.

It’s scary to see so many people not understanding the distinction between an input and an output.

Oct 18, 2022 · natch on AI Data Laundering
So, an AI Karen?

>the generative AI system censors any output containing a category of error

I can see the ruleset now…

"This picture of a woman is revealing hair, so it must be censored because it is objectionable to some people and we must respect all people whose beliefs are guided in a sanctioned way."

"This picture shows unadulterated fun, which must be censored because…"


Or more to the point, “this picture contains copyrighted material, which must be censored because…”


I tried to be as general as possible.

The training data for a self-censorship neural network could be as robust as any given society would like.

An algorithm based on self-censorship of generated output wouldn’t require censorship within the training data for the generative neural network.

I can imagine some other advantages to that approach.
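The distinction being drawn above (filtering generated output rather than scrubbing the training data) can be sketched in a few lines. This is a hypothetical illustration only: `generate`, `violates_policy`, and the ruleset are all made-up stand-ins for a real generative model and a real policy classifier.

```python
# Hypothetical sketch: censor at the output stage rather than in the
# training data. `generate` and `violates_policy` stand in for a real
# generative model and a real policy classifier; both are invented here.

def generate(prompt):
    # Placeholder for a generative model's output.
    return f"an image described by: {prompt}"

def violates_policy(output, ruleset):
    # Placeholder policy check: flag output matching any banned phrase.
    return any(term in output for term in ruleset)

def generate_censored(prompt, ruleset):
    output = generate(prompt)
    if violates_policy(output, ruleset):
        return None  # suppress the output; the training data is untouched
    return output

ruleset = {"copyrighted material"}
print(generate_censored("a sunset", ruleset))              # passes the filter
print(generate_censored("copyrighted material", ruleset))  # suppressed
```

The point of the sketch is that the ruleset lives entirely on the output side, so the training corpus for the generator never has to be censored.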

But literally (and I use that word literally) none of the pictures contain copyrighted material.
I don't know how people can make these strong statements about anything in law.

Disney have won cases in court where an artist has drawn their own version of Mickey Mouse; similarly, try writing a story about some kids in a wizard school and you need to be extremely careful not to violate (or at least not get taken to court over) Harry Potter's copyright.

I'm pretty certain image production models have produced some images which would very likely be judged to violate copyright (a much less strong statement).

You are confusing copyright with trademark. Or, provide a link showing the images case was decided on copyright issues, and I’ll reconsider my position.
Sep 02, 2022 · gpvos on Xkcd: Universe Price Tiers
Relevant Tom Scott video: Welcome to Life.
> They will keep changing the narrative ad infinitum until everyone is dependent on them at birth.

Welcome to life:

Jul 18, 2022 · jerf on Crimes against transhumanity
"The new feudal aristocracy would be the ones running the demons."

The question of ownership is huge and I think still underaddressed, even after some good work like the referenced Lena, or Tom Scott's "Welcome to Life: the singularity, ruined by lawyers".

I was realizing a couple of weeks ago that if you do believe all the materialistic things cstross outlines, a solid argument can be made that it may never be rational to upload your mind, on the grounds that there is simply no circumstance I can imagine where you will "own" your own substrate. Arguably, owning your computational substrate is a fundamental aspect of life that we take for granted today. (Or, if you won't consider what you have now "ownership", at least nobody else does either.)

But there is no circumstance in the foreseeable future in which anyone can own their substrate. For a while I thought a rich person could do it, but then I thought about the entire supply chain involved in creating any brain scanner and subsequent computational device, and the amount of hardware left out in the real world where others can affect it, and I realized, you just can never assume that you are doing anything other than flinging yourself irrevocably into a locked box that is not under your control. Even if a rich person thinks they funded the entire project, brand new software, brand new hardware, there are thousands if not millions of points where either mistakes or deliberate sabotage for control may have occurred. How can they be sure? All mechanisms can be corrupted, and the interests are certainly there to do so, massively so!

Everything people do today to control you (advertising, censorship, social pressures, everything) will only be amplified and combined with new techniques if you are running on human-comprehensible hardware that can be affected by any intelligence organization, anybody in the supply chain with an axe to grind, etc.

I seriously cannot think of what kind of assurances it would take to fling yourself into that. About the best you can do is hope the math works out so that it's better to keep you happy, that it's not sadists pulling your strings, and that while you are working for someone else's interests they at least don't inflict massive amounts of pain on you just for fun. And the edited, redacted, sanitized, loyal version of yourself that happens to survive at least enjoys their servitude.

In the 1980s and 1990s, it was easy to at least imagine that maybe you'd own your own hardware. Today we can't even keep ownership and control of our mobile phones. How do you expect to have any ownership of something that makes your cell phone look like Dr. Nim?

So, there's my proposed answer to cstross: Some form of absolute right of ownership of computational substrate for all sentient beings, with the responsibility of providing that to others being absolute, even if that means some technology becomes impractical or infeasible, even at great cost. There's a prisoner's dilemma aspect to this; obviously everybody wants ownership of their own substrate but the rewards for defecting can be enormous... at least in the short term.

"any more than the life of the reader of a novel becomes a derived work of its story."

I fear that is only a limitation of our current tech, rather than a bedrock foundation of our law: (see around 2:00, though the whole video is necessary for context)

With current trends of surveillance capitalism and nickel-and-diming for every single useful feature, I can only imagine mind-uploading getting closer to an eternal torture than a pleasant afterlife.

Some interesting resources that explore this view, which also happen to fill me with existential horror at the prospect of being uploaded (especially against my will):

Tom Scott's video "Welcome to Life: the singularity, ruined by lawyers":

qntm's short story "Lena":

I can especially recommend "Lena". It's from the 'I Have No Mouth, and I Must Scream' genre of horror.

Tom Scott imagined something similar 8 years ago, see
Tom Scott's speculative fiction about the end game of a DRM-based future:

"Welcome to Life"

"The Artificial Intelligence That Deleted A Century" is another similar one from Tom.

You are basically describing this old (2012) video, where this concept is applied to a brain-upload afterlife:
Brain upload afterlife will be a lot more like
This reminds me of Tom Scott's "Welcome to Life: the singularity, ruined by lawyers" video [1]


This was fantastic.

When I worked on Microsoft's DRM products in the early 00s I came to the sudden realization that if we had AR glasses in the future they could block out things from our vision and replace them with adverts. Ugh. It will come true.

Also see the other video Tom Scott made on this topic:
Fantastic (and scary), thank you.

These media fingerprinting databases are truly dystopian. I have come across a wide range of media recently that is being erased from our culture this way. The media giants say "don't pirate it, you can buy/rent it from us". But - and this is a big but - what if you can't rent/buy it? What if the media giant just adds the content's fingerprint to the database but then erases or locks away all copies of it.

Now you cannot upload it to any site with any serious audience. I have videos that the networks have locked away, yet I cannot upload them to YouTube/Vimeo or anything similar because they are flagged. I cannot self-host them because the bandwidth requirements would kill me. Sure, they probably exist on places like Freenet, but that is essentially inaccessible for most mortals.

Once all the people that know of these things die off we'll only be left with whispers on the Web where such things are mentioned as having once existed, but can never be seen.

Never heard of a piratebox before?
I have now ;)
An interesting example: CBS pulled Star Trek from Netflix in Germany, but because they licensed Paramount+ to Sky, which launches next year, now some of the shows aren't available at all.
Yep, another example that I was thinking of. There is no way for you to get that content legally right now, although it will be available in the future (allegedly).
Did you mean this:

The child's doodles make it seem like a lighthearted thing. It is not.

A grownup is attempting to explain some fairly dark things to a very young child. Mercifully, the child cannot absorb what it is being told. You may not be so lucky. But there is some biting social commentary and the absurdity of the whole thing makes it work.

There is certainly the disturbing possibility that "Welcome to Life" [ ] will rapidly be looking less and less like satire.
For videos, this should work... added to your .bashrc file, which then needs to be reloaded (only once) by typing source .bashrc in your home directory from the command line:

    alias yt='youtube-dl --recode-video mp4'
On my system I have a funky python setup so I need to use this instead:

    alias yt='unset PYTHONPATH; youtube-dl --recode-video mp4'
I don't see the need for interactivity because the only format I ever want is mp4.

For audio, I would just do a separate alias with a different name. For example (edit: tested):

    alias yta='youtube-dl --extract-audio --audio-format mp3'
Invoke each with the URL. For example:



If a YouTube URL has extra parameters separated by &, delete those before invoking. For example, the URL below has extra parameters because it came from a playlist:
should be changed to this before using it to attempt a download:
Or you can use the full original URL, but then you need to protect it from interpretation by the shell by surrounding it with single quotes when invoking.

    yta ''
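The advice above about trimming everything after the first & can also be folded into a small wrapper so no hand-editing of the URL is needed. This is a sketch, not part of the original comment: `yt_clean` is a made-up name, and the wrapper assumes youtube-dl is on your PATH.

```shell
# Hypothetical helper: print the URL with everything from the first '&'
# removed, so stray playlist parameters are dropped automatically.
yt_clean() {
  printf '%s\n' "${1%%&*}"
}

# Wrapper around the alias above: clean the URL, then download as mp4.
# Assumes youtube-dl is installed.
yt() {
  youtube-dl --recode-video mp4 "$(yt_clean "$1")"
}
```

Note the URL still needs single quotes at invocation time (an unquoted & would background the command before the function ever sees it); the helper just saves editing the URL by hand.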
> I don't see the need for interactivity because the only format I ever want is mp4.

Then this tool isn't for you. If you're gonna skip the whole point of the tool, you can't claim your code suggestions are equivalent.

True! I should have made it more clear my examples weren't providing interactivity, I guess you're saying. OK.
Apr 29, 2019 · 2 points, 0 comments · submitted by CraneWorm

(This is a relevant YouTube video to the discussion, not spam.)

See Tom Scott’s Welcome to Life. [0]


If you were offered the chance to become as intelligent as Feynman but only if you turn over a copy of all your future thoughts to the private business that made this possible, would you take it? Something non-invasive, like a computer that images your brain while you sleep.

If we are machines, then it seems reasonable that within a few millennia we might have this capability. I'm trying to think up some scenarios where it might be reasonable to turn over your brain to a business.

The ability to become "immortal" would probably be enough for most people. Is your disembodied mind still you? Do you feel an attachment to the idea of preserving it? What about modifying it?

Where it gets really strange is if you think of your wife or husband making a backup of their brain, then running a simulator on that backup. I.e. giving it "life" by letting the simulated neurons fire. Does that count as thinking? What if you can see those thoughts? What if you can communicate? Would you love them just as much as you love your SO?

Probably useless questions, but I can't help but wonder.

(I've expanded my comment since it was posted; the "no"'s are in response to the first question.)


> If you were offered the chance to become as intelligent as Feynman but only if you turn over a copy of all your future thoughts to the private business that made this possible, would you take it?

"And, my students, if we equate both expressions, then, ... ahem, by the way, do you know how easy it is to get a brain extension from Acme Corp?"

Honestly, I think it's more worth focusing on the near term ethical problems of such technology rather than the far off ones. So let's say that this project does pan out, and three years down the road we determine a programmer with one of these neural interfaces installed can open 20 terminal windows at the same time. With a special set of emacs/vim bindings they can bang out code 100x faster than regular programmers. So now companies are sponsoring their programmers to get the surgery.

Sounds like a pretty good deal, right? Well, in three to five years' time, surgery probably won't change much. Installing such devices is still going to be pretty risky. And even then, having the hardware in introduces its own medical risks. Our super-programmers can't ever go swimming because of the risk that the fiber-optic and power jack going through their skull gets infected. This is a problem current medical implants have, so probably one we will still have in a couple of years' time.

There are a whole host of other problems with this too. E.g., what do you do when the hardware in your head is obsolete? Can the company that paid to put it in fire you because you aren't keeping up, in part because you have obsolete hardware in your head?


Deaf people today face a similar choice: getting a cochlear implant, you don't control the software (afaik). This is a significant part of why I still have no cochlear implant. If we want freedom of thought in the glorious transhuman future, well, that could evolve out of what's happening today, and we ought to value human autonomy more in today's systems.

DRM is another such arena: as I said in another thread the other day, do you want it enforced by your brain implant? No? Then think about how you want to allow the issue to get framed.

Ditto for compelled decryption.

The question you should ask is: would Feynman have agreed to it?

Because that's the person who will have to put up with it. You can only ask such a question if you're willing to give the future recipient of the gift the option to back out and be sent back to your present state. And if that new individual isn't 'you' in a legal sense you might find that contract null and void anyway.

> Where it gets really strange is if you think of your wife or husband making a backup of their brain, then running a simulator on that backup.

There is a Black Mirror episode around that called 'Be right back'.

>There is a Black Mirror episode around that called 'Be right back'.

Well, they were approximations based on people's social media presence, and the episode's conflict mainly came from the imperfection of the approximation. I think you get a very different set of cultural conflicts if the simulation is of an accurate copy of the mind (issues more like in the book Permutation City).

Makes me think of the original Star Trek, which I have been rewatching recently. In the first episode [0], another race is capable of causing hallucinations and making humans experience anything they desire. Later on you find out that the captain (not Kirk in the very first episode) is found by Kirk and can now only say yes or no through a version of Stephen Hawking's chair. Spock, knowing that his brain is still fully operational, decides out of loyalty to Captain Pike to bring him back to this alien race so he can enjoy living in this fake world and not be in the handicapped state that he is in. The episode closes with Captain Kirk showing wonder at the opportunity provided by this race.

I suspect given the opportunity many would take this. (Without knowing any of the consequences, I tentatively think I would take it in that situation.)



The Gentle Seduction is a short story that you might find interesting.

What troubles me is not the permission to read, but to write. While potentially useful for a host of medical applications, this is a scientifically plausible proposal to create technology that will literally have the ability to control minds. While you might be comfortable with sharing intellectual property with a business entity, giving that same entity the ability to adjust internal motivation, or alter sensory input is a dangerous Faustian bargain. How much would you trade for free will?
> What troubles me is not the permission to read, but to write.

Argh, I can just imagine having extremely realistic advertisements beamed directly into my mind, in exchange for something like immediate translation.

It's gonna be really weird when those perfume ads start before Christmas.

> If you were offered the chance to become as intelligent as Feynman but only if you turn over a copy of all your future thoughts to the private business that made this possible, would you take it? Something non-invasive, like a computer that images your brain while you sleep.

No. If you were offered wings so you could fly, but in exchange you must spend rest of your life in a cage, would you take it?

Would it then become important for that business to give these minds an environment that would not make them feel trapped and caged? Wouldn't the quality of results be hampered if lots of suffering were to occur? Could the mind at some point become useless, since it would be drowned in depressive, cyclical, and often illogical thought? That, too, would create a research point, but it would be pretty inhumane, and I would hope an ethics oversight panel would not allow such states to exist.

So many questions surround this... If the body is dead, but the brain is still capable of reasonable thought and, to that end, communication, who decides its fate? Does a mind deteriorate once there is no outside stimulation of the senses?

Not that I would agree to be in this state, but man there would be huge fields of research to do.

Would it then become important for that business to give these minds an environment that would not make them feel trapped and caged?

Sounds like a more reasonable basis for the genesis of the Matrix than what was portrayed in the movie. :-)

There's a Black Mirror episode about this, but saying which episode it is would be a big spoiler. Just watch the whole series if you haven't.
Honestly it would depend on the size of the cage.
2x2x2 meter cage.
2x2x2 AU would still be too small.
It further depends on the amenities on offer, and the bird's size.

Hell, folks (including me) want to live on a Mars base, where "living in a cage" is probably an apt description of life for at least the first couple generations.

I'd sure as hell hand over a copy of my brain to a corporation in exchange for the ability to back up my brain and functionally live forever.

You missed the part I was replying to; when I replied, it was the only content of the parent post. My analogy is about a deal where you as a human get wings (the ability to fly), but have to spend the rest of your life in a cage.
A more interesting question is, what if you wake up one day as one of 10 copies of yourself?

Would you today be worried about preserving the life of all 10 clones (with memories)?

We are all afraid to die - but what if tomorrow there were clones with the memories you have today?

The question basically is posed here

The movie The Prestige deals with the concept of self copies too.

From the moment those copies exist, I'm not them and they're not me. We do not share subjective experiences. If you think we're equivalent and expendable, you'd best murder me in secret and pretend it never happened. You know, like incinerating me inside a teleporter or someth... oh crap, we're going there.

Yeah, I have a few issues with that comic.

> The man was not a murderer.

This is one of a few logical leaps I can't accept. His past selves are already "dead". His future selves do not yet exist. If, for example, all my ancestors are dead and my offspring not yet conceived, how does my suicide harm them in any way? None will so much as feel the grief of losing me, but I'm supposed to feel guilty over "murdering" them? (Let's not follow up on what this would mean for topics such as abortion!)

If future and past selves are of such importance that we not murder them, how can present copies be valued any less? If destroying the original is not necessary, then that must surely be murder as well. It's worse, actually, because then it's not even hypothetical; "The Machine" would be intentionally and unnecessarily ending a life.

Why do I possess the subjective experience of me, not you, and not some other me separated by time or space? Barring notions of a soul, either those other instances are not real in some sense or else there's something phenomenologically special about each, which I can only attribute to being uniquely present at a particular point in space-time. If the teleported me is a clone, their existence ought to have nothing to do with me, thus I should not be murdered for their sake. However, if the teleported me instead can only exist as a function of my past-self ceasing to be, then that's little different than how I am from any moment to the next.

TL;DR: If it works like the movie version of "The Prestige" where one copy is murdered purely for convenience, I'm not stepping into that teleporter.

I really wonder how valuable augmented intelligence would be when AI will be so much smarter.

If it ends up like physical strength today, maybe most people won't bother with it too much, and will instead focus on other things, maybe things more core to feeling well.

One of those things could be the experiences meditation practices talk about, enlightenment etc., and it seems we won't need neuron-level sensing for that, but something more at the affordable fMRI level, or maybe EEG plus source localization, and those seem likely to arrive much before intelligence augmentation.

So I wonder if we'll ever see intelligence augmentation.

We basically already have intelligence augmentation; it's just in the form of computers and smartphones.

Interestingly, despite widespread adoption of both, there's still some people "smarter" (i.e. better at certain tasks) than others because they can better apply the tools available.

All extrapolated technological symbolism that, when experienced as reality, will fail to stand up to the actual non-symbolic experience of existence.
what happens when someone copies a simulation of you and tortures it digitally to get all your passwords and information?
Just make sure not to forget a very important phrase: ignorance is bliss.
This this this... being intelligent will not make you happy.

I wouldn't take hyperintelligence for free.

People already essentially turn their brains over to businesses today, without such grand promises of intelligence or immortality.

It's actually maybe even a pretty accurate description of religion

Religion does make a grand promise of immortality.
which is what I said
To be fair, only some do. Others just assume endless reincarnation is a thing and promise escape. Whether you see that as real distinction is something else entirely.
yes I would download all the knowledge I could into offline storage and only connect for updates.
Reminds me too much of Black Mirror
Somebody needs to figure out why these people keep thinking a clone of you is you. It keeps happening over and over again. What kind of primate shit is this that won't fucking go away.
I think we should really fix society before we develop that kind of technology. We need a socio-cultural breakthrough of the same magnitude as the technological breakthrough. Something on par with the development of monotheism, enlightenment, capitalism, or democracy. A "communistic" utopia (for lack of a better word) without the flaws of its historical implementations. Then the question is moot. There is no private business, or indeed any other entity, government or private, that you would make such a deal with. There is no second party that wants to withhold technology, or steal your thoughts. Rather it will be "From each according to his ability, to each according to his needs" - you get the augmentation for free, and you contribute what you feel comfortable sharing back to society.

Often, utopia is portrayed as the result of a technological breakthrough, sometimes as "if we could read each other's thoughts there would be no misunderstanding and conflict". I believe it is the other way around: utopia is a precondition for singularity-type technology. At least, if we want it to be beneficial and not hellish.

Brain implants are easy, communism is going to be hard.

The other, and much more likely in my opinion, future is that developing this kind of technology will cause us to "fix" our society, or rather society will grow to support the new technology. It is unlikely that private business will relinquish its control willingly; we must aim for a future that makes private business irrelevant (massive automation removing the need for humans to "pay" for anything)
I'm curious why you worry about control by business, but not control by the government?
I don't really see those as different entities. Governments are just businesses with a different name. I pay them for a service.
My government, in theory, derives every last bit of its power from my consent and the consent of my peers; and insofar as practice diverges from that, it can be corrected or at least discussed. Even foreign governments at least have that relation with someone, even if they don't have it with me. With private enterprises, there isn't even anything to correct or discuss from their perspective. They're "pure tyrannies", as Chomsky put it, while governments are theoretically democratic. The magic free market turning selfish bastards fighting each other instead of entropy into some useful force doesn't seem to be enough just by itself.

Though now that I think of it, if people got their shit together in the ways required to kick their own governments in the dick where required, that would also translate to paying attention to how we vote with our wallets (it would also mean any corp trying just one line of cutesy marketing-speak would sink to the bottom before you can say iceberg, which would improve the world by orders of magnitude within one week), so in the end maybe "capitalism or communism", or anything of that nature, matters much less than the individuals do... after all you can't polish turds, and you don't need to array gold nicely for it to shine.

The rookie cop his first day on the job has more power over you than Bill Gates does.
When someone misses a train because of some forced Windows update, they just call the police and they force Bill Gates to fix that right up.

How about this instead, just imagine I'm so dumb that you have to spell your point out fully. I don't get what you're trying to say, what rookie cop? A specific one, or "the" rookie cop, every single one? And who is Bill Gates? If it's not worth spelling it out for you it's not worth trying to speculate for me.

After I pointed out I don't believe in the dichotomy, the reply is something that apparently is supposed to be an argument for it. But you don't have a fixed amount of resources you spend on either being afraid of the government or corporations, and any amount of reclaiming your dignity as a critically thinking and responsibly acting human person will help with both "threats".

To not even acknowledge that, and go "but government", or "but corporations", really means at this point I need you to rephrase my whole comment in your own words to believe you did more than scan it. (That goes for the person downvoting me asking you to elaborate, too. If it's soooo obvious to you, make with the goods.)

> communism is going to be hard.

Not if everyone has brain implants!

Touché. That's why I'm not getting a brain implant :-D
"if we could read each others thoughts there would be no misunderstanding and conflict"

For that to be true you have to assume that everyone has enough empathy to come to agreement. It isn't really much different from the way things are now (without being able to read each other's thoughts).

You've got it backwards. Our brains are what is holding society back.

We need a socio-cultural breakthrough of the same magnitude as the technological breakthrough

That's what transhumanism is. It's a restructuring of the human system around symbiosis with AI.

We are somewhat offtopic, but: That sounds a bit like the church in the middle ages saying: "your sins are what is holding society back, that is why you can't have nice things". Or like many utopians, who wanted to create a "better man". Or their detractors, who said utopia doesn't work because people are flawed.

I don't want to change people, I want to change society.

I don't want to change people, I want to change society.

Somehow you think those are different things and I'm not sure why. Society only changes when enough people individually change. You can force change through government or coercion, but it's temporary and unsustainable.

utopia doesn't work because people are flawed.

Right, I'd agree with that.

What I mean is, I don't want Utopia for some kind of Adornoian "liberated human" or socialist übermensch, I want a Utopia in which I myself can live. Despite my flawed socialisation and biology. I think we can refactor society, instead of reforming or revolutionizing it. Keep the same people, don't coerce them into anything. Don't wait for future technological improvements, but reap the benefits of centuries of previous improvements. We have been technically able to satisfy the needs of everybody from a purely material point of view for decades now. Now we need to fix the allocation of these goods, and the crisis of political representation (or the lack thereof for many people).
I personally think that is impossible. It would be like training a rat to sing opera. We don't fundamentally have the hardware that would allow it.

We have been technically able to satisfy the needs of everybody from a purely material point of view for decades now.

"Earth provides enough to satisfy every man's need but not every man's greed" - Gandhi

"It's a restructuring of the human system around symbiosis with AI".

This sounds more like: to have a socio-cultural breakthrough, you want a technological entity (some AI, or whatever it turns out to be in the future) to control and direct human socio-cultural thought and execution. Transhumanism seems to be more oriented towards equipping humans to be interfaced (through some HCI, human-computer interface) with computers built around an AI, and other human-built accessories (at least for now), to enhance their mental abilities and project them as more powerful humans. One question is, how much do we actually know about the power of our brain and its abilities when properly used, rather than enhanced with computers etc.? Yes, there are great use cases for HCI. But giving control to an AI to make your decisions or control humans in the name of technological breakthrough is absurd.

you want a technological entity (some AI, or whatever it turns out to be in the future) to control and direct human socio-cultural thought and action.

Yes exactly!

Transhumanism seems more oriented towards equipping humans to interface (through some HCI, human-computer interface) with computers built around an AI, plus other human-built accessories (at least for now), to enhance their mental abilities and make them more powerful humans.

It's currently sold that way, yes, but I don't see it that way. I envision it more like what I've been told and read about the "Borg" from Star Trek (admittedly, I've never seen it).

But giving control to an AI to make your decisions, or to control humans in the name of technological breakthroughs, is absurd.

As absurd now as walking around on the moon was in the 1st Century.

"As absurd now as walking around on the moon was in the 1st Century."

Yes, true (I wonder why we don't shuttle down to the moon regularly, since we landed there half a century ago and never went back). It is going to be absurd until we have an AI that is as capable (or even half as capable) as a human, and we can debug that AI, or maybe that AI debugs itself reflexively. Until then, I would say it's absurd to think that an AI could take control of a human and guide the human in its actions. Until then, I will keep my brain intact and keep away from any AI implants (or any kind of implant altogether).

I don't think we need superhuman AGI to start having AI guide human actions. We already let AI guide us in daily life all the time: navigation, purchasing recommendations, etc.

It's not a question of IF we will have AI direct our behavior; it's just a question of scale.

I understand; we use AI in one form or another in our daily lives, but for very specific, specialized, narrow cases. As you mentioned, it's that scale that matters, which our brain handles seamlessly; we, the generation building AI, are nowhere near performing at an AI's scale except on very specialized tasks. It will naturally improve over time, but AI attaining AGI status, and maybe taking control over socio-economic or socio-cultural change, is a long shot, unless we humans destroy ourselves first by destroying life on earth.
Mass starvation and large-scale 'cleansings' will probably be a hard sell if the only upside is "Everyone can read your mind".
i don't think any technology should be put on hold because society isn't perfect especially when said technology has the potential to improve society immensely.
Communism has nothing to do with the technological singularity. Saying that we have to reach communist utopia before we can have the Singularity is about as useful as saying that we have to usher in a global Caliphate or wait for Jesus to return first.
I agree. Once we develop the ability to upload and modify brains, it will be easy to morph them. We can extract the primal desire to defend yourself, to become angry, to want to own property. These emotions will be reserved for the founders.

It will take time to get there. Many will resist, so we have to keep our goal a secret. But people are gullible. With our morphogenic brain technology, we can give people experiences they've only dreamed of. Think of it. The high of heroin, with none of the downsides. The ability to know instantly when any of your loved ones are in distress. The ability to shape yourself and your children into any form you desire.

If we appeal to base emotions, if we appeal to their need to control, to own, to shape, then we can implement our plan.

> Many will resist, so we have to keep our goal a secret.

This is not democracy. Every dictatorship begins with "I know what's good for you better than you do."

You should watch the movie "Equilibrium" with Christian Bale. I don't think you really want that; it only sounds good in theory.

Well, it was firmly tongue in cheek. :) I've been thinking of trying to write some short stories pulling from various themes in technology. Most ideas have been done to death, but with a bit of skill it might be possible to write something worth reading.

A tangent, but: I've been wondering how a novelist builds their skill. They're not born with it. One idea would be to watch a movie and write it like a book, transcribing it scene for scene. I'm not sure whether that'd help, though.

Re tangent: Stephen King's "On Writing" or Ray Bradbury's "Zen in the Art of Writing".

Write 1000 words a day, be crap for decades, and eventually you get skilled.

No ideas about the second, but an aside on the first.

'Most ideas have been done to death' is a central reason I have mostly stopped reading novels. Ah, it's one of those; he will do this. Ah yes, I see the twist; it's one of those.

I highly empathize with this. These days I'm just interested in imagination. I don't care if a book or novel or movie is good. I will most likely never consume it. I'm looking up synopses and scanning them for innovative ideas.
We are the Borg; we want to turn your head into Swedish furniture-futures.
I hope you're being 100% sarcastic.

If you're not, I must ask: is your name Andrew Ryan perchance? And I'll add this text just to say this is very relevant to what you said and not just a drive-by spam drop.
Mar 13, 2017 · pdkl95 on Update: CRISPR
While originally written with a focus on "intellectual property" and predatory EULAs, Tom Scott's terrifyingly-plausible Welcome to Life: the singularity, ruined by lawyers[1] becomes even more frightening when adapted to your "born-into-debt" scenario.


Feb 09, 2017 · 3 points, 0 comments · submitted by obi1kenobi
Tom Scott did an excellent Welcome to the (After)life video:
So, my concern about uploading my brain, is whose cloud service do you trust to run your consciousness? Google? Microsoft? Amazon? Facebook? Apple?

The level of trust that I'd require is pretty high. You could imagine it once again being relevant to check the pedigree of a company. Ideally there'd be a cloud provider already running now that is known for its trustworthiness. 'established 2013' might actually mean something one day.

My main hope is for indistinguishability obfuscation to reach the level where I wouldn't really have to trust the provider, but as long as you pay a significant performance penalty, you're going to end up massively disadvantaged.

Jun 22, 2015 · 1 point, 0 comments · submitted by minionslave
Apr 08, 2015 · spiritplumber on It's not 1999
A lot of people who post on my facebook wall seem to think that it's the beginning of the Singularity. I hope they're wrong if it looks like this.

Have you explained what poor suckers they are?
No, why would I?
Mar 20, 2015 · BuildTheRobots on Death Redesigned
Slightly off topic, but I find it impossible to digest the phrase "Death Redesigned" without thinking of Tom Scott's Singularity:
Of course, the question of whom to trust as the cloud provider running your consciousness-as-a-service process then becomes a very serious one.

The main problem is the lawyers.

Thanks for that link.

If there's one service I don't want to be freemium....

Though, we might also wind up in the scenario depicted in the satirical three-minute short, "Welcome to Life".

May 24, 2012 · 1 point, 0 comments · submitted by JVIDEL
May 11, 2012 · 9 points, 2 comments · submitted by llambda
Most terrifying thing I've seen all year.
This is the probable outcome, unless we do something differently and better. This is why open source matters. This is why I understand someone using the latest GPL. Getting your product out to as many users as possible may be important, but preventing something like this, to me, will always take priority.
HN Theater is an independent project and is not operated by Y Combinator or any of the video hosting platforms linked to on this site.