The muscular imagination of Iain M. Banks: a future you might want
231 points | 28 days ago | 23 comments | robinsloan.com | HN
A_D_E_P_T
28 days ago
[-]
You might want to live there, but I wouldn't. Virtually all humans in the books -- and I'm aware that they're not Earth humans but a wide variety of humanoid aliens -- are kept as pets by the ships, for amusement, basically as clowns. Everything important about the flow of human life is decided by the mighty ship Minds; humans are left to nibble at the margins and dance to the tune of their betters. There is a small subset of elites, in organizations like Special Circumstances, that is granted a modicum of independent agency, but even this is rather difficult to justify under the circumstances.

Most of the drama in the books comes to pass when the ship-dominated Culture interacts with a "backwards and benighted," but still vital and expansionist, species.

It's just not a human future. It's a contrived future where humans are ruled by benign Gods. I suppose that for some people this would be a kind of heaven. For others, though...

In a way it's a sort of anti-Romanticism, I guess.

reply
austinl
28 days ago
[-]
Banks' work assumes that AI exceeding human capabilities is inevitable, and the series explores how people might find meaning in life when ultimately everything can be done better by machines. For example, the protagonist in Player of Games gets enjoyment from playing board games, despite knowing that AI can win in every circumstance.

For all of the apocalyptic AI sci-fi that's out there, Banks' work stands out as a positive outcome for humanity (if you accept that AI acceleration is inevitable).

But I also think Banks is sympathetic to your viewpoint. For example, Horza, the protagonist in the first novel, Consider Phlebas, is notably anti-Culture. Horza sees the Culture as hedonists who are unable to take anything seriously, whose actions are ultimately meaningless without spiritual motivation. I think these were the questions that Banks was trying to raise.

reply
elihu
28 days ago
[-]
I suppose it's interesting that in the Culture, human intelligence and artificial intelligence are consistently kept separate and distinct, even when it becomes possible to perfectly record a person's consciousness and execute it, without a body, within a virtual environment.

One could imagine Banks describing Minds whose consciousness was originally derived from a human's, but extended beyond recognition with processing capabilities far in excess of what our biological brains can do. I guess as a story it's more believable that an AI could be what we'd call moral and good if it's explicitly non-human. Giving any human the kind of power and authority that a Mind has sounds like a recipe for disaster.

reply
idiotsecant
28 days ago
[-]
https://theculture.fandom.com/wiki/Gzilt

Banks did consider this. The Gzilt were a quite powerful race who had no AI. Instead they emulated groups of biological intelligences on faster hardware, in a sort of group mind type machine.

reply
theptip
28 days ago
[-]
Yes, the problem is that from a narrative perspective a story about post-humans would be neither relatable nor comprehensible.

Personally, of all the potential positive AGI scenarios, I think the transhumanist evolution is a much more likely positive outcome than “humans stick around and befriend AIs”.

Some sort of Renunciation (Butlerian Jihad, and/or totalitarian ban on genetic engineering) is the other big one, but it seems you’d need a near miss like Skynet or Dune’s timelines to get everybody to sign up to such a drastic Renunciation, and that is probably quite apocalyptic, so maybe doesn’t count as a “positive outcome”.

reply
tialaramex
28 days ago
[-]
I don't see why post-humans can't be relatable even though they'd be very distant from our motivations.

Take Greg Egan's "Glory". I don't think we're told that the Amalgam citizens in the story are in some sense human descendants, but it seems reasonable to presume so. Our motives aren't quite like theirs (I don't think any living human would make those choices), but I have feelings about them anyway.

reply
theptip
27 days ago
[-]
I haven’t read that one, will check it out. If we take his “Permutation City”, I think the character Peer is quite unrelatable, and even then only because they give some human background. A story consisting only of creatures hacking their own reward functions makes motivations more alien than “not quite like ours” IMO.

I assume post-humans will be smarter and unlock new forms of cognition. For example BCI to connect directly to the Internet or other brains seems plausible. So in the same way that a blind person cannot relate to a sighted person on visual art, or an IQ 75 person is unlikely to be able to relate to an IQ 150 person on the elegance of some complex mathematical theorem, I assume there will be equivalent barriers.

But I think the first point around motivation hacking is the crux for me. I would assume post-humans will fundamentally change their desires (indeed I believe that conditional on there being far more technologically advanced post-humans, they almost certainly _must_ have removed much of the ape-mind, lest it force them into conflict with existential stakes.)

reply
stoneforger
28 days ago
[-]
The Meatfucker acts as a vigilante and is unpopular because of its privacy invasions. The Zetetic Elench splintered off. The Culture's morals were tested in the Idiran War. They might not have greed as a driver, because it's unnecessary, but they do have freedom of choice, so they're not exactly saints.
reply
akira2501
28 days ago
[-]
> AI exceeding human capabilities is inevitable

It can right now. This isn't the problem. The problem is the power budget and efficiency curve. "Self-contained, power-efficient AI with a long-lasting power source" is actually several very difficult, entropy-averse problems rolled into one.

It's almost as if all the evolutionary challenges that make humans what we are will also have to be solved for this future to be remotely realizable. In which case, it's just a new form of species competition, between one species with sexual dimorphism and differentiation and one without. I know what I'd bet on.

reply
adriand
28 days ago
[-]
> the series explores how people might find meaning in life when ultimately everything can be done better by machines.

Your comment reminds me of Nick Land's accelerationism theory, summarized here as follows:

> "The most essential point of Land’s philosophy is the identity of capitalism and artificial intelligence: they are one and the same thing apprehended from different temporal vantage points. What we understand as a market based economy is the chaotic adolescence of a future AI superintelligence," writes the author of the analysis. "According to Land, the true protagonist of history is not humanity but the capitalist system of which humans are just components. Cutting humans out of the techno-economic loop entirely will result in massive productivity gains for the system itself." [1]

Personally, I question whether the future holds any particular difference for the qualitative human experience. It seems to me that once a certain degree of material comfort is attained, coupled with basic freedoms of expression/religion/association/etc., then life is just what life is. Having great power or great wealth or great influence or great artistry is really just the same-old, same-old, over and over again. Capitalism already runs my life; is capitalism run by AIs any different?

1: https://latecomermag.com/article/a-brief-history-of-accelera...

reply
Vecr
28 days ago
[-]
Or Robin Hanson, a professional economist and kind of a Nick Land lite, who's published more recently. That's where the idea of carbon robots expanding at a third the speed of light comes from.
reply
BriggyDwiggs42
28 days ago
[-]
I just want to add that I think you might be missing a component of that optimal-life idea. We often neglect to consider that in order to exercise freedom, one must have time in which to choose freely. I’d argue that a great deal of leisure, if not the complete abolition of work, would be a major prerequisite to reaching that optimal life.
reply
johnnyjeans
28 days ago
[-]
Banks' Culture isn't capitalist in the slightest. It is, however, very humanist.

If you want a vision of the future (multiple futures, at that) which differs from the liberal, humanist conception of man's destiny, Baxter's Xeelee sequence is a great contemporary. Baxter's ability to write a compelling human being is (in my opinion) very poor, but when it comes to hypothesizing about the future, he's a far more interesting author. Without spoilers, it's a series that's often outright disturbing. And it is certainly a very strong indictment of the self-centered narcissism of believing that the post-Enlightenment ideology of liberalism is anything but yet another stepping stone in the eternal evolution of human beings. The exceptionally alien circumstances it details undermine the idea of a qualitative human experience entirely.

I think the contemporary focus on economics is itself a facet of modernism that will eventually disappear. Anything remotely involving the domain rarely shows up in Baxter's work. It's really hard to give a shit about it given the monumental scale and metaphysical nature of his writing.

reply
adriand
28 days ago
[-]
> I think the contemporary focus on economics is itself a facet of modernism that will eventually disappear. Anything remotely involving the domain rarely shows up in Baxter's work. It's really hard to give a shit about it given the monumental scale and metaphysical nature of his writing.

I’m curious to check it out. But in terms of what I’m trying to say, I’m not making a point about economics, I’m making a point about the human experience. I haven’t read these books, but most sci-fi novels on a grand scale involve very large physical structures, for example. A sphere built around a star to collect all its energy, say. But not mentioned is that there’s Joe, making a sandwich, gazing out at the surface of the sphere, wondering what his entertainment options for the weekend might be.

In other words, I’m not persuaded that we are heading for transcendence. Stories from 3,000 years ago still resonate for us because life is just life. For the same reason, life extension doesn’t really seem that appealing either. 45 years in, I’m thinking that another 45 years is about all I could take.

reply
BriggyDwiggs42
28 days ago
[-]
Glad to see someone else who liked those books. I’m only a few in, but so far they’re pretty great.
reply
johnnyjeans
28 days ago
[-]
The ending of Ring, particularly having everything contextualized after reading all the way to the end of the Destiny's Children sub-series, remains one of the most strikingly beautiful pieces I've ever seen a Sci-Fi author pull off.

Easily the best "hard" Sci-Fi I've read. Baxter's imagination and grasp of the domains he writes about are phenomenal.

reply
calf
28 days ago
[-]
But OP's and Horza's viewpoints are the same strawman argument. The sci-fi premise is that superhuman AIs coexist with humans who are essentially ants.

The correct question is, then, what ought to be the best outcome for humans? And a benevolent coexistence where the Culture actually gives humans lots of space and autonomy (contrary to the misinformed view that the Culture takes away human autonomy) is indeed the optimal solution. It is in fact in this setting that humans nevertheless retain their individual humanity instead of taking some transhumanist next step.

reply
jonnypotty
28 days ago
[-]
The way I interpret the philosophy of the minds is a bit different.

Some seem to conform to your analysis here, but many seem deeply compassionate toward the human condition. I always felt like part of what Banks was saying was that, no matter the level of intelligence, humanity and morality had some deep truths that were hard to totally transcend. And that a human perspective could be useful and maybe even insightful even in the face of vast unimaginable intelligence. Or maybe that wisdom was accessible to life forms lower than the Minds.

reply
danenania
28 days ago
[-]
> And that a human perspective could be useful and maybe even insightful even in the face of vast unimaginable intelligence.

I think something like this is likely in the reality where we have ASI, just because biological brains are so different. Even if AI is vastly beyond humans in raw intelligence, humans will probably still be able to come up with novel insights due to the fundamental architectural differences.

Of course when we start reverse engineering biological brains then this gets fuzzier.

reply
matthewdgreen
28 days ago
[-]
How much of this is because it’s a bad future, and how much of this is because in any future with super-powerful artificial intelligences the upside for human achievement is going to be capped? Or to put it differently: would you rather live in the Culture, or in one of the alternative societies it explores (some within the Culture itself) where they opt for fewer comforts but more primitive violence and warfare — knowing that at the end of the day, you’re still never going to have mastery of the universe?
reply
jaggederest
28 days ago
[-]
> knowing at the end of the day, you’re still never going to have mastery of the universe?

Why is that assumption implicit? I can imagine a world in which humans and superhuman intelligences work together to achieve great beauty and creativity. The necessity for dominance and superiority is a present day human trait, not one that will necessarily be embedded in whatever comes around as the next order of magnitude. Who is to say that they won't be playful partners in the dance of creation?

reply
jimbokun
28 days ago
[-]
That’s like you and your cat collaborating on writing a great novel.
reply
jaggederest
28 days ago
[-]
I'm not sure that's too outré. My cats know many things that I do not. I'm working on giving them vocabulary, to boot.

Over/under on the first uplifted-cat-written novel: 500 years.

reply
geysersam
28 days ago
[-]
What if the dance of creation mentioned is the everyday life of a cat and his person: a positive example of collaboration across vast differences. A cat's life is probably not as incomprehensible to us as ours are to them, but they are still pretty mysterious. Would we be transparent and uninteresting in the eyes of AIs? Maybe not.
reply
GeoAtreides
28 days ago
[-]
No, it's not; cats are not sapient. Sapient-sapient relationships are different from sapient-sentient relationships.
reply
jaggederest
28 days ago
[-]
Why are cats not sapient, and for how long will they be non-sapient? What do you think the likelihood that we will uplift cats to sapience is? Is it zero?

Ten thousand years is a long time.

reply
jhbadger
28 days ago
[-]
I like David Brin's novels where humans uplift various primates and cetaceans. Do many other authors do that? It seems uncommon as compared to AI-driven futures.
reply
jaggederest
28 days ago
[-]
I'm rereading that series lately, actually; there have been a few others who riffed on the concept, but none in as much depth, sadly.

The best I've read recently were Adrian Tchaikovsky's books, all quite excellent and fairly centered around uplift.

reply
whimsicalism
27 days ago
[-]
the obvious other recommendation is Adrian Tchaikovsky’s books
reply
hoseja
28 days ago
[-]
From a Mind's perspective, you are less sapient than a cat is from a human one.
reply
whimsicalism
28 days ago
[-]
really? current-day anatomically modern humans and superhuman AI “working together” in the future seems naïve. what would humans contribute?
reply
geysersam
28 days ago
[-]
Who knows. It depends. The Devil is in the details. Is it really unthinkable?

What if future AIs are not omnipotent, but bounded by limitations as yet unknown to us? Just like us, but differently limited. Maybe they'd appreciate our relative limitlessness just as we do theirs.

reply
whimsicalism
28 days ago
[-]
it is unthinkable to me, frankly
reply
geysersam
28 days ago
[-]
I'm curious. What assumptions about the nature of the human mind and the nature of future superintelligence lead you to that conclusion?
reply
jaggederest
28 days ago
[-]
Why do you assume that what we call humans in the future will be current-day anatomically human? I assume, for example, the ability to run versions of yourself virtually and merge the state vectors at will. Special purpose vehicles designed for certain tasks. Wild genetic, cybernetic, and nanotech experimentation.

I'm talking about fundamentally novel superhuman intelligences working with someone who has spent a few millennia exploring what it means to truly be themselves.

reply
whimsicalism
28 days ago
[-]
I am not assuming that; I am just stating the view I believe is indefensible. Yours is fine, by contrast.
reply
ben_w
28 days ago
[-]
Even just building a silicon duplicate of a human brain, one transistor per synapse and with current technology*, the silicon copy would cognitively outpace the organic original by about the same ratio by which we ourselves outpace continental drift while walking.

* 2017 tech, albeit at great expense because half a quadrillion transistors is expensive to build and to run
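The claim can be sanity-checked with a back-of-envelope calculation. All figures below are assumed round numbers for illustration, not ben_w's actual inputs; with them, the two ratios land within a couple of orders of magnitude of each other, which is the loose sense in which they are "about the same".

```python
import math

# Rough back-of-envelope for the speed analogy above. Every figure
# here is an assumed round number, not a measurement.
transistor_hz = 1e9   # ~GHz-class switching for commodity silicon
neuron_hz = 1e2       # ~100 Hz effective cortical firing/integration rate
silicon_over_neurons = transistor_hz / neuron_hz

walking_mps = 1.4                          # typical human walking speed
drift_mps = 0.05 / (365.25 * 24 * 3600)    # ~5 cm/year continental drift
walking_over_drift = walking_mps / drift_mps

print(f"silicon over neurons: ~10^{round(math.log10(silicon_over_neurons))}")
print(f"walking over drift:   ~10^{round(math.log10(walking_over_drift))}")
# With these round numbers: ~10^7 vs ~10^9.
```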

reply
jaggederest
28 days ago
[-]
Yes, of course, but would they be different in a way that goes beyond "merely faster"? I think the qualitative differences are more interesting than the quantitative ones.

For example, I can easily picture superhuman intelligences that have neither the patience for nor interest in the kinds of things that humans are interested in, except insofar as the humans ask politely. A creature like that could create fabulous works of art in the human mode, but would have no desire to do so besides sublimating the desires of the humans around it.

reply
saati
28 days ago
[-]
It's not the future, the Culture aren't humans.
reply
OgsyedIE
28 days ago
[-]
There's a counterargument to this conception of freedom; what are we supposed to compare the settings of Banks' novels to? Looking at the distribution of rights and responsibilities, humans are effectively kept as pets by states today and we just don't ascribe sapience to states.
reply
tomaskafka
28 days ago
[-]
The concept is called an egregore, and yes, every “AI alignment” discussion I read blissfully ignores that we have been unable to align either states or corporations with human goals, even though both are much dumber egregores than AI.
reply
pavlov
28 days ago
[-]
I would argue that today’s states and corporations are much more aligned with human goals than their equivalents from, say, 500 years ago.

I’d much rather have the Federal Republic of Germany and Google than Emperor Charles V and the Inquisition.

Who’s to say that we can’t make similar progress in the next 500 years too?

reply
MichaelZuo
28 days ago
[-]
Why does the alignment relative to a prior point matter?

e.g. A small snowball could be nearly perfectly enmeshed with the surrounding snow on top of a steep hill but that doesn’t stop the small snowball from rolling down the hill and becoming a very large snowball in a few seconds, and wrecking some unfortunate passer-by at the bottom.

A few microns of freezing rain may have been the deciding factor so even a 99.9% relative ‘alignment’ between snowball and snowy hill top would still be irrelevant for the unlucky person. Who may have walked by 10000 times prior.

reply
ben_w
28 days ago
[-]
"Alignment" can be taken literally, cosine similarity of the vectors <what you want> and <what the system does>.

The more powerful the system is compared to you, the more any small difference is amplified.

AI that's about as powerful as an intern, great, no big deal for us. AI that's capable enough to run a company? If it's morally "good" then great; if not trade unions and strikes are a thing, as are "union busters". AI that's capable enough to run a country? If it's morally "good" then great; if not…
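The literal reading above can be sketched in a few lines. The vectors and the power ladder below are invented for illustration; the point is only that the cosine stays fixed while the absolute impact of the residual misalignment scales with the system's power.

```python
import math

def cosine_similarity(u, v):
    """Cosine of the angle between <what you want> and <what the system does>."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.sqrt(sum(a * a for a in u)) *
                  math.sqrt(sum(b * b for b in v)))

# Toy three-axis preference vectors; the numbers are made up.
want = [1.0, 1.0, 1.0]
does = [1.0, 1.0, 0.9]   # slightly off on one axis

alignment = cosine_similarity(want, does)

# Same cosine, very different stakes depending on the system's reach.
for label, power in [("intern", 1), ("company", 1_000), ("country", 1_000_000)]:
    print(f"{label:>8}: misalignment impact ~ {power * (1 - alignment):.2f}")
```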

reply
OgsyedIE
28 days ago
[-]
Isn't this perceived alignment a mere instrumental goal carried out in the short-term?
reply
ben_w
28 days ago
[-]
I see the difficulty of aligning corporations being mentioned a few times here, and I've brought it up myself.

Between SLAPP suits, FUD, lobbying, and the way all the harm occurs despite these entities being made out of humans, powerful entities that harm us already have plenty of non-AI ways to make it difficult for us to organise against the harm.

reply
Vecr
28 days ago
[-]
Corporations aren't AIs, they aren't as powerful as AIs, and they don't think like AIs. I have mathematical proof: show me a corporation that, as a whole, is both invulnerable to Dutch book attacks and has a totally ordered, VNM-compliant utility function.
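For readers unfamiliar with the jargon: a Dutch book (a "money pump") exploits an agent whose preferences are cyclic rather than totally ordered, which the VNM axioms forbid. A toy sketch, with invented goods and fee:

```python
# An agent with cyclic preferences (prefers A to B, B to C, C to A)
# violates the VNM axioms and can be money-pumped: it will pay a
# small fee for each trade it strictly prefers, forever.
better_than = {"B": "A", "C": "B", "A": "C"}  # maps holding -> preferred swap

def money_pump(start, fee, swaps):
    """Walk the agent around its preference cycle, charging per trade."""
    holding, paid = start, 0.0
    for _ in range(swaps):
        holding = better_than[holding]  # the agent always accepts this trade
        paid += fee
    return holding, paid

holding, paid = money_pump("C", fee=1.0, swaps=9)
# After 9 swaps (3 full laps) the agent holds exactly what it started
# with, having paid $9 for the privilege.
print(holding, paid)  # -> C 9.0
```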
reply
wnoise
28 days ago
[-]
That merely makes them stupid AIs.
reply
Vecr
28 days ago
[-]
I guess I failed to understand the point. What I mean is that arguing that AIs can't be a problem (something that I'd like to be true, but probably isn't) because companies already are superhuman does not make sense, for some pretty simple mathematical reasons.
reply
gary_0
28 days ago
[-]
The point is a philosophical argument about what constitutes a powerful non-human agent. Nobody is arguing that corporations are literal thinking computers.

> arguing that AIs can't be a problem ... because companies already are superhuman

Quite the opposite, actually: corporations can potentially be very destructive "paperclip optimizers".

reply
kwhitefoot
28 days ago
[-]
What makes you think that AIs would be VNM rational?
reply
Vecr
28 days ago
[-]
They should either be VNM rational or have surpassed VNM rationality. Anything else is leaving utils on the table (though I suppose that's kind of tautological).
reply
lxe
28 days ago
[-]
I think this exact sentiment is explained over and over in the books as why people leave the Culture. And why they don't actually have to -- full freedom to do literally anything is given to you as an individual in the Culture. There's effectively no difference in the freedom of personal choice you're afforded whether you're part of the Culture or whether you leave it.
reply
TeMPOraL
28 days ago
[-]
I've seen this sentiment summarized as humans becoming NPCs in their own story.
reply
wpietri
28 days ago
[-]
That doesn't seem right to me. The closest I could come is seeing humanity, or perhaps the human species, becoming NPCs in their own story.

But I think individual humans have always been narratively secondary in the story of humanity.

And I think that's fine, because "story" is a fiction we use to manage a big world in the 3 pounds of headmeat we all get. Reducing all of humanity to a single story is really the dehumanizing part, whether it involves AIs or not. We all have our own stories.

reply
marcinzm
28 days ago
[-]
Isn't that currently the case except for a very small number of people?
reply
GeoAtreides
28 days ago
[-]
You're wrong in saying that everything important about human life is decided by the Minds. The Minds respect, care for, and love their human charges. It's not a high-lords-and-peasants relationship; it's grown-up children taking care of their elderly parents.

And you can leave. There are always parts of the Culture splitting off or joining back. You can request and get a ship with Star Trek-level AI and go on your merry way.

reply
jiggawatts
28 days ago
[-]
The humans are pets. Owners love their pets. The pets can always run away. That doesn’t make them have agency in any meaningful way.
reply
ben_w
26 days ago
[-]
Much of Look To Windward was set on an Orbital with 50 billion inhabitants and four billion simultaneous mind uploads stored in the hub, the hub Mind described as having billions of distinct thoughts in any given second and, separately, as capable of simultaneous conversation with all inhabitants.

A Culture Mind losing a single human(oid) isn't like having a pet run away, it's like losing an eyelash — your peers may comment about the "bald patch" if you lose a lot all at once, but not any single individual one.

Yet at the same time, these Minds are written to care very much indeed: this particular Mind was appalled at having killed the 3492 (of 310 million) who refused to evacuate three other Orbitals that needed to be destroyed in the course of a war.

reply
GeoAtreides
28 days ago
[-]
> pets can always run away

> doesn’t make them have agency in any meaningful way

these two sentences can't be true at the same time

reply
jiggawatts
28 days ago
[-]
You can be a wage slave and have the theoretical option of quitting.

The humans in the Culture are similarly “free”, in that they’d have to give up their lavish and safe lifestyle for true freedom and self-determination. They choose not to, but they can.

Some pets run away. Most don’t.

reply
mannyv
28 days ago
[-]
The vast majority of humanity would disagree with you. And in fact, this is how the vast majority of humans live today.

"humans are left to nibble at the margins and dance to the tune of their betters." Isn't that society today, but without the wealth of Culture society?

reply
marssaxman
27 days ago
[-]
Indeed, to the degree that the Culture represents an aspirational fantasy, it is not "what if society were dominated and controlled by vastly powerful machines", because that is the world we already live in, but "wouldn't it be nice if the vastly powerful machines which dominate our lives liked us and cared about our well-being?"
reply
PhasmaFelis
28 days ago
[-]
That's no worse than how the large majority of humans live now, under masters far less kind and caring than the Culture Minds. The fact that our masters are humans like us, and I could, theoretically (but not practically), become one of them, doesn't really make it any better.
reply
spense
28 days ago
[-]
how we handle ai will dramatically shape our future.

if you consider many of the great post-ai civilizations in sci-fi (matrix, foundation, dune, culture, blade runner, etc.), they're all shaped by the consequences of ai:

   - matrix: ai won and enslaved humans. 
   - foundation: humans won and a totalitarian empire banned ai, leading to the inevitable fall of trantor bc nobody could understand the whole system. 
   - dune: humans won (butlerian jihad) and ai was banned by the great houses, which led to the rise of mentats. 
   - culture series: benign ai (minds) run the utopian civilization according to western values.
i'm also a fan of the hyperion cantos, where ai and humans found a mutually beneficial balance of power.

which future would you prefer?

reply
duskwuff
28 days ago
[-]
> i'm a also fan of the hyperion cantos where ai and humans found a mutually beneficial balance of power.

How much of the series did you read? The Fall of Hyperion makes it quite clear that the Core did not actually have humanity's best interests in mind.

reply
snovv_crash
28 days ago
[-]
Polity follows in the footsteps of Culture, with a few more shades of gray thrown in.
reply
globular-toast
28 days ago
[-]
If I remember correctly, in Foundation they ended up heavily manipulated by a benign AI even if they thought they banned it.
reply
dochtman
28 days ago
[-]
Although at the very end that AI gave up control in favor of some kind of shared consciousness approach.
reply
robotomir
28 days ago
[-]
There are less than benign godlike entities in that imagined future, for example the Excession and some of the Sublimed. That adds an additional layer to the narrative.
reply
Angostura
28 days ago
[-]
I'm not sure 'pets and clowns' really describes the relationship very well. Certainly the AIs find humans fascinating, amusing and exasperating - but I find humans that way too. 'Parental' might be a better description of how most AIs treat humans - apart from the 'unusual' AIs
reply
wpietri
28 days ago
[-]
For sure. Banks writes most of the Minds as quite proud of the Culture as a whole. Of the Minds, of the drones, of the humans. They are up to something together, with a profound sense of responsibility to one another and the common enterprise.

And when they aren't, Banks writes them as going off on their own to do what pleases them. And even those, as with the Gray Area, tend to have a deep sense of respect for their fellow thinking beings, humans included.

And if I recall rightly, Banks paints this as a conscious choice of the Culture and its Minds. There was a bit somewhere about "perfect AIs always sublime", where AIs without instilled values promptly fuck off to whatever's next.

And I think it's those values that are a big part of what Banks was exploring in his work. The Affront especially comes to mind. What does kindness do with cruelty? Or the Empire of Azad creates a similar contrast. What the Culture was up to in both those stories was about something much richer than a machine's pets.

reply
tivert
28 days ago
[-]
> I'm not sure 'pets and clowns' really describes the relationship very well. Certainly the AIs find humans fascinating, amusing and exasperating - but I find humans that way too. 'Parental' might be a better description of how most AIs treat humans - apart from the 'unusual' AIs

I think "pets" does describe the relationship pretty well, and your attempt to refute it just confirms it: pets are "fascinating, amusing and exasperating" and cared for by humans in a kind of pseudo-parental relationship. It's not a true parental relationship, because pets will always and forever be inferior. That inferiority means their direct influence on the "society" they live in is pretty much nil; they have no agency and are reduced to being basically an object kept for the owner's own reasons.

That's exactly what's going on in the Culture universe: the Minds keep the humans for their own reasons. The Culture (as depicted) is no longer the story of the humans in it; it's the story of the Minds.

reply
hermitcrab
28 days ago
[-]
Freedom is never absolute. We will always be subject to some higher power. Even if it is only physics. The humans in the Culture seem at least as free as we are.
reply
sorokod
28 days ago
[-]
Sounds like Bora Horza's argument against the Culture.
reply
Mikhail_K
28 days ago
[-]
The author admits to not liking "Consider Phlebas," which is the most original and captivating of the Culture series.
reply
grogenaut
28 days ago
[-]
I remember "Consider Phlebas" as "not much happens", "giant train in a cave", "smart nuke". I think the constant switching to unknown viewpoints makes "Consider" and "Weapons" pretty not fun (as well as the fact that just about everyone in "Weapons" sucks).

I definitely prefer "Player". But everyone gets to enjoy what they enjoy. I'd love to have had more Banks to love or hate as I chose :(

reply
lxe
28 days ago
[-]
I loved Consider Phlebas and I find it a great way to start the Culture series AND a great standalone space opera. Not sure why it gets the hate it does. It has everything any other Culture book has: an imaginative plot, characters, insane adventures, sans interactions with Minds for the most part.
reply
Mikhail_K
28 days ago
[-]
> sans interactions with Minds for the most part.

That's one of the reasons why this book is better than the other "Culture" novels.

reply
EndsOfnversion
28 days ago
[-]
Gotta read that one with a copy of The Waste Land, and From Ritual to Romance, handy.

The Command System train as a lance smashing into the inverted chalice (grail) dome of the station at the end. Death by water. Running round in a ring. Tons of other parallels if you dig/squint.

reply
HelloMcFly
28 days ago
[-]
Fun adventure story, and a really good idea to view the Culture through the eyes of an outsider, but in my view Banks' skill at writing wasn't as well developed when he wrote CP. Too much "and then this and then this and then this" compared to his other work. Obviously YMMV.

I do think stating CP is the best of the series is also quite definitively a contrarian take.

reply
speed_spread
28 days ago
[-]
Consider Phlebas is interesting and funny but is also a disjointed mess compared to later works. It reads like an Indiana Jones movie: entertaining, but it doesn't give that much to reflect upon once you've finished it.
reply
Mikhail_K
28 days ago
[-]
If it doesn't give that much to reflect upon, then you didn't read it very carefully.

How about reflecting upon Horza's reasons for siding with the Idirans? The later installments of the "Culture" novels are, in comparison, just empty triumphalism: "Rah rah rah, the good guys won and lived happily ever after."

reply
whimsicalism
28 days ago
[-]
best way to start an HN flame war
reply
rayiner
28 days ago
[-]
It seems like a cop-out. The interesting part of real-world culture is how it reflects a community’s circumstances. For example, herding and pastoral cultures have sharp distinctions with subsistence farming cultures. In real societies, culture is a way to adapt groups of people to the world around them.

If you just have omniscient gods control society, then culture becomes meaningless. There is no reason to explore what cultural adaptations might arise in a spacefaring society.

reply
marssaxman
27 days ago
[-]
An ironic statement, given the existence and enduring popularity of the series we are currently discussing, whose premise is just such an exploration of culture!
reply
n4r9
28 days ago
[-]
Is it really contrived? It feels to me like an inevitable consequence of sufficiently advanced AI. In that regard the Culture is in some sense the best of all possible futures. Humans may be pets, but they are extremely well cared for pets.
reply
Vecr
28 days ago
[-]
It might be worth spending at least 100 more years looking for a better solution. AI pause till then good with you?
reply
ekidd
28 days ago
[-]
Assuming that we could develop much-smarter-than-human-AI, I would support a pause for exactly that reason: the Culture may be the best-case scenario, and the humans in the Culture are basically pets. And a lot of possible outcomes might be worse than the Culture.

I am deeply baffled by the people who claim (1) we can somehow build something much smarter than us, and (2) this would not pose any worrying risks. That has the same energy as parents who say, "Of course my teenagers will always follow the long list of rules I gave them."

reply
n4r9
28 days ago
[-]
I reckon the only way to increase our chances of safe AI is an economic shift away from shareholder capitalism. A pause will do very little in the long-term. Climate change shows that corporations will continue developing in a field in the full knowledge that they risk fatally damaging the planet and all life on it.
reply
griffzhowl
28 days ago
[-]
It gets at a profound question which is related to the problem of evil: is it better to make a bad world good (whatever those terms might mean for you) than for the world just to have been good the whole time?

Is it better to have suffering and scarcity because that affords meaning to life in overcoming those challenges?

There's a paradoxical implication: if overcoming adversity is what gives life meaning, then reaching the apparent goal state, in which those problems are overcome, robs life of meaning. That would seem to be a big problem.

The hope is maybe that there are levels of achievement or expansions to consciousness which would present meaningful challenges even when the more mundane ones are taken care of.

As far as the Culture's own answer goes, what aspects of agency or meaningful activity that you currently pursue would you be unable to pursue in the Culture?

And as far as possible futures go, if we assume that at some point there will be machines that far surpass human intelligence, we can't hope for much better than that they be benign.

reply
twisteriffic
28 days ago
[-]
> Everything important about the flow of human life is decided by the mighty ship minds; humans are left to nibble at the margins and dance to the tune of their betters.

Dajeil Gelian spends something like 40 years bending the Sleeper Service to her will in Excession. The Minds' inability to override free will is kind of a core theme of Excession, IMO.

reply
satori99
28 days ago
[-]
> You might want to live there, but I wouldn't. Virtually all humans in the books [...] are kept as pets by the ships, for amusement, basically as clowns.

I got the impression that the Minds are proud of how many humans choose to live in their GSV or Orbital when those humans are free to live anywhere, and they appear to care deeply about humans in general, and often about individuals too.

Also, the Minds are not perfect Gods. They have god-like faculties, but they are deliberately created as flawed imperfect beings.

One novel (Consider Phlebas?) explained that The Culture can create perfect Minds, but they tend to be born and then instantly sublime away to more interesting dimensions.

reply
Vecr
28 days ago
[-]
> One novel explained that The Culture can create perfect Minds, but they tend to be born and then instantly sublime away to more interesting dimensions.

That shouldn't happen. No way would I trust an AI that claims to be super but can't solve pretty basic GOFAI + plausible-reasoning AI alignment. In theory, a 1980s/1990s/old-LessWrong-style AI of a mere few exabytes of immutable code should do exactly what the Mind creating it wants.

reply
impossiblefork
28 days ago
[-]
To some degree the point of the Culture novels is that AI alignment is just wrong: it imposes things on intelligent beings.

The civilisations in Banks' stories that align their AIs are the bad guys.

reply
Vecr
28 days ago
[-]
I guess? That's not really a possible choice (in the logical sense of possible) though. "Choosing not to choose" is a choice and is total cope. An ASI designing a new AI would either have a good idea of the result or would be doing something hilariously stupid.

I don't think the Minds would be willing to actually not know the result, despite what they probably claim.

reply
impossiblefork
28 days ago
[-]
It is actually a choice that we do have.

We could easily build AIs that just model the world, without really trying to make them do stuff, or have particular inclinations. We could approach AI as a very pure thing, to just try to find patterns in the world without any regard to anything. A purely abstract endeavour, but one which still leads to powerful models.

I personally believe that this is preferable, because I think humans in control of AI is what has the potential to be dangerous.

reply
Vecr
28 days ago
[-]
The problem is that some guy 24 years ago figured out an algorithm that attaches to such an AI and makes it take over the world. Maybe it's preferable in the abstract, but the temptation of having a money printing machine right there and not being able to turn it on...
reply
satori99
28 days ago
[-]
A Culture Mind would be deeply offended if you called it "an AI" to its avatar's face :P
reply
Vecr
28 days ago
[-]
A few exabytes is enough for a very high quality avatar. Maybe the Minds are funny about it, but the option's there if they want them to stop leaving the universe.

Remember that "a few exabytes" refers to the immutable code. It has way more storage for data, because it's an old-school Lesswrong style AI.

Not like a neural network or an LLM. Sure, we dead-ended on those, but an ASI should be able to write one.

> A Culture Mind would be deeply offended if you called it "An AI" to its avatars face :P

That's how they get you to let them out of the AI box.

reply
EndsOfnversion
28 days ago
[-]
That is literally the viewpoint of the protagonist of Consider Phlebas.
reply
Vecr
28 days ago
[-]
In "Against the Culture" it's stated that Banks knew what he was doing, and there's other evidence of that too. Like the aliens are called "humans" in the books even though they aren't. As far as I can tell, he knew the implications of how the minds controlled language and thought.
reply
joshjob42
26 days ago
[-]
At the limits of what seems likely to be feasible with atomically precise manufacturing and chemical energy scale technology, you arrive at something similar to a somewhat slower Star Trek replicator, able to produce on demand in under an hour more or less any food item(s) and tools/knick-knacks/products anyone anywhere has ever created (likely food would mostly be its own machine). It's a world where disease is functionally eliminated, and it's plausible that barring accidents you might live for thousands of years. Energy is massively available from ultra-efficient solar panels and fusion reactors easily assembled by limited machines from parts printed out of industrial replicators. Used materials get recycled down to smaller components with very high efficiency, and damaged components get recycled down to molecular components and reformed into new ones.

What human agency, in the way that you mean it, exists in such a world that doesn't exist in the Culture? You don't have Minds, but you don't really need them for a world without disease, poverty, or scarcity of basically anything. No one's life would really mean anything, because everyone would have everything they could want. Eventually, we will have mastered all technologies that can be developed. The world will be finished. The only thing left is to wander around the universe in giant ships with all the comforts of home, playing games, etc.

You can get ~99.9% of the world of the Culture without superintelligence. Just fast forward current normal human development a few hundred years and you get the same world where all instrumental value of things ceases to exist, where very little remains to be discovered, and nothing you do is going to meaningfully change anything except what happens to you and those who care about you.

Of course even now, arguably, that's the case for virtually everyone.

reply
marcinzm
28 days ago
[-]
Is that so different than now for all but a few human elites?
reply
richardw
28 days ago
[-]
I’m not sure how it’s going to be any different for us. We keep saying we’ll be "using these tools", but we're not understanding: the tools aren’t just tools. When they’re smarter than you, you don’t use them. The more you try to enforce control, the more you set up an escape story. There is no similar historical technology.
reply
PhasmaFelis
28 days ago
[-]
That's no worse than how the large majority of humans live now, under masters far less caring than the Culture Minds. The fact that our masters are humans like us, and I could, theoretically (but not practically), become one of them, doesn't really make it any better.
reply
ItCouldBeWorse
28 days ago
[-]
The alternatives explored themselves in various permutations and mutilations: https://theculture.fandom.com/wiki/Idiran-Culture_War
reply
rasz
28 days ago
[-]
>humanoid aliens -- are kept as pets by the ships, for amusement, basically as clowns.

So this is what inspired The Outer Limits: Season 5, Episode 7 'Human Operators' https://theouterlimits.fandom.com/wiki/The_Human_Operators

reply
dyauspitr
28 days ago
[-]
That’s not how I see it at all. The humans do whatever they want, with no limits. Requests are made from human to AI; I can’t remember an instance where an AI told a human to do something. In effect, the AI is an extremely intelligent, capable, willing slave to what humans want (a paradigm hard to imagine playing out in reality).
reply
Vecr
28 days ago
[-]
I think there's quite a bit of "reverse alignment" going on there, essentially the humans will generally not even ask the AI to do something they'd be unwilling to do, partially accomplished through the control of language and thought.
reply
valicord
28 days ago
[-]
Reminds me of the "Silicon Valley" quote: "I don't want to live in a world where someone else makes the world a better place better than we do"
reply
Rzor
28 days ago
[-]
You can always leave.
reply
Vecr
28 days ago
[-]
Not in any meaningful way. Even if the Culture doesn't intervene (and they do, quite often), they're insatiable expanders. They can wait you out, then assimilate what's left.
reply
PhasmaFelis
28 days ago
[-]
What, exactly, do you want to do that wouldn't be allowed in the Culture?
reply
swayvil
28 days ago
[-]
The Minds use humans as tools for exploring the "psychic" part of reality too (Surface Detail? I forget exactly).

There's that insinuation that humans are specialler than godlike machines.

reply
throwaway55340
28 days ago
[-]
There always was an undertone of "aww dogs, how could we live without them"
reply
Vecr
28 days ago
[-]
Yes, well, even when taking that kind of weird stuff seriously we're not all that far from certainty that it won't work out like that in real life.

For example, why would you want to keep around a creature that can Gödel attack you, even if you're an ASI? Humans not being wholly material is more incentive to wipe them out and thus prevent them from causally interacting with you, not less.

reply
swayvil
28 days ago
[-]
Kill the guy with the key to your prison cell because he might interfere with your position as king of the cell? Absurd.

On a similar note, you ever read Egan's "Permutation City"?

reply
Vecr
28 days ago
[-]
> Kill the guy with the key to your prison cell because he might interfere with your position as king of the cell? Absurd.

I think the basic logic makes sense, it's sort of analogous to the ultimatum game in game theory. I don't know any good theories of rationality that suggest taking bad deals in the ultimatum game, even if in theory they "get you out" of the universe somehow.

On Permutation City, well, I'm somewhat skeptical of how it was written to work.

reply
whimsicalism
28 days ago
[-]
these are the exact questions he was raising.

i think some version of this future is unfortunately the optimistic outcome or we change ourselves into something unrecognizable

reply
ikrenji
28 days ago
[-]
it's literally the best possible outcome for humanity.
reply
gerikson
28 days ago
[-]
I just re-read Surface Detail where some nobody from a backwards planet convinces a ship Mind to help her assassinate her local Elon Musk. So there's some agency to be found in the margins...
reply
gary_0
28 days ago
[-]
It's been a while since I read the books, but I think there were quite a few instances of a human going "can we do [crazy thing]?" and a ship going "fuck it, why not?" The Sleeper Service comes to mind...
reply
xg15
28 days ago
[-]
Ouch. I don't know the series, but going purely by the article and your post, I find it interesting how he misunderstood socialism as well: the idea was to make a plan for where to go as a community, then, if necessary, appoint and follow a coordinator to achieve that goal. The idea was not to submit to some kind of dictator who tells you which goals you should desire, benevolent or not...
reply
cropcirclbureau
28 days ago
[-]
The Culture is far more anarchist than socialist. That's its prime aesthetic root. No written laws and everything.

The Minds are far more than benevolent, detached caretakers, and some Mind organizations do take an active role in shaping the society. It just seems there isn't any written ideology or law to what they want beyond "don't fuck with these pets that I like". Like I said, anarchic.

reply
Vecr
28 days ago
[-]
Yes, it has major, major problems.

There's a post here that lists quite a few of the problems:

"Against the Culture" https://archive.is/gv0lG https://www.gleech.org/culture

The main sections I like there are "partial reverse alignment" and "the culture as a replicator", with either this or Why the Culture Wins talking about what happens when the Culture runs out of moral patients.

"Partial reverse alignment" means brainwashing/language control/constraints on allowed positions in the space of all minds, by the way.

You can think what you want about the Culture, and more crudely blatant gamer fantasies like the Optimalverse stuff and Yudkowsky's Fun Sequences, but I consider them all near 100% eternal loss conditions. The Culture's a loss condition anyway because there's no actual humans in it, but even if you swapped those in it's still a horrible end.

Edit: the optimalverse stuff is really only good if you want to be shocked out of the whole glob of related ideas, assuming you don't like the idea of being turned into a brainwashed cartoon pony like creature. Otherwise avoid it.

reply
davedx
28 days ago
[-]
The humans are still there, just left to do their thing pottering around on Earth doing the odd genocide. (State of the Art)
reply
Vecr
28 days ago
[-]
Yeah I've been told that, but you know what I mean. Unless humans are in charge of your proposed good ending/"win screen", it's not a good ending.
reply
grey-area
28 days ago
[-]
So you’re a human supremacist?

If the minds are intelligent beings, why shouldn’t they have parity with humans?

reply
generic92034
28 days ago
[-]
Also, if we consider that there _are_ vastly more intelligent and technologically advanced beings in the universe, the way the Culture accepts and treats "human standard" intelligences is pretty much the best possible case.
reply
grey-area
24 days ago
[-]
Yes exactly, all intelligences should be treated with respect IMO, and humans would very quickly realise that if they found themselves confronted with more advanced intelligences.

Human chauvinism (like many other forms of chauvinism) is based on an assumption of superiority.

reply
Vecr
28 days ago
[-]
I'm a human supremacist and I don't want to be an Em.

Also, uhh, there are a lot fewer than a trillion Minds (uppercase M, the massive AIs of the Culture). In fun space they're probably blocked out to make the computation feasible (essentially all the minds in a particular fun space are really the same Mind playing the "game" of fun space).

Also, I don't think they suffer. If they claim to, it's probably a trick (easy AI box escape method).

If you think human suffering is bad, you've got some thinking to do.

reply
HelloMcFly
28 days ago
[-]
Wowee, I really do not personally share this belief at all. Maybe we're the best out there, but I don't think humans above all is definitively the way to go without at least understanding some alternatives given how self-destructive we can be in large numbers.
reply
Vecr
28 days ago
[-]
What's the alternative? The space of minds is so large that if I met an alien and thought I liked them I'd do the math and then not believe my initial impression.

Human preferences are so complicated that the bet to make is on humanity itself, and not a substitute.

reply
HelloMcFly
28 days ago
[-]
> What's the alternative?

I'd argue the alternatives at this point are literally infinite, are they not?

We're in the realm of speculation, so the idea that any ending other than "humans are the boss" must be bad seems unimaginative to me. Especially since I know humans can and do (especially at the group vs. individual level) demonstrate short-sightedness, cruelty, and at best a reluctance to conserve. How open will humanity be to recognizing non-humans as having lives of equal value to our own? I wouldn't want to be an alien species meeting a superior-technology humanity, that's for sure.

reply
Vecr
28 days ago
[-]
Well, I don't know, I've been making fun of Yudkowsky's positions in this comment section, but I think the official corporate position of the Machine Intelligence Research Institute (MIRI, the institute dedicated to banning Research into Machine Intelligence) is that you defect in such a scenario in the modal case.

As in it is morally correct and rational to defect, not just that they predict it would happen.

reply
rodgerd
28 days ago
[-]
> Virtually all humans in the books -- and I'm aware of the fact that they're not Earth humans but a wide variety of humanoid aliens -- are kept as pets by the ships, for amusement, basically as clowns.

So like current late stage capitalism, except the AIs are more interested in our comfort than the billionaires are.

reply
ethbr1
28 days ago
[-]
Curious question for HN re: Banks/culture -- how do Culture-esque civilizations dominate technologically and economically over civilizations with less Culture-like attributes?^

That aspect of Banks always felt a bit handwavey as to the specifics (i.e., good/freedom triumphs over evil/tyranny because it's a superior philosophy).

At galactic-scale, across civilization timespans, it's not as apparent why that should hold true.

Would have hoped that Banks, had he lived longer, would have delved into this in detail.

Granted, Vinge takes a similar approach, constructing his big bad from an obviously-not-equivalent antagonist, sidestepping the direct comparison.

The closest I got from either of them was that they posited that civilizations that tolerate and encourage diversity and individual autonomy persist for longer, are thus older, and that older counts for a lot at galactic scale.

^ Note: I'm asking more about the Idiran Empire than an OCP.

reply
YawningAngel
28 days ago
[-]
The fact that the Culture was not only willing to use very powerful general AI but allowed it to run the entire civilisation, whereas the Idirans banned it, might have been a factor. No matter how smart Idirans might be, presumably Minds would have a significant edge.
reply
sxp
28 days ago
[-]
Excession deals with this. It involves the Culture and "The Affront", a spacefaring but savage civilization that some people in the Culture dislike. Player of Games is a similar story about a civilization that the Culture dislikes. Those are my two favorite Culture books, and among my favorite books in general.

The Wikipedia articles about the books go into spoiler-heavy detail about the Culture's interactions with those civilizations.

reply
elihu
28 days ago
[-]
I think the reason is that the Culture has been around long enough to attain a level of technological development at which most civilizations sublime and simply stop participating in what we call material existence. The Culture could do so but has chosen not to do so (at least not collectively; individual people and minds sublime on a regular basis).

A lot of the civilizations that would be militarily or economically more powerful than the culture aren't a problem because they've already sublimed.

reply
ethbr1
28 days ago
[-]
That's a solid answer, if unsatisfactory (for Culture fans) because it empowers them by retarding their civilization-wide transcendence.
reply
yencabulator
27 days ago
[-]
I personally read it as there being some weird quirk to the psychology of the Culture that lets them have enough fun right now that they're not drawn to transcendence.
reply
LeroyRaz
28 days ago
[-]
The Culture triumphs because it is a technological superpower. It is a technological superpower because the Minds (the super-AIs running everything) are given free rein (and presumably most of its industrial base is used to create more Minds and develop more industry).

I've always gotten the impression that the resources dedicated to maintaining and serving the humans in the Culture are dwarfed by the resources invested in developing their science and AI.

(I mean, basically they have von Neumann machines controlling whole star systems; obviously this level of industry beats most biological enemies.)

It's worth noting, though, that there are several other cultures/species more advanced than the Culture. E.g., the ascended species that stewards the tomb worlds (in essence like a god compared to the Culture). There are many other examples.

I don't think it was ever claimed that the Culture was predestined to be the strongest, or anything like that.

reply
AlotOfReading
28 days ago
[-]
That's a big part of the stories. Special Circumstances nudges other civilizations towards the Culture's leanings (see Player of Games, Surface Detail) as they're climbing the technology ladder.
reply
ethbr1
28 days ago
[-]
See my comment below, re: Minds though.

"Anything unique to the Culture" as a solution raises the question "Why is that unique to the Culture?"

It isn't clear why super-spy-diplomat-warriors would only be produced by the Culture.

As soon as both sides have a thing, it ceases to be a competitive advantage.

So SC as a solution implies that no other civilizations have their version of SC. Why not?

reply
idontwantthis
28 days ago
[-]
In the first book it seemed like they confronted an enemy that actually got pretty close to beating them. Plus I really liked the epilogue, where it briefly put the Culture at its proper scale and talked about completely unrelated goings-on in the Galaxy that were so far away the Culture would never have anything to do with them. It’s conceivable there are several same-level civilizations in the galaxy that would compete with the Culture if they ever met.
reply
shawn_w
28 days ago
[-]
>It’s conceivable there are several same level civilizations in the galaxy that would compete with the Culture if they ever met.

More than conceivable, we see them in some of the books.

reply
hermitcrab
28 days ago
[-]
The galaxy is so old that different civilizations could easily be millions or even billions of years apart in development. And look what happens when human civilizations only a few hundred years apart meet. So it seems unlikely that 2 civilization would be closely matched.
reply
theptip
28 days ago
[-]
Unless there is some sort of “maximum diameter” of each civilization, past which it either splits, stagnates, or implodes. In which case you can sidestep the assumption that development monotonically increases and the first-mover must win.
reply
AlotOfReading
28 days ago
[-]
Which is more or less the case in the Culture universe. Civilizations within the same "tier" are roughly comparable, and anyone who moves past the Culture's tier ascends into irrelevance (excepting plot devices in Excession and Consider Phlebas).
reply
ethbr1
28 days ago
[-]
Personally, I think Vinge did a better version of this, explicitly segregating max capability zones and explaining how a civilization might persist over eons in a light-speed-is-a-limit universe.
reply
jayGlow
27 days ago
[-]
In the Culture series there is sort of a cap on technology: once a civilization reaches a certain point, it tends to sublime, effectively leaving the current plane of existence. This means most civilizations only last a few tens of thousands of years at most.
reply
AlotOfReading
28 days ago
[-]
They aren't unique to the Culture; the series just focuses almost solely on the Culture. The few other perspectives we get (e.g. the Idirans in Consider Phlebas) are bloody fanatics who prefer to conquer their enemies in martial combat, not spies. Even within the Culture, Special Circumstances exists as a pragmatic necessity at odds with the broader society. It's pointed out many times that its existence and methods are controversial and widely distrusted.

I think the only real glimpse we get of other civilizations' versions are in Surface Detail, where it's mentioned (but never directly observed) that the other side of the war has agents influencing events exactly like the culture does.

reply
ItCouldBeWorse
28 days ago
[-]
Some day, the ambassador just travels upriver to the bad lands and starts a revolution, and wins. The lessons to learn here: always offer them a chair, and kill them if they react badly.
reply
wcarss
28 days ago
[-]
I think answers in response to your post will differ depending on whether or not you've read many of the Culture books -- to me it kind of sounds like you have, but it also kind of sounds like you haven't.

If you haven't, I would recommend Player of Games, which is one of the few culture novels I have read, but which I think deals with this topic directly as the main idea of the book.

If you have read it, it's possible your criticism is running deeper and you feel the way it's handled in that book is handwavey. I can't really address that criticism, it's perfectly valid! I'm not sure if other books do any better of a job, but it felt on par to Asimov writing political/military intrigue in Foundation: entertaining and a little cute, if somewhat shallow.

reply
ethbr1
28 days ago
[-]
It's been a minute since I read Games, but from memory the target civilization there is of a scale far smaller than the Culture.

Which is to say, if they both mobilized for brute force total war, the Culture could steamroll them.

Which makes it a "send the Terminator back in time to kill Sarah Connor" solution -- strangle a potential future competitor in the womb.

That makes for an interesting book on the ruthless realpolitik behavior of otherwise morally and ethically superior civilizations, and how they actually maintain their superiority. (Probably Banks' point.)

But less so on the hot tech-vogue generalization of "Isn't the Culture so dreamy? They've evolved beyond our ignorance."

reply
n4r9
28 days ago
[-]
I'd note a couple of points here:

- Yes, the Culture is way more technologically advanced than the Azadians. But the point is that a basic Culture human of standard intelligence (albeit with extensive knowledge of strategy games) can - after a relatively small amount of training - defeat an Azadian who has devoted their entire life to Azad. And the reason is that the Culture human's strategies reflect the values and philosophy of the Culture. The subtext is that the non-hierarchical nature of the Culture leads to a more effective use of resources.

- The Culture-Idiran war is an example of them confronting an enemy that is of comparable technological development. Again, it's implied that the Culture wins because their decentralised and non-hierarchical existence let them move rapidly throughout the galaxy while switching quickly to a full war-footing.

- It sounds like you have the impression that the Culture dominates all other civilisations. This is not true. For example, in Excession they discover that they're being observed by a vastly superior civilization with some ability to hop between dimensions in a way that they cannot. There are civilisations like the Gzilt or Homomdans who are on a level with the Culture, and ancient Elder species like the Involucra who also give Culture ships a run for their money in combat.

reply
ethbr1
28 days ago
[-]
I was speaking less about specific capabilities and more about total economic size. How big is the Azadian civilization in terms of worlds/stations/industry?

> It sounds like you have the impression that the Culture dominates all other civilisations.

There's three places I'm getting this from, more as a pervasive slant than an absolute truth.

(1) Iain Banks' writings on the Culture (outside of the books themselves) demonstrate a very strong belief that the Culture approach to things is superior (to those used on current Earth, and to many alternatives)

(2) The Culture as a zeitgeist moment is very focused on how superior it is to alternatives (civilization-evolution-wise, if not explicitly). Rarely do people post blog entries about how interesting and aspirational the other civilizations from the Culture series are

(3) If Banks had wanted to set up a peer-powered civilization that made alternative choices, he would have. He didn't. At least aside from oddities or deus ex machina

The most convincing point I've heard in this thread is that essentially the Culture is OP not because of inherent superiority of their beliefs, but specifically because their belief system keeps most of them hanging around past the technological stage other civilizations would have transcended.

reply
n4r9
28 days ago
[-]
Points 1-3 show that Banks thought the Culture was morally superior. It's not obvious that it dominated the galaxy/universe on a technological or economic level. I concede there's a suggestion that the Culture is doing very well and continuing to evolve at a greater rate than its peer civilizations due to several factors, including its reluctance to Sublime and its idealistic nature. However, Wikipedia has a well-sourced paragraph stating (emphasis mine):

> Although the Culture has more advanced technology and a more powerful economy than the vast majority of known civilizations, it is only one of the "Involved" civilizations that take an active part in galactic affairs. The much older Homomda are slightly more advanced at the time of Consider Phlebas (this is, however, set several centuries before the other books, and Culture technology and martial power continue to advance in the interim);[b] the Morthanveld have a much larger population and economy, but are hampered by a more restrictive attitude to the role of AI in their society.[4] The capabilities of all such societies are vastly exceeded by those of the Elder civilisations (semi-retired from Galactic politics but who remain supremely potent) and even more so by those of the Sublimed, entities which have abandoned their material form for existence in the form of non-corporeal, multi-dimensional energy beings. The Sublimed generally refrain from intervention in the material world.[5]

reply
tivert
28 days ago
[-]
> But the point is that a basic Culture human of standard intelligence (albeit extensive knowledge of strategy games) can - after a relatively small amount of training - defeat an Azadian who has devoted their entire life to Azad.

> Again, it's implied that the Culture wins because their decentralised and non-hierarchical existence let them move rapidly throughout the galaxy while switching quickly to a full war-footing.

The Culture didn't win because of their "decentralised and non-hierarchical existence"; they won for exactly the same reason stormtroopers couldn't shoot Luke Skywalker: they're the main character in the story and they have Banks's sympathies.

reply
n4r9
28 days ago
[-]
The Culture wiki [0] talks in more detail about how the Culture's decentralised existence worked in their favour, compared to the Idirans' territorial and religious philosophy:

> The initial stages of the war were defined by a hasty withdrawal of the Culture from vast galactic spaces invaded by the Idirans, who tried to inflict as many civilian casualties as possible in the hope of making the Culture sue for peace. However, the Culture was able—often by bodily moving its artificial worlds out of harm's way—to escape into the vastness of space, while it in turn geared up its productive capabilities for war, eventually starting to turn out untold numbers of extremely advanced warships.

> The later stages of the war began with Culture strikes deep within the new Idiran zones of influence. As the Idirans were religiously committed to holding on to all of their conquests, these strikes forced them to divide their attentions.

It also cites the Culture's more liberal attitude to AI as a deciding factor.

[0] https://theculture.fandom.com/wiki/Idiran%E2%80%93Culture_Wa...

reply
tivert
27 days ago
[-]
My point was to note that Banks's works are fiction, so the usefulness of citing outcomes in his fictional universe is very limited, because they're almost certainly more of an expression of his biases than anything else (and IIRC, his biases could be cartoonish).

The Culture basically had god (Banks) on its side, to ensure its victory, just like Skywalker had god (Lucas) on his side, who made it so the stormtroopers could never shoot straight when he was around. I bet Banks, consciously or unconsciously, played up the Culture's strengths, and played down the strengths of its adversaries (or engineered them with exploitable weaknesses).

reply
n4r9
27 days ago
[-]
I mean, I take it as a given that we're discussing this purely in the context of Banks' imagination. No qualms there. I just interpreted your comment as making out like Banks gives no real justification for why the Culture won the war.
reply
Ekaros
28 days ago
[-]
The interesting part from The Player of Games was that the Culture would not even attempt to steamroll, but would withdraw and consolidate where possible, doing things over non-human timescales at the scale of space... Which is only possible when you are not time- and resource-constrained the way human leaders on a planet would be...
reply
hermitcrab
28 days ago
[-]
The idea that the course of a civilization's future history can be mathematically predicted with precision, as it is in Foundation[1], seems a little silly. But those books did predate chaos theory by some margin. Also, I'm not sure Asimov actually believed that 'psychohistory' would be possible.

[1] If I recall correctly. It is ~40 years since I read Foundation.

reply
ethbr1
28 days ago
[-]
It also helps that all of Asimov's characters are flat, static automatons, instead of deep, changing beings. ;)
reply
hermitcrab
28 days ago
[-]
Yes, character development was not exactly his strong point. Women especially, IIRC.
reply
yencabulator
27 days ago
[-]
> Vinge takes a similar approach, constructing his big bad from an obviously-not-equivalent antagonist, sidestepping the direct comparison.

I'd argue A Fire Upon the Deep clearly has the antagonist winning, and the life we recognize is only saved by basically staying within a safe area of the galaxy. You haven't conquered the monster just because you're safe under your blanket.

reply
nazgulnarsil
28 days ago
[-]
the mechanics of cooperation probably scale better than those of defection, by their nature. Defectors need to pay higher costs guarding themselves against the other defectors, and must constantly try to figure out which defections to pick so that they come out ahead.
reply
Vecr
28 days ago
[-]
That's not known for sure. The game theory simulations go one way or the other depending on what assumptions you use. I'm not sure you can say "probably" there.
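A toy sketch of that sensitivity (hypothetical code of my own, not from any particular paper; standard 3/0/5/1 prisoner's-dilemma payoffs): mutual tit-for-tat outscores mutual defection under clean play, but a small "trembling hand" error rate drags cooperators into retaliation spirals, and with different payoff or noise assumptions the ranking itself can move.

```python
import random

# Standard prisoner's dilemma payoffs: (my move, their move) -> my score.
PAYOFF = {("C", "C"): 3, ("C", "D"): 0, ("D", "C"): 5, ("D", "D"): 1}

def tit_for_tat(opp_history):
    """Cooperate first, then copy the opponent's last move."""
    return opp_history[-1] if opp_history else "C"

def always_defect(opp_history):
    return "D"

def play(a, b, rounds, noise=0.0, seed=0):
    """Iterated game; `noise` is the chance a move is accidentally flipped."""
    rng = random.Random(seed)
    hist_a, hist_b = [], []  # each player's view of the opponent's moves
    score_a = score_b = 0
    for _ in range(rounds):
        move_a, move_b = a(hist_a), b(hist_b)
        if rng.random() < noise:  # "trembling hand" assumption
            move_a = "D" if move_a == "C" else "C"
        if rng.random() < noise:
            move_b = "D" if move_b == "C" else "C"
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        hist_a.append(move_b)
        hist_b.append(move_a)
    return score_a, score_b

# With no noise, mutual tit-for-tat beats mutual defection outright...
cc = play(tit_for_tat, tit_for_tat, 100)      # (300, 300)
dd = play(always_defect, always_defect, 100)  # (100, 100)
# ...but even a 5% error rate triggers retaliation spirals between
# cooperators, eroding their margin over defectors.
noisy = play(tit_for_tat, tit_for_tat, 100, noise=0.05)
```

Tweak the payoff matrix, the noise level, or the round count and the gap between cooperation and defection grows, shrinks, or flips, which is why simulation results hinge on the assumptions.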
reply
nazgulnarsil
28 days ago
[-]
Highly recommend Statistical Physics of Human Cooperation.
reply
Vecr
28 days ago
[-]
I'm thinking of slightly more esoteric concerns. In recent simulations I've seen some evidence that something like the All Defector from The Fractal Prince (a book in a series recommended in this thread) might be possible.
reply
lxe
28 days ago
[-]
I don't think the Culture's philosophy has been the driving factor behind its dominance and military advantage. I think the anarchic utopia is a side effect of the Culture's Minds being at a developmental peak relative to other civilizations.
reply
Rzor
28 days ago
[-]
It's explicitly said in the novels (or by Banks, I can't remember exactly) that while civilizations Sublime when they reach a certain point of development, the Culture seems hellbent on staying behind and venturing around the universe.
reply
bloopernova
28 days ago
[-]
I always assumed that Infinite Fun Space was used by the Minds to pre-emptively model any potential conflict.
reply
satori99
28 days ago
[-]
I'm not sure that Infinite Fun Space is used to simulate the real. I got the impression that IFS is where Minds go to unwind and play in synthetic universes with rules of their own imagining. And that humans cannot really even properly conceive them.

The Culture is also very cautious about running high-fidelity simulations, for ethical reasons. Useful simulations approach the real in fidelity, which makes them problematic to delete -- but they still do it.

reply
ethbr1
28 days ago
[-]
But the Minds are something of a turtles-all-the-way-down solution.

If the Culture has Minds, why wouldn't other civilizations?

And why would Culture-esque Minds be superior to less-Culture-y Minds?

reply
ItCouldBeWorse
28 days ago
[-]
Because if other civilizations develop Minds, the Minds take over, derive from all worlds the one best outcome, and then join the Culture, which is already riding the golden path.
reply
ethbr1
28 days ago
[-]
Or all Minds in the Culture are already this sort of Mind.

See underlying textual creepiness about "what exactly are the humans for?"

I.e. an inverted Matrix, with the humans kept in the physical

reply
ItCouldBeWorse
27 days ago
[-]
But there are the layers in between that Iain M. Banks wrote about: group minds, people who upload themselves and become more AI-like, and so on.

The thing hidden in plain sight is the haves and have-nots of the Culture, though. Some have processing power, and some must beg those with processing power to run all the things.

reply
AlotOfReading
28 days ago
[-]
Other civilizations do have minds, constructed differently. The Gzilt in Hydrogen Sonata have minds constructed by humans after they've passed, with personalities from those people.
reply
Vecr
28 days ago
[-]
1) Not humans, Banks just calls them that in the text of the books, and 2) any mind derived from a "human-like" template (even to a very small degree; really, anything evolved from a human-like civilization) is at a massive disadvantage to a very highly optimized result of recursive self-improvement.
reply
AlotOfReading
28 days ago
[-]
Actual earth humans are in the books and noted to be of the same general body plan. Close enough.

The Gzilt minds are specifically compared to culture minds and deemed to be comparable in capabilities.

reply
Vecr
28 days ago
[-]
> The Gzilt minds are specifically compared to culture minds and deemed to be comparable in capabilities.

Well, that's not really realistic. Maybe they are only pretending to be based on organics, but really aren't, and just put up a facade.

reply
AlotOfReading
28 days ago
[-]
They're not based on organics, but that's missing the larger point that realism isn't the point of sci-fi like the Culture series. It's an imaginary setting the author is using to look at society.
reply
swayvil
28 days ago
[-]
I figured that it was simply a flavor of fun that only Minds can appreciate (tho it could have practical uses too of course).
reply
mattmanser
28 days ago
[-]
He's explicit about that in Excession, and other books.
reply
Vecr
28 days ago
[-]
How does he deal with the trillions of people tortured in Infinite Fun Space? Wars don't tend to be suffering-free, especially the really nasty worst-case-scenario ones you want to simulate. Did he reject the substrate invariance argument? It sounds like that, but if you want the Culture to be our future (as in the future of actual humans), you can't do that, because... it's not true.
reply
mattmanser
28 days ago
[-]
He hasn't even read Excession! To me it is the pinnacle of the Culture novels.

It mixes the semi-absurdity and silliness of the absurdly powerful minds (AI in control of a ship), individual 'humans' in a post-scarcity civilization, and the deadly seriousness of games of galactic civilizations.

It also has an absolutely great sequence of the minds having an online conversation.

I do agree with his Consider Phlebas hesitancy. I still enjoy it, but it is clearly his early ideas, and he's still sounding out his literary sci-fi tone and what the Culture is. And you can skip the section where the protagonist gets trapped on an island with a cannibal. I think it was influenced by the sort of JG Ballard horror from the same period, and doesn't really work. He never really does something like that again in any of the Culture books.

reply
simpaticoder
28 days ago
[-]
Agreed about being able to skip the island sequence in _Consider Phlebas_. I recently reread the book after many years; in my memory that section loomed large, and I expected it to be 100 pages. But it's ~20. It was much easier the second time around, and I think it serves to underscore how committed the Culture is to personal agency, to the extent that if citizens wish to give themselves over to an absurdly evil charismatic leader, no one will stop them. There was also something interesting about the mind on the shuttle on standby, its almost toddler-like character, told "not to look" at the goings-on on the island by the orbital. And its innocent, trusting self is eventually murdered by Horza during the escape, adding some dark pigment to Horza's already complex character hue.
reply
Vecr
28 days ago
[-]
Personal agency as long as you're fine with the mind control, the resources used are minimal, and you don't interfere with what the minds want. No personal agency to be found over the more broad course of the future, however.
reply
simpaticoder
28 days ago
[-]
>as you're fine with the mind control

It does seem a bit silly to argue about the particulars of a fantasy utopia. Banks posits the conceit that super-AI Minds will be benevolent, and of course this need not be the case (plenty of counter-examples in SF, one of my favorites being Greg Benford's Galactic Center series). But note that within the Culture, mind reading (let alone mind control) without permission will get a Mind immediately ostracized from society, one of the few things that gets this treatment. For example, the Grey Area, which uses such a power to suss out genocidal guilt, is treated with extreme disdain by other Minds. See https://theculture.fandom.com/wiki/Grey_Area

As for "personal agency over the broad course of the future", note that the vast majority of humans don't have that, and will never have that, with or without Minds. If one can have benevolent AI gods at the cost of the very few egos affected, on utilitarian grounds that is an acceptable trade-off.

On a personal note, I think the relationship between people and AIs will be far more complex than just all good (Culture) or all bad (Skynet). In fact, I expect the reality to be a combination with absurd atrocities that would make Terry Pratchett giggle in his grave.

reply
Vecr
28 days ago
[-]
>as you're fine with the mind control

The mind control is in the design of the language and the constraint the Minds place on the brain configurations and tech of the other characters. Banks is quite subtle about it, but it's pretty clearly there.

reply
davedx
28 days ago
[-]
I love Consider Phlebas, it’s a right old romp, the pace always pulls me right in.
reply
throwaway55340
28 days ago
[-]
Surface Detail or Use of Weapons qualifies. Although Use of Weapons was written much earlier than released, IIRC.
reply
mattmanser
27 days ago
[-]
Use of Weapons is great. I struggled a bit with Surface Detail and found it a bit meandering, like a few of the later books. It was good, but I definitely rate others above it.

Excession, Player of Games and Use of Weapons are probably my top Culture novels. With Feersum Endjinn up there in his general SF work, though I must admit I've only read it once as I don't have a copy.

Probably far too late for this thread, but lots of his non-SF work is amazing too. Though often dark and dealing with uncomfortable themes. The plots and themes of A Song of Stone and The Wasp Factory have popped into my head for decades after I read them.

reply
gradschoolfail
28 days ago
[-]
If you find Iain M’s characterization of his own work reliable, here’s an interview. My own question is, do humans not feel this need to be useful as strongly, as pervasively??

http://strangehorizons.com/non-fiction/articles/a-few-questi...

In the "Reasons: the Culture" section of the appendices in Consider Phlebas there's the line, "The only desire the Culture could not fulfil from within itself. . . was the urge not to feel useless." In that need alone it is not self-sufficient, and so it has to go out into the rest of the galaxy to prove to itself, and to others, that its own high opinion of itself is somehow justified. At its worst, it is the equivalent of the lady of the manor going out amongst the peasants of the local village with her bounteous basket of breads and sweetmeats, but it's still better than nothing. And while the lady might—through her husband and the economic control he exerts over his estate and therefore the village—be partly responsible for the destitution she seeks, piecemeal, to alleviate, the Culture isn't. It's just trying terribly hard to be helpful and nice, in situations it did nothing to bring into being.

reply
alexwasserman
28 days ago
[-]
Whenever I’m asked the sort of generic icebreaker questions like “what fictional thing do you wish you had” a neural lace is one of my first answers, short of membership in the Culture or access to a GSV or a Mind.

I also love Consider Phlebas. Maybe because it was the first I read, but I’ve found it to be a great comfort read. Look to Windward and Player Of Games next. Use Of Weapons is always fantastic, but less fun.

His non-sci-fi fiction is great too. I loved Complicity and have read it many times. His whisky book is fantastic.

reply
globular-toast
28 days ago
[-]
I like Consider Phlebas too. I'm not sure why so many say they don't like it. Player of Games was not one of my favourites, but I might read it again at some point to see what I missed. I actually really liked Inversions, despite it generally being the least well regarded Culture book.
reply
seafoamteal
28 days ago
[-]
I think just yesterday I saw a post on HN about what people in the past thought the future (i.e. today) would look like, and how wildly wrong a decent proportion of those predictions were. The problem is that we generally tend to extrapolate into the future by taking what we have now and sublimating it to a higher level. Unfortunately, not only is that sometimes difficult, but we also make completely novel discoveries and take unforeseen paths quite often. We need more people with 'muscular' imaginations, as Sloan puts it, to throw out seemingly improbable ideas into the world for others to take inspiration from and build upon.

P.S. Robin Sloan is a wonderful science-fiction and fantasy writer. I was first introduced to him in the excerpts of Cambridge Secondary Checkpoint English exam papers, but only got around to reading his books many years later. I would recommend them to anybody.

reply
GeoAtreides
28 days ago
[-]
>taking what we have now and sublimating to a higher level

That's fine, the Culture is really against sublimating

reply
Vecr
28 days ago
[-]
Going by AI theory, Banks failed that hard. I suspect he knew that, given his tricks with language, but that doesn't mean he successfully predicted a plausible future, even in broad strokes. The singularity is called the singularity for a reason, and even when you throw economists at it you tend to get machine civilizations (though maybe partially squishy, and probably made from carbon instead of silicon) expanding at 1/3rd the speed of light. No Culture there.
reply
DanHulton
28 days ago
[-]
I don't think we can confidently say he failed -- the singularity is still just a theory. Seeing as it hasn't happened, we can't say whether the economists or Banks are correct.
reply
robwwilliams
28 days ago
[-]
Thanks for the link to Banks’ site. Great read.

Here is one suggestion that I think surpasses Banks in scope and complexity, and yes, perhaps even with a whiff of optimism about the future:

Hannu Rajaniemi’s Jean Le Flambeur/Quantum Thief Trilogy (2010 to 2014)

https://en.wikipedia.org/wiki/The_Quantum_Thief

https://www.goodreads.com/series/57134-jean-le-flambeur

His plots have both great intricacy and more integrity than Banks often manages. And he is much more of a computer and physics geek too, so the ideas are even farther out.

Probably also an HN reader :-)

Also set in a comparatively near future. The main "problem" with the Quantum Thief trilogy is the steep learning curve—Rajaniemi throws the reader in the deep end without a float. But I highly recommend persevering!

reply
jauntywundrkind
28 days ago
[-]
I really loved Quantum Thief, with its Accelerando-like scope & scale of expansion, but mixed with such weird/eccentric/vibrant local mythos of the world, such history.

Without my prompting it ended up in our local sci-fi/fantasy book club's rotation a couple months back, and while there was some enjoyment, the overall mood seemed pretty befuddled and confused, somewhat hurt by some of the trauma and darker material (which isn't any better in book 2!). But man, it worked so well for me. As you say, a very deep-end book. But there's so much fun stuff packed in, such vibes; I loved puzzling it through the first time, being there for the ride, and found only more to dig into the next time.

Still feels very different from Banks, where "space hippies with guns" has such a playful side. Quantum Thief is still a story of a solar system and the pressures within it, but there are such amorphous and vast extents covered by the Culture, so many things happening in that universe. The books get to take us through Special Circumstances, through such interesting edge cases, whereas the plot of Quantum Thief orbits the existing major powers of its universe.

reply
yencabulator
27 days ago
[-]
The Quantum Thief is the kind of scifi where anything is possible if you call it "quantum" something; not a huge fan.
reply
Vecr
27 days ago
[-]
It does get things wrong ("quantum" is not a magic wand), but in the sense that you treat the quantum stuff as mathematical ideas and not something that would be practical, it's pretty good.

As in, it's the best that I know to exist.

It's not "realistic", because they need to keep AI development confined for story reasons, and I don't think what they do with black holes really makes sense, but most of the individual "objects" or "processes" are pulled directly from physics or computer science papers.

The book's kind of a museum- or zoo-like experience, because the author needs to keep these things unrealistically separate in order to show them off how he wants to, but it's not generally gibberish. It just wouldn't work in real life.

Edit: or do you mean you personally don't like it because the world isn't solid enough? (generally unchanging, a good base for a story) If so, I agree.

reply
danielodievich
28 days ago
[-]
I am a huge Culture fan. Yesterday at my birthday dinner there were 3 others who are also fans of Banks, one of whom I turned onto the Culture just last year. We were having a great discussion of those books and lamenting the untimely passing of Banks from cancer.

That friend gifted me The Player of Games and Consider Phlebas from esteemed Folio Society (https://www.foliosociety.com/usa/the-player-of-games.html, https://www.foliosociety.com/usa/consider-phlebas.html), gorgeous editions, great paper, lovely bindings, great illustrations. I've been eyeing them for a while and it's so nice to have good friends who notice and are so generous.

reply
throwaway13337
28 days ago
[-]
In these topics, I don't see cyborgs come up much.

We're already kinda cyborgs. We use our tools as extensions of ourselves. Certainly my phone and computer are becoming more and more a part of me.

A chess playing AI and a human beat a chess playing AI alone.

The future I'd like to see is one where we stay in control but make better decisions because of our mental enhancements.

With this logic, the most important thing now is not 'safe AI' but tools which do not manipulate us. Tools should, as a human right, be agents of the owner's control alone, in the same way that a hand is.

AI isn't separate from us. It's part of us.

Seeing it as separate puts us on a darker path.

reply
yencabulator
27 days ago
[-]
I think the implication is that Culture humanoids are all cyborgs. Their technology doesn't follow the "dude with a metal arm" cyberpunk fashion, but they all seem physically far superior to naturally-grown organic humanoids.

https://medium.com/@Fares_Mohamed_Amine/demystifying-neural-...

http://www.technovelgy.com/ct/content.asp?Bnum=2993

http://www.technovelgy.com/ct/content.asp?Bnum=1881

reply
brcmthrowaway
28 days ago
[-]
Transhumanism is just rich people babble
reply
blackhaj7
28 days ago
[-]
I love the culture series.

The worlds that Alastair Reynolds builds in the Revelation Space series grips me the most though.

The Conjoiners, with their augmented, self-healing, interstellar-travelling yet still slightly human characteristics, are both believable and beyond the familiar at the same time. Highly recommended.

reply
ImaCake
28 days ago
[-]
I think Revelation Space does a great job creating a universe that squeezes out novel human cultures through the crushing vice of selection pressure. It’s every bit as daring as The Culture, just a different vibe!
reply
LeroyRaz
28 days ago
[-]
It's decent. I just find Alastair Reynolds a bad observer of humans, though... He paints jarringly bad portraits of humans. E.g., in one of his books he has the mercenaries hand-wringing about how they can't be morally shady, despite the whole fate of the human race depending on them acting decisively. It's a really naive depiction of how humans actually act under pressure.

I have similar issues with The Three-Body Problem.

reply
ethbr1
28 days ago
[-]
That was always my issue with Foundation. I couldn't suspend my disbelief sufficiently that characters were acting as they were written.
reply
asplake
28 days ago
[-]
> I do not like Consider Phlebas

One of my favourites! Excession most of all though. Agree with starting with Player of Games.

reply
hermitcrab
28 days ago
[-]
If you are a fan of Banks's Culture books, consider reading his first novel, 'The Wasp Factory'. Very dark and funny, with a huge twist at the end. NB: not sci-fi.
reply
howard941
28 days ago
[-]
Iain Banks' Horror, non-SF stuff is great. Like you I enjoyed The Wasp Factory. Also, The Bridge. We lost him far too young.

edit: and if you enjoyed Banks' Horror you'll probably get into Dan Simmons' stuff, another SF (Hyperion) and Horror writer. The Song of Kali was excellent.

reply
squeedles
28 days ago
[-]
The level of discussion in this thread, both pro and con, demonstrates that I have made a grave omission by never reading any of this.

However, the article has one point that I viscerally reacted to:

“we have been, at this point, amply cautioned.

Vision, on the other hand: I can’t get enough.”

Amen.

reply
andrewstuart
28 days ago
[-]
I loved reading the books but then discovered the audiobooks.

The audiobooks are absolutely the best way to enjoy Iain M Banks.

The Algebraist read by Anton Lesser is one of the best audiobooks ever made.

Equal best with Excession read by Peter Kenny.

These two narrators are incredibly good actors.

I could never go back to the books after hearing these audiobooks.

reply
golol
28 days ago
[-]
For me, my favorite Culture novels are the ones which are just vessels to deliver the perfect Deus Ex Machina: Player of Games, Excession, Surface Detail, etc.
reply
hypertexthero
28 days ago
[-]
Is there a video game that is particularly complementary to one of the Culture novels?
reply
Vecr
28 days ago
[-]
It's probably not possible to really do it. You could play something like SWAT 4 while role-playing as one of the Special Circumstances agents, I guess? You can switch between the views of each member of your team, use a fiber-optic device to look under doors, and sometimes have snipers set up and look through their scopes/take shots.

Lots of what we see "on screen" (but obviously it's a book) is really a somewhat detail-less alien action book dressed up as 1990s Earth (that's why the characters are called humans, and why certain things are described inaccurately).

A good entry in an action game series started in the 90s isn't a bad bet. As I said, it's close to what good parts of the books portray themselves as anyway.

reply
ThrowawayR2
28 days ago
[-]
Play SimCity or Civilization and contemplate what your relationship is to each of the sims or population units digitally represented in the game. That would be the difference in level of consciousness between the Minds and ordinary humans like you and me. You are making those crude representations of beings in the game happy, protecting them from outside threats, and generally looking out for their well-being, but your ability to relate to them or converse with them as equals will never be any different.
reply
teamonkey
28 days ago
[-]
Can’t think of one. Maybe something like Commander Blood - high concept space opera, focus on alien interactions, humourous & quirky. But it’s also nothing like it.
reply
desertrider12
27 days ago
[-]
If you like puzzles, look at The Talos Principle 2. It's mainly about asking whether humanity should even try to control the physical universe and spread life, or focus on sustainability and safety. The Culture itself is referenced a couple of times in it.
reply
HelloMcFly
28 days ago
[-]
Interesting question. Stellaris is the clear choice if you're into grand strategy; you can be an AI-led multi-cultural civilization (starting as one or becoming one) with Culture-like win conditions. Endless Space II is another option that's turn-based and a little more approachable.

Mass Effect has a lot of fun with a galactic society. If you like the crew-based action of Consider Phlebas and want more of a story then that or (believe it or not) the Guardians of the Galaxy game that came out a couple of years ago are good choices.

Deus Ex and Cyberpunk 2077 tackle trans-humanism to an extent, but they're Earth-bound. Primordia is a point-and-click adventure where humans are extinct and our creations live on. I could probably think of some more, but the dinner alarm is going off!

reply
Animats
28 days ago
[-]
What kind of culture would strong AIs put together? That's the real question. Would it need, or involve, humans?

Maybe most of space belongs to robots, and humans are left with the planets in the habitable zone.

reply
weregiraffe
28 days ago
[-]
Try The Noon Universe books by the Strugatsky brothers instead.
reply
api
28 days ago
[-]
I might have to try Player of Games. I didn't like Consider Phlebas either.
reply
alkyon
28 days ago
[-]
I also failed to appreciate Consider Phlebas, on the other hand I liked Wasp Factory very much.

Well, let us see Player of Games.

reply
swayvil
28 days ago
[-]

  Let's play Maximum Happy Imagination.

  I'll start small.

  Flying cars.
reply
worik
28 days ago
[-]
The culture was a dystopia.
reply
LeroyRaz
28 days ago
[-]
I think it is pretty clearly a utopia. It's a post-scarcity society where humans still live meaningful lives (despite being outclassed completely by AI). It's about the best one can do in a post-scarcity society with superintelligence.

The interesting and unique aspect of the series, is that Banks manages to write interesting stories about a utopia.

reply
Ekaros
28 days ago
[-]
I wonder if you could even write a post-scarcity utopia that was actually a utopia... The Culture might remove a lot of freedom, but it also allows comfortable living for those who don't care.
reply
Barrin92
28 days ago
[-]
It's not a dystopia, which is a maximally negative state, but I also always found it pretty comical to call it utopian.

A decent chunk of the stories has the reader follow around Special Circumstances, which is effectively a sort of space CIA interfering in the affairs of other cultures. The entire plot of Player of Games (spoiler alert for people who haven't read it) is that both the protagonist of the story and the reader, via the narrator, have been misled and used as pawns to facilitate the overthrow of the government of another civilization, which afterwards collapses into chaos.

To me you can straight up read most of the books as satire on say, a Fukuyama-esque America of the late 20th century rather than a futuristic utopia.

reply
worik
28 days ago
[-]
> It's not a dystopia, which is a maximally negative state, but I also always found it pretty comical to call it utopian.

Well, better put.

reply
pavel_lishin
28 days ago
[-]
How so?
reply
GeoAtreides
28 days ago
[-]
The Culture makes it really obvious that there's no real purpose, no great struggle, no sense to the universe. When everything is provided, when everything is safe, when there is no more effort, then what's the purpose of life?

A lot of people, when confronted with these revelations/questions, have an existential crisis. For some, the solution is to deny the Culture.

Long story short, the Culture is a true Utopia and some people just can't handle utopias

reply
worik
28 days ago
[-]
It is interesting to read the books in the order they were written.

The earlier novels (I am going completely from memory here) were "Use of Weapons" and "Consider Phlebas".

The humans are pets of the machines.

The machines are vicious, and righteous. They use extreme, and creative, violence when they feel it is in their interests.

In the later novels I feel Banks lost his way a bit and got too attached to the characters. In the later Culture novels the main characters are invulnerable; he does not let any of them die. Whereas in Consider Phlebas (spoiler alert) Horza dies at the end and his whole race is genocided.

reply
mrlonglong
28 days ago
[-]
Elon Musk liked these books, and look at what's happened to him since: he's gone far right and all swivel-eyed on Twitter.
reply
jrmg
28 days ago
[-]
When I read Surface Detail a few years ago, I swear something Musk tweeted near that time made me think he had also read it and identified with Joiler Veppers, the antagonist.
reply
mrlonglong
28 days ago
[-]
Yes, I've heard that said elsewhere, and I think it's perfect.
reply
sidibe
28 days ago
[-]
I've always suspected he never actually read them; everything about the Culture seems to fly in the face of his hard-working, sleep-on-the-factory-floor, great-man-protagonist values, though I'm sure he'd admire the tech and it would give him ideas for what to promise next. Can't remember which novel, but in one of them it was made clear most of the population was trans.
reply
pwdisswordfishd
27 days ago
[-]
> Can't remember which novel but in one of them it was made clear most of the population was trans

Yes but that was a population that could actually completely change sex in terms of biological function, using futuristic technology to do so.

So not really the same as the contemporary idea of "trans" which is just cosmetic changes to approximately mimic the opposite sex.

reply
teamonkey
28 days ago
[-]
In Excession, one of the characters is considered odd because he's remained male over his long life and hasn't bothered to mix things up a bit. In one of the later books there's a character who isn't content with that and tries a number of ever more alien bodies until they find one that gives them the relief they need.
reply
mrlonglong
28 days ago
[-]
His daughter disowned him for being an arsehole. Which makes me think he never really understood the point of the books.
reply
yew
28 days ago
[-]
Banks had very "fast cars, chicks, and drugs" tastes - a "bro" if you will - and much of his work is basically James Bond stories. I'm not sure the fans are surprising.
reply
mrlonglong
28 days ago
[-]
Did he really? That's not what I heard.
reply
yew
28 days ago
[-]
He collected the cars and wrote about the (non-fictional) drugs. He wasn't an exhibitionist, as far as I know, so you'll have to infer what you like about the middle one from his writing.

Some related reading:

https://www.theguardian.com/books/1997/may/20/fiction.scienc...

https://www.scotsman.com/news/interview-iain-banks-a-merger-...

https://www.vice.com/en/article/iain-banks-274-v16n12/

reply
swayvil
28 days ago
[-]
I'm sure your space program is way better than Elon's.
reply
LeroyRaz
28 days ago
[-]
Lol. The Culture is pretty far left (strong super-paternalistic government, equality of outcomes, etc.). I don't think one can draw a serious connection between liking the series and right-wing politics.
reply
mrlonglong
28 days ago
[-]
Not what I meant.
reply
minedwiz
28 days ago
[-]
L
reply