Technology is central to how the future will unfold—that’s undoubtedly true—but technology is not the point of the future, or what’s really at stake. We are.
p. 285
The coming wave is going to change the world. Ultimately, human beings may no longer be the primary planetary drivers, as we have become accustomed to being. We are going to live in an epoch when the majority of our daily interactions are not with other people but with AIs. This might sound intriguing or horrifying or absurd, but it is happening.
p. 284
Intelligence, life, raw power—these are not playthings and should be treated with the respect, care, and control they deserve.
p. 277
Safety relies on things not failing, not getting into the wrong hands, forever.
p. 277
…ask not just what doing no harm means in an age of globe-spanning algorithms and edited genomes but how that can be enacted daily in what are often morally ambiguous circumstances.
p. 270
The more general a model, the more likely it is to pose a serious threat.
p. 261
Accountability is enabled by deep understanding. Ownership gives control.
p. 259
What I cannot create, I do not understand.
Richard Feynman
p. 259
Credible critics must be practitioners.
p. 253
One of the issues with LLMs is that they still suffer from the hallucination problem, whereby they often confidently claim wildly wrong information as accurate. This is doubly dangerous given that they are often right, to an expert level. As a user, it’s all too easy to be lulled into a false sense of security and assume anything coming out of the system is true.
p. 243
The central problem for humanity in the twenty-first century is how we can nurture sufficient legitimate political power and wisdom, adequate technical mastery, and robust norms to constrain technologies to ensure they continue to do far more good than harm. How, in other words, we can contain the seemingly uncontainable.
p. 228
Exponential change is coming. It is inevitable. That fact needs to be addressed.
p. 255
Make no mistake: standstill in itself spells disaster.
p. 220
That’s part of the problem; we don’t know what failure modes are being introduced and how deep they could extend.
p. 211
AI is both valuable and dangerous precisely because it’s an extension of our best and worst selves.
p. 210