With George Noory
Live Nightly 1am - 5am EST / 10pm - 2am PST
Eliezer Yudkowsky - Guests


Last Show Recap

A remarkable discovery has emerged in astrophysics: key properties of the universe have just the right values to make life possible. Most scientists prefer to explain away this uniqueness, insisting that a number of unseen universes must therefore exist, each randomly different. Astrophysicist Bernard Haisch joined George Knapp in the first half of the show to propose the alternative—that the special properties of our universe reflect an underlying intelligent consciousness.

In the second half of the program, veteran journalist Chris Taylor talked about how the Star Wars franchise has conquered our culture with a sense of lightness and exuberance, while remaining serious enough to influence politics, and spread a spirituality that appeals to religious groups and atheists alike.

Upcoming Shows

Mon 03-30  Entity Encounters
Tue 03-31  GMO Fraud
Wed 04-01  ET Manipulation
Thu 04-02  China's Wealth / Food Independence
Fri 04-03  TBA / Open Lines


Eliezer Yudkowsky

Special Guest

Biography:

Eliezer Yudkowsky is a co-founder and research fellow at the Singularity Institute for Artificial Intelligence, an institute for the study of safe advanced artificial intelligence. He is one of the world's foremost researchers on Friendly AI and recursive self-improvement. He is chiefly known for pioneering the study of Friendly AI, which emphasizes the structure of an ethical optimization process and its supergoal, in contrast to the common approach of seeking the right fixed enumeration of ethical rules for a moral agent to follow. In 2001, he published the book-length Creating Friendly AI: The Analysis and Design of Benevolent Goal Architectures, the first technical analysis of motivationally stable goal systems. In 2002, he wrote "Levels of Organization in General Intelligence," a paper on the evolutionary psychology of human general intelligence, published in the edited volume Artificial General Intelligence. He also has two papers forthcoming in the edited volume Global Catastrophic Risks: "Cognitive Biases Potentially Affecting Judgment of Global Risks" and "Artificial Intelligence as a Positive and Negative Factor in Global Risk."

Past Shows:

Artificial Intelligence & The Singularity

Self-described autodidact and co-founder of the Singularity Institute for Artificial Intelligence, Eliezer Yudkowsky discussed the technological creation of the first smarter-than-human intelligence - what he calls the Singularity.

Host: Ian Punnett