With George Noory
Live Nightly 1am - 5am EST / 10pm - 2am PST
Eliezer Yudkowsky - Guests

Coast Insider

Not a member? Become a Coast Insider and listen to the show 24/7

Last Show Recap

In the first half, Dean Radin, scientist at the Institute of Noetic Sciences, talked about the Global Consciousness Project (GCP), as well as his scientific experiments studying psychic abilities and other 'supernormal' characteristics.

In the latter half, preparedness expert Mat Stein discussed the danger of solar flares, the ongoing debacle at Fukushima, the militarization and increasing violence of America's civilian police forces, and the apparent targeting of dissidents who threaten America's corporatocracy.

Upcoming Shows

Wed 08-05  Money Mafia & ETs
Thu 08-06  Tarot & Magick
Fri 08-07  TBA / Open Lines

CoastZone

Sign up for our free CoastZone e-newsletter to receive exclusive daily articles.

Eliezer Yudkowsky

Special Guest

Biography:

Eliezer Yudkowsky is a co-founder and research fellow at the Singularity Institute for Artificial Intelligence, an institute for the study of safe advanced artificial intelligence. He is one of the world's foremost researchers on Friendly AI and recursive self-improvement. He is chiefly known for pioneering the study of Friendly AI, which emphasizes the structure of an ethical optimization process and its supergoal, in contrast to the common trend of seeking the right fixed enumeration of ethical rules for a moral agent to follow. In 2001, he published the first technical analysis of motivationally stable goal systems in his book-length Creating Friendly AI: The Analysis and Design of Benevolent Goal Architectures. In 2002, he wrote "Levels of Organization in General Intelligence," a paper on the evolutionary psychology of human general intelligence, published in the edited volume Artificial General Intelligence. He also has two papers forthcoming in the edited volume Global Catastrophic Risks, entitled "Cognitive Biases Potentially Affecting Judgment of Global Risks" and "Artificial Intelligence as a Positive and Negative Factor in Global Risk."

Websites:

Past Shows:

Artificial Intelligence & The Singularity

Self-described autodidact and co-founder of the Singularity Institute for Artificial Intelligence, Eliezer Yudkowsky discussed the technological creation of the first smarter-than-human intelligence, what he calls the Singularity. ... More »

Host: Ian Punnett