With George Noory
Live Nightly 1am - 5am EST / 10pm - 2am PST

Last Show Recap

In the first half, cyber technology expert Charles R. Smith offered analysis of threats from North Korea, and reports about security flaws, hacks, and viruses.

In the latter half, Bloody Mary, an 11th-generation Creole New Orleanian, made her debut on the show, discussing the rich history of voodoo and the paranormal that permeates the culture of New Orleans, as well as her interactions with the spirit realm.

Upcoming Shows

Fri 02-12  The Ninth Planet/ Open Lines
Sat 02-13  Mojave Alien Abduction
Sun 02-14  Bank of Canada Controversy/ Zika Virus
Mon 02-15  Planetary Change/ Double Earths
Tue 02-16  State of Economy/ Open Lines
Wed 02-17  TBA
Thu 02-18  Predatory Capitalism/ Dowsing & Clearing
Fri 02-19  Strange Creatures & UFO Abductions/ Open Lines

Eliezer Yudkowsky

Special Guest

Biography:

Eliezer Yudkowsky is a co-founder and research fellow at the Singularity Institute for Artificial Intelligence, an institute for the study of safe advanced artificial intelligence. He is one of the world's foremost researchers on Friendly AI and recursive self-improvement, and is chiefly known for pioneering the study of Friendly AI, which emphasizes the structure of an ethical optimization process and its supergoal, in contrast to the common approach of seeking the right fixed enumeration of ethical rules for a moral agent to follow. In 2001, he published the first technical analysis of motivationally stable goal systems in his book-length Creating Friendly AI: The Analysis and Design of Benevolent Goal Architectures. In 2002, he wrote "Levels of Organization in General Intelligence," a paper on the evolutionary psychology of human general intelligence, published in the edited volume Artificial General Intelligence. He also has two papers forthcoming in the edited volume Global Catastrophic Risks: "Cognitive Biases Potentially Affecting Judgment of Global Risks" and "Artificial Intelligence as a Positive and Negative Factor in Global Risk."

Websites:

Past Shows:

Artificial Intelligence & The Singularity

Self-described autodidact and co-founder of the Singularity Institute for Artificial Intelligence, Eliezer Yudkowsky discussed the technological creation of the first smarter-than-human intelligence, which he calls the Singularity.

Host: Ian Punnett