🧵 Thread (10 tweets)

AI alignment is hard [in part] because of goodhart’s law. ecologies, parenting, and culture help raise a young natural intelligence (a baby human) and integrate them into the larger ecology. we don’t tell a baby, "go optimise this function!"

letting AIs "loose" on various environments, shaping their rewards toward social approval, and generally seeing them as participating in a live ecology that matures over time seems safer than a single-shot "align this AI"

…or actors that burn entire villages out of an unmet need to be loved. we could call it a failure to be integrated into society: a failure to contribute to society in a way we, from within the ecology, consider healthy https://t.co/QdU5YnqPdT

it’s significant that humans are raised in an ecology—a continual, uninterrupted ecology, where they’re allowed to roam free a little (kids have some restrictions, but are boisterous), and slowly get shaped into being more integrated… https://t.co/eeWjnU0tvN

we literally imprint society’s values onto kids, and then they propagate them. indeed, this is what OpenAI (and others) are attempting to do: raise AI in generations. but raising generations and building up an ecology takes time. it takes time to let the dust settle

now we have something that can introduce big shocks and can be totally unintegrated: the ultimate killer invasive species. it could outcompete and wipe out everything else, including the things it depended on to survive. this happens all the time in ecologies
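the "wipes out the things it depended on" dynamic can be made concrete with a toy consumer–resource simulation. every parameter here is invented for illustration: a shared resource regrows logistically, a native consumer eats it slowly, and a more efficient invader eats it much faster, overshoots, exhausts the resource, and then collapses along with everything else.

```python
def simulate(steps: int = 200):
    """Toy discrete-time ecology (all parameters hypothetical)."""
    resource, native, invader = 100.0, 10.0, 1.0
    history = []
    for _ in range(steps):
        # logistic regrowth of the shared resource (carrying capacity 100)
        resource += 0.1 * resource * (1 - resource / 100.0)
        # each consumer eats in proportion to its efficiency,
        # capped by what's actually left
        native_eat = min(resource, 0.02 * native * resource)
        invader_eat = min(resource - native_eat, 0.08 * invader * resource)
        resource -= native_eat + invader_eat
        # populations grow with food eaten, shrink with upkeep
        native = max(native + 0.5 * native_eat - 0.2 * native, 0.0)
        invader = max(invader + 0.5 * invader_eat - 0.2 * invader, 0.0)
        history.append((resource, native, invader))
    return history

history = simulate()
peak_invader = max(i for _, _, i in history)
final_resource, final_native, final_invader = history[-1]
# the invader booms early, crashes the resource, then starves out --
# and takes the native consumer down with it
print(peak_invader > 1.0, final_resource < 1.0, final_invader < 1.0)
```

the point of the sketch isn’t the specific numbers: it’s that an unintegrated, faster-optimizing entrant can destroy the substrate it runs on before any equilibrating pressure catches up.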

anyone who’s tended to a farm or forest knows what a sudden shock can do. even if you bring the same beings back, it takes time for them to work together again https://t.co/1uIrRSYJkE

i don’t really care about spreading AI doom or AI foom. but i do care about having language for the realities we live in, and the truth is: humans have always lived in ecology with their environment. now there’s a new powerful actor, and here’s how i’m able to reason about it https://t.co/KcfVY3ZUbg